What makes the difference in Wikimedia online fundraising
When the 2012 fundraising and funds dissemination discussion started, I shared a spreadsheet of totals by country for 2010 and 2011 that also shows percentage increases between the two years in each country. That led to speculation about why some countries saw bigger increases than others. Several explanations were offered such as tax deductibility, better localization, and others. I’m sorry that we didn’t have more granular data at the beginning of this discussion, but it takes a little while after the scramble of the fundraiser to get all our data in order and double checked for accuracy. We’ve got more detailed data now that I think will help clarify what made a difference in fundraising in 2011.
The biggest factor that explains the variance among the increases from country to country is the number of days that Jimmy banners and appeals (and other high performers such as Brandon) were run by WMF compared to each payment processing chapter. A second factor is the usability and simplicity of the donation forms. But before we get to those points, I want to talk about what was behind the overall increase seen in all countries in 2011 over 2010.
The primary cause of the big 2011 increases overall was new and improved banners and appeals. Everyone involved in fundraising (at WMF and chapters) who A/B tested the highest performing new 2011 banners and appeals against the highest performing 2010 material has no question about this.
Different banners and appeals have different rates of effectiveness. The only way to measure relative effectiveness is to test different banners/landing pages against each other at the same time. Not only does the volume of donations vary throughout the day and the campaign, but click and donation _rates_ also vary. In other words, comparing banner A’s click rate at 9:00 to banner B’s at 19:00 is not a fair test. Through constant creative experimentation (trying as many different things as we can) and careful, precise testing and statistical analysis, we have been able to dramatically improve the effectiveness of banners and appeals. And the good news is that, as a rule, the better we represent the movement's values in appeals, the more money they seem to make.
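To make the idea of a "fair test" concrete, here is a minimal sketch of the kind of statistical check this implies: a two-proportion z-test on two banners shown simultaneously to a 50/50 traffic split. This is not WMF's actual tooling; the function name and the traffic numbers are hypothetical.

```python
# Illustrative sketch (hypothetical numbers, not WMF's actual tooling):
# a two-proportion z-test comparing click rates of two banners that were
# shown simultaneously to a 50/50 split of traffic.
from math import sqrt, erf

def two_proportion_z(clicks_a, views_a, clicks_b, views_b):
    """Return (z, two-sided p-value) for H0: both banners have the same click rate."""
    p_a, p_b = clicks_a / views_a, clicks_b / views_b
    p_pool = (clicks_a + clicks_b) / (views_a + views_b)   # pooled rate under H0
    se = sqrt(p_pool * (1 - p_pool) * (1 / views_a + 1 / views_b))
    z = (p_a - p_b) / se
    # Standard normal CDF via erf; two-sided p-value.
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothetical same-hour traffic split:
z, p = two_proportion_z(clicks_a=1200, views_a=100_000,
                        clicks_b=1000, views_b=100_000)
print(f"z = {z:.2f}, p = {p:.5f}")  # a small p means the gap is unlikely to be noise
```

The point of running both banners in the same hours is that the shared time window cancels out the time-of-day swings in click rate; the test then only has to separate the banner effect from sampling noise.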
The WMF and chapter fundraising teams carried out many tests. The WMF carried out hundreds of tests in 2011, often retesting and rechecking old assumptions under new conditions to make sure we could trust our results. Some things we tested didn’t make a big difference. But some things made a very big difference. The big creative gains (like the new Jimmy appeal, and the "green" banners) were all adopted by chapters.
In other words, we tried a million things to find one or two that would really make a huge difference. Here's one example: As crazy as this sounds, we learned that putting people in front of trees or bushes made a huge difference. It increased click rates by up to 66% and donation rates up to 38% in repeated testing. (Please don’t tell anyone in the advertising industry!) We discovered this in our frenetic creative/testing process at WMF where we tried as many different things as possible. We have user Invertzoo’s husband to thank for snapping her picture in front of a tree in their front yard, and her beautiful long hair for making it impossible for us to photoshop away the green. We had learned the previous year that a blank background performed better than most other backgrounds.
We are also constantly working to improve the appeals, which is more difficult and time consuming than working on banners. Jimmy’s has always been the strongest. He can say things that no one else can, and the voice of a founder is a powerful one in our context. In 2010, we tested many different kinds of appeals and identified a basic format that seemed to work better than any other. We nagged Jimmy to write us a version in his own words, which he did about a month into the 2010 fundraiser. The version in his own words did much better (20-30% better in different tests) than what we had before. Authenticity makes a difference. Donors can hear it. And Jimmy certainly knows how to connect with Wikipedia readers.
In 2011 pre-campaign testing, we developed some new lines that were very powerful. One, for example, was the comparison by size to Google and Yahoo. That and other improvements to the Jimmy appeal greatly increased the Jimmy appeal’s effectiveness. Those gains came in incremental stages, and so it’s hard to say exactly how much of an increase came out of the new-and-improved Jimmy appeal, but I’m guessing it was about 15-30%. In one "reality check" test, the new and improved Jimmy appeal did 17% better than the one we ended with in 2010. (It did 65% better than the one we started 2010 with!)
The WMF fundraising team’s primary goal in 2011 was to find appeals from editors, written in their own words, that performed as well or better than the Jimmy appeal. We succeeded in the sense that our best non-Jimmy appeals in 2011 performed better than our best Jimmy appeal of 2010. But because we kept improving Jimmy, the new Jimmy again did better in fair AB tests than all other appeals in 2011.
Nevertheless, a few other appeals did very well -- far, far better than our much more hastily created editor appeals from 2010. And so, for the first time, editor (and staff) appeals did a significant portion of the work of the fundraiser. In particular WMF staffer Brandon Harris’ appeal did very well. While it lost in pre-campaign AB tests against the new Jimmy, it beat new Jimmy in the U.S. after Jimmy had been up for only seven days there. Likewise, Dr. Sengai Pahuvan's appeal beat Jimmy and Brandon in India (and did very well all over the world). Susan Hewitt, Gorilla Warfare and Alan Sohn all did particularly well too.
The secret behind the success with editor appeals was a process of interviewing editors in person and then collaborating with them to craft appeals made up of sentences taken from the actual interviews. Our storyteller trio (hired for the 2011 fundraiser) interviewed more than 200 editors from around the world. They traveled to Haifa for Wikimania as well as India, Brazil and Argentina. Every interview was fascinating from a community perspective, but only a handful also contained material that would make a good donation ask. The process of creating these successful editor appeals was the most satisfying part of the fundraiser and we are expanding that process in 2012.
More Jimmy = More money
Explaining the variance among the increases from country to country, two variables stand above all others:
- The number of days Jimmy banners and appeals (and other high performers such as Brandon) were run in each country.
- The quality of donation forms in 2011 vs 2010 in each country.
For 2011, the WMF board approved a budget that required the fundraising team to set a fundraising goal that was about 50% larger than what we raised in 2010. We thought we were taking a little bit of a risk by starting the fundraiser several days later in 2011 than we did in 2010. But it turned out that our new banners and appeals were so much more effective in 2011 that we would be able to raise much more than we needed if we ran Jimmy and other high performers the whole campaign. That allowed us to spend most of the fundraiser running appeals from editors that did not raise as much money as Jimmy’s appeal, but did more all together to educate readers about the Wikimedia projects and community. We were very happy with that.
WMDE, WMFR and WMUK set much more aggressive budget increase goals for themselves in 2011 than WMF. To reach those goals, they ran Jimmy (and Brandon, and other high performing appeals) for much longer than WMF did. They also ran their fundraisers for about 2, 4 and 6 days longer than WMF. The number of “Jimmy days” and the length of the fundraiser is the biggest factor that explains the variance in total campaign increases in 2011 over 2010.
(Side note: The chapters that processed payments in 2011 mostly ran banners and appeals that were produced by WMF. WMUK ran no localized appeals. WMDE ran a few for a very short time during the middle/lull of the campaign. WMFR ran a few localized appeals for almost the whole middle/lull of the campaign. In the discussion, some have speculated that the increases in the chapter processing countries were due to better localization of appeals. We really want that to be true in the future, regardless of who processes payments, but it has not happened yet. This year, the WMF fundraising team is prepared to spend time with chapters and others to help this happen and regardless of who payment processes, we’d love to see more people in chapters and local communities involved in the creative aspect of fundraising.)
The table I shared also showed very large increases in most countries that switched from chapter to WMF in 2011, but showed more moderate increases in the Netherlands. The countries, such as IT, SE and RU, that saw the giant increases benefited from two different positive effects in 2011: (1) They got the same increases from the improved banners and appeals that everyone else got, but (2) they also had donation form problems in 2010 that suppressed donations, making larger 2011 gains easier. In 2010, WMNL had no serious issues with their donation form, and therefore started from approximately the same position as WMFR, WMDE, WMUK and most other WMF countries, making larger gains in NL more difficult.
WMF and all chapters share the goal of featuring as many different voices as possible in the fundraiser, favoring editors. The chapters had to run Jimmy for longer -- and fundraise for more days -- than WMF because they decided to go for larger budget increases than WMF.
Here are the approximate number of days WMF and the three big processing chapters ran Jimmy:
- Total fundraising days: WMF 46; DE 52; FR 50; UK 48.
- Full Jimmy days: WMF 12-14 (depending on country); DE 31; FR 23; UK 24.
- ~Half Jimmy days: WMF 0-7 (depending on country); DE 5; FR 7; UK 20.
- Combined approx full Jimmy days: WMF 13; DE 32.5; FR 26.5; UK 31.
(This data was gathered from banner impression logs and checked against calendars kept during the campaign.)
A comparison of increases that look at the whole campaign between, for example, DE and US is mostly a comparison of the different proportions of “Jimmy days” in each country, not of fundraising efficiency. To compare fundraising efficiency, it would be better to look at just the beginning of the campaign when both the US and DE ran Jimmy.
If we do that, we see that there is less variance in the increases across most countries. For example:
- +58%, counting the first 7 Jimmy days from Tuesday to Tuesday to avoid a Monday banking anomaly. (Comparing simply the first 7 days shows a -14% difference, but is not a fair comparison.)
- +41%, comparing the average total per day of the first 17 days (WMDE ran Jimmy for the first 17 days of 2011).
- +61%, counting the first 7 full Jimmy days. (WMF switched off Jimmy on day 8 in 2011.)
- +63%, comparing the average total per day of the first 7 Jimmy days (WMF ran Jimmy for the first 7 days in 2011).
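The like-for-like comparisons above reduce to simple arithmetic: average the daily totals over the same "Jimmy day" window in each year and compute the percent increase. A sketch with made-up daily totals (these are not actual campaign figures):

```python
# Illustrative arithmetic (made-up numbers, not actual campaign data):
# compare two years over the same "Jimmy day" window using per-day averages.

def pct_increase(totals_2010, totals_2011):
    """Percent increase in the average daily total, 2011 vs 2010, same-length windows."""
    avg10 = sum(totals_2010) / len(totals_2010)
    avg11 = sum(totals_2011) / len(totals_2011)
    return 100 * (avg11 - avg10) / avg10

# Hypothetical daily totals showing the steep natural decline over a campaign:
jimmy_2010 = [100, 90, 85, 80, 78, 75, 72]
jimmy_2011 = [160, 150, 140, 130, 125, 120, 115]
print(f"{pct_increase(jimmy_2010, jimmy_2011):.0f}%")  # prints "62%"
```

Because daily totals decline steeply over a campaign, the two windows must cover the same days of each campaign; comparing an early-campaign window in one year to a later window in the other would bias the increase, which is the point made about the "fossil days" comparison below.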
Here are a few other increases for comparison:
- NL: (WMNL processed donations in 2010, WMF processed in 2011.)
- +51%, counting the first 7 full Jimmy days.
- BE: (WMF)
- +82%, counting only the first 7 full Jimmy days.
- ES: (WMF)
- +94%, counting only the first 7 full Jimmy days.
There are a number of possible explanations for the variance above. But in the end, I would not make too much of any of these small differences. The lower NL number could simply be because our Dutch translation of 2010’s Jimmy was better than the Spanish and French ones. Or maybe by offering four different payment methods in NL in 2011 (2 more than in 2010) we actually reduced donations a little bit by overwhelming donors with choice (something we’ve seen before). We ran some tests in 2011 on the number of payment-method choices that we haven’t had time to fully analyze yet. In 2011, we AB tested to optimize many local issues like that. The 2012 fundraiser will be primarily focused on optimizing as many local details like that as possible.
Remember, all the comparisons above are between Jimmy banners and appeals across 2010 and 2011. The increases are caused by improvements to the banners and appeals that we made in 2011.
The story of the increases represented above is mostly a story of improved banners and appeals. We created them through a year-long process of testing all sorts of different text and images. Even though we tested hundreds of variations, and found many small improvements, major improvements were needles in haystacks. A few of those needles made all the difference in the world, as I explain below. And some were so random that they could be hard to believe. When we told the chapter fundraising contacts that a green leafy background in the banner photo increased donations by 30% or more, one chapter contact didn’t believe it until he tested it himself. He then emailed: “Hallelujah!”
After the first 7 days of the fundraiser, in most WMF countries, we switched to Brandon, and then soon after to many other editor appeals. We knew some of these appeals performed at a much lower level than Jimmy. But we knew we were on course to far exceed our fundraising goal for the year, and so we stuck with editor appeals for the sake of educating Wikimedia readers about our community. At the very end of the fundraiser, we brought Jimmy back for just 5 days. In hindsight, we would have also shortened the fundraiser significantly, which we plan to do in 2012.
Even comparing only the 2011 Jimmy beginning periods, there are still inconsistencies in the comparisons that require caveats. For example:
- WMUK, in 2011, ran a mix of three WMF appeals (Jimmy, Brandon and Susan) for the first three days before switching to 100% Jimmy.
- WMFR, in 2010, began the campaign with a poorly performing appeal featuring a fossil instead of a person (it was a good experiment, but didn’t raise much money). Therefore, if you compare just those “fossil days” of the 2010 campaign to the same days of 2011, the increase is very high.
With those caveats, here are some ranges of how the 2010-2011 comparisons look for UK and FR:
- +49%, counting the first three split Jimmy/Brandon/Susan days.
- +61%, starting from the first full Jimmy day.
- +865%, counting only the first 4 days, which is comparing a rock/fossil in 2010 against Jimmy in 2011.
- +129%, comparing the first 13 days of Jimmy in 2011 to the first 13 days of Jimmy in 2010. This comparison favors a higher increase in 2011 because there is a steep natural decline each day of the fundraiser. The increase goes down to 59% if you compare the same time period in 2011 as the post-fossil days in 2010.
That last example brings me to the second big explanation of increases: Donation form usability and simplicity.
The second factor: Simpler/easier donation forms = More money
A confusing or hard-to-use donation form reduces donations. There may also be a negative effect from highlighting that donations go to a local chapter as opposed to the Wikimedia Foundation or “Wikipedia” (as some more-successful chapter donation forms presented it).
I don’t want to pick on anyone’s forms. We’ve had plenty of bad forms ourselves over the years at WMF. But this is important for understanding how fundraising works.
If you look at the number of Jimmy days that WMUK had compared to WMDE, you would expect its increase for the whole campaign to be closer to WMDE’s. My guess is that WMUK would have raised much more if not for a usability problem with their donation page that made it a bit difficult and confusing for donors to make a one-time donation. They also would have raised more if they could have offered direct credit card (as opposed to PayPal) donations.
France ran the same new banners and landing pages as WMDE, WMUK and WMF for the beginning and end of the 2011 campaign (when the vast majority of the money comes in). So why such a higher increase? Their donations were tax deductible both years, so that can’t be the reason. The reason was that they made a handful of significant improvements to their donation form:
There isn’t a single huge difference between the 2010 and 2011 forms, but the 2010 form has many significant problems that WMFR corrected in 2011: The 2010 form (a) has more text to parse around the donation button, (b) has less direct language about tax deduction and places the text above and in the way of the donation button, (c) headlines “Donate to Wikimedia France” instead of “Support Wikipedia” (as the 2011 form does), (d) asks only for high amounts vs. lower amounts (as in 2011) and (e) only offers three amount choices instead of four (as in 2011).
I don’t know if WMFR tested each one of those changes, but based on WMF testing of donation form design, we know that those changes, added together, made a significant difference and helped WMFR make a bigger increase.
With WMIT, WMRU and several other 2010 chapter forms, there are bigger differences that led to even greater percent increases from 2010 to 2011.
WMIT is an interesting example. There was a 331% increase in Italy between 2010 and 2011 if you compare the first 7 Jimmy days of each year. About 50-70% of that increase must come from the improvements to the banner and appeals (i.e. same as all the other countries). The rest of the increase comes mainly from the improvements over the 2010 form. The 2010 form has several elements that hold it back: (a) it doesn’t offer an amount choice on the first step, (b) there is too much text on the second step for donors to parse, (c) there is a complex choice between donation methods, each one involving more text to parse and (d) it places the focus on donating to Wikimedia Italy instead of the movement and site in general.
It has been suggested that the Italian community’s blackout of Wikipedia fueled the increase in donations. That must have had a positive effect, but other countries with similar 2010 forms, such as RU, SE and AU, saw increases similar to Italy’s.
The Russian form in 2010 has some of the same problems as the Italian 2010 form.
The Mission of Wikimedia Fundraising
My purpose here has been to explain what factors really make a difference in fundraising in the Wikimedia environment. I’m not arguing for any particular outcome regarding who payment processes here, I’m just trying to lay out some facts about how fundraising in our very unusual environment actually functions.
To me, fundraising success isn’t about maximizing income, it’s about achieving the mission of Wikimedia fundraising. Here’s how we define that mission on the WMF fundraising team (and I think chapters would all agree with this):
- We do our best to teach Wikimedia users about our communities and projects,
- while raising the budget for the Foundation and movement
- in the shortest possible time,
- with the easiest, most localized experience possible for donors.
In 2011 the work that really increased the movement’s ability to achieve that mission was primarily creative work: creating, discovering and testing new ways of communicating visually and textually in banners and appeals. Secondary to that, but still important, was optimizing form usability and simplicity.
Whoever payment processes in 2012, what’s most important is that we gather more and better appeals from community members from as many languages, projects and geographies as possible. That will not only help us to better teach our nearly 500 million readers who we are as a movement, but will also allow us to continue to shorten the fundraiser while increasing the resources available to the movement.
On the WMF fundraising team, we’ve settled a lot of the non-creative issues with the fundraiser over the past two years. And, personally, I’m really looking forward to working more closely this year with chapter staff, chapter volunteers, and volunteers not affiliated with chapters all over the world, to focus even more on telling the Wikimedia movement’s story powerfully in our next fundraiser.