On this page, you will find answers to: What are the key findings to take away from the report?
What are the next steps in understanding the impact of edit-a-thons?
How does the program deliver against stated goals?[1]
In terms of participation, edit-a-thons tend to engage more existing users than new users, and most of those existing users are already active editors. The 121 edit-a-thons included in this report engaged a total of 2,328 participants; 913 (89%) were existing users. Active editors made up a large share of that group: the 573 active editors accounted for 63% of existing users, or 25% of all participants. Further measures would be needed to understand in what other ways, beyond working together to produce content, edit-a-thons build and engage communities.
The 121 edit-a-thons reported here were held in 19 different countries and followed a variety of styles, most of them aimed at generating content around a specific topic. If increasing awareness of the Wikimedia projects continues to be a central goal of edit-a-thons, it will be important to develop and capture measures of awareness.
Increasing diversity of information coverage
Edit-a-thons are useful for targeting specific topics, and therefore edit-a-thon events contribute a wide variety of information, from the Ada Lovelace edit-a-thon to the ZSL London Zoo Library edit-a-thon (to pick the two ends of the alphabetical spectrum in our sample). However, measures of information diversity do not yet exist, and they will need to be developed if we are going to accurately capture increased diversity of information coverage in Wikimedia content.
How this information can apply to program planning
Use the information to help plan for program inputs and outputs.
Keep in mind who the target audience of an edit-a-thon will be and design the event for that audience. For example, edit-a-thons that aim to engage new editors may need a different model for recruiting participants than edit-a-thons that aim to encourage experienced editors to contribute more. Keeping your audience in mind and using the data in this report can help you set expectations and design a strong edit-a-thon.
For resource planning, there are approaches to resourcing beyond monetary funds; remember that all of the edit-a-thons examined here leveraged donated resources in some way. All 44 edit-a-thons that reported on resources received a donated meeting space, and 17 (39%) also received donated materials or equipment.
The data from different edit-a-thons can help you find the right combination of participants for your contribution goals.
This table represents the middle 50% of edit-a-thons for each metric:[2]

Metric                          Lower       Higher
Event length                    3.8 hours   6.5 hours
Total participants              7           22
For each participant:
  Text pages added              5           41
  Articles created or improved  1           35
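If you want to compare your own events against these ranges, the quartile bounds can be recomputed from any list of per-event values. A minimal sketch in Python; the participant counts below are made-up placeholders, not data from this report:

```python
import statistics

# Hypothetical participant counts for a set of edit-a-thons; replace
# with your own reported per-event values for any metric in the table.
participants = [7, 9, 12, 14, 15, 18, 22, 25, 40]

# The "Lower" and "Higher" columns are the first and third quartiles,
# so the middle 50% of events fall between these two cut points.
q1, _median, q3 = statistics.quantiles(participants, n=4)
print(f"The middle 50% of events had between {q1:.0f} and {q3:.0f} participants")
```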
Reach out and connect with other edit-a-thon leaders.
Among the many benefits of connecting with fellow program leaders, you can find those who ran similar edit-a-thons in similar contexts and ask them about the resources needed and the outcomes to expect. When using the budgets presented here for planning purposes, try to find an event in a location with a similar economy to your area and consider reaching out to a successful program leader to discuss potential resource needs (including possible budget or donated resources). Alternatively, you can find an event based on the same model in a different location and talk to the program leader about the costs before translating those expenses into local prices.
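When borrowing a budget from an event held elsewhere, you can convert its line items before comparing. A minimal sketch with made-up line items and a made-up conversion factor (exchange rate times a rough local-price adjustment), none of which come from the report:

```python
# Hypothetical budget line items from a comparable event elsewhere,
# in that event's currency; replace with figures from the program
# leader you contacted.
peer_budget = {
    "refreshments": 120.00,
    "printed materials": 40.00,
    "travel support": 200.00,
}

# Made-up conversion factor: exchange rate into your currency, times a
# rough adjustment for local price levels.
conversion_factor = 1.4

local_estimate = {item: cost * conversion_factor for item, cost in peer_budget.items()}

for item, cost in local_estimate.items():
    print(f"{item}: {cost:.2f}")
print(f"Estimated total: {sum(local_estimate.values()):.2f}")
```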
Use the distribution statistics as guardrails against costly plans that may not produce results in proportion to their cost.
Information on cost per participant and cost per text page or article created/improved can also be a helpful reference for comparing the cost of your event with how much content is produced. As with overall budget information, these figures should be taken in the context of each event. If planning a new program, you might expect the costs to fall within the middle 50% of the costs per output reported. Programs nearer the bottom of that range produce more output for each unit of input. We hope, as we continue to evaluate programs and feed the results back into program design, that we can learn more from the programs achieving the most impact with the fewest resources.
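As a rough planning aid, you can run the same check on your own budget. A minimal sketch, assuming a hypothetical event budget and illustrative middle-50% cost bounds; substitute the actual quartile values from the report's cost tables for events comparable to yours:

```python
# Hypothetical figures for a planned edit-a-thon; none of these numbers
# come from the report itself.
budget = 500.00            # total planned spend, in your local currency
expected_participants = 15
expected_articles = 30     # articles created or improved

cost_per_participant = budget / expected_participants
cost_per_article = budget / expected_articles

# Illustrative middle-50% bounds (lower quartile, upper quartile);
# replace with the reported values for comparable events.
participant_cost_bounds = (10.0, 60.0)
article_cost_bounds = (5.0, 40.0)

def within(value, bounds):
    """Return True if a planned cost falls inside the reported middle 50%."""
    lower, upper = bounds
    return lower <= value <= upper

print(f"Cost per participant: {cost_per_participant:.2f} "
      f"(within middle 50%: {within(cost_per_participant, participant_cost_bounds)})")
print(f"Cost per article: {cost_per_article:.2f} "
      f"(within middle 50%: {within(cost_per_article, article_cost_bounds)})")
```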
Edit-a-thons differ in goals, length, subject area, and scope, yet they are organized successfully within and across many Wikimedian communities. They are replicable and adaptable to many different contexts to meet many different goals. Edit-a-thons can be scalable, ranging from 1 to 300 participants. While a collaborative setting is the common thread of every edit-a-thon, the goals and design are up to each program leader. Program leaders use many different methods to encourage and track contributions, such as event pages, bots, and wiki-based tools. Use the data tables in the report to find program leaders who are successfully running edit-a-thons you would like to replicate or build on.
Join the conversation! Visit the report talk page to share and discuss:
Sharing learning and practices
If you ran a program that delivered exceptionally well against its goals, please speak up! Consider writing a blog post or how-to guide, or contributing to the talk page to share your ideas on why your program was so successful.
If your program surmounted a particularly tricky problem in program design, consider writing a learning pattern!
If you ran a program and want to report key metrics to the Learning and Evaluation team, our collector is always open. Visit our reporting page to learn about the reporting form's contents and find the link to voluntary reporting.
Connecting with other program leaders, evaluators, and designers
If you are considering running a new program or updating an existing one, consider reaching out to experienced program leaders who have organized a similar program. You can find leaders by program in the appendix, on our Facebook group, or during virtual hangouts.
Join the mailing list for regular updates about program evaluation, tools, etc.
Questions about Evaluation and Impact
What, if any, ideas do you have about other ways we should evaluate edit-a-thons or programs in general?
What questions around program impact or evaluation do you have after reading the reports?
What further data investigations would you like to see (or do!) for this set of edit-a-thons?
Questions about Measures
What, if any, measures have you used that are missing from these reports?
What, if any, tools/bots/programs/strategies do you use to measure the outcomes of your edit-a-thons?
[1] Here we examine together all edit-a-thons that reported, but recognize that they do not all share program goals. Edit-a-thons represent a diverse set of programs and reflect diverse goals across contexts; we encourage organizers of each event to consider the data in terms of what matters most to their priority goals.
[2] The lower and higher numbers are the lower and upper quartiles, respectively.