Wikimedia Blog/Drafts/Tip of the Iceberg: Measuring the Impact of Wikimedia Programs
This was a draft for a blog post that has since been published at https://blog.wikimedia.org/2014/05/02/beginning-understand-what-works-measuring-impact-programs/
- The Tip of the Iceberg: Measuring the Impact of Wikimedia Programs
- Beginning to Understand What Works: Measuring the Impact of Wikimedia Programs
Across the globe, Wikimedia organizations and volunteers are engaging in online and offline activities to get more editors to contribute to Wikimedia projects. There are expansive efforts to attract new editors and to mobilize existing editors who can contribute diverse and high-quality content. With so much activity and energy, it is important to take a deep breath and reflect:
- What are the programs expected to achieve (i.e., what are the program goals)?
- What does it mean for a program to have “impact”?
- How much “impact” equals success?
- How might our programs achieve the most impact?
These are the big questions that the Program Evaluation members of the Learning and Evaluation team in the WMF Grantmaking department have begun to explore along with the community. This past month, we completed a beta version of evaluation reports that have begun to put systematic numbers behind a handful of popular programs.
It is clear that Wikimedia volunteers do incredible work to create free knowledge and to promote the free knowledge movement. But this picture is incomplete without the data to help tell the story. Putting numbers behind our stories and activities helps the community and the public better understand what is actually happening on the ground and how our movement programs are making an impact. The evaluation reports measure programs systematically against shared goals, helping us see which programs drive impact along various movement goals. From here, we can reflect on what existing programs are doing and what remains to be done in our strategies to nurture and grow a community of editors and advocates around free knowledge.
For the first round of reports, data were reviewed from 119 implementations of seven popular Wikimedia programs: edit-a-thons, editing workshops, on-wiki writing contests, the Wikipedia Education Program, GLAM content partnerships, Wiki Loves Monuments, and other photo initiatives. The data represented more than 60 program leaders (individual volunteers or organizations) and program implementations in over 30 countries. These reports provide a basic sketch and a pilot of high-level analysis of how these programs are influencing the movement. They also paint a picture of what these programs aim to achieve and help surface the gaps in data and metrics. Here are just a few highlights:
So, what’s next?
- Examining additional programs! In FY 2014/2015, the goal is to expand the data related to these seven programs and to examine three additional programs: hackathons, conferences, and Wikimedians-in-Residence. Through these reports, the evaluation portal, and other pathways, we will continue conversations with the global community to work toward a shared view of program “impact” throughout the movement.
- Help us improve the reports! If you are running a Wikimedia program, start tracking it using the Reporting and Tracking toolkit. Not only will you learn a lot about your own programs, but by sharing your data with us, you will enable stronger analysis of popular Wikimedia programs, so we can all learn from one another and build better programs.
- Questions? Comments? Reach out to us in the comments below or at email@example.com. You can also find us on the Evaluation Portal!
Edward Galvez, Program Evaluation Associate