Wikimedia Blog/Drafts/First Wikimedia Programs Evaluation reports 2015 examine Wiki Loves Monuments and other photo events
- First 2015 Wikimedia Programs Evaluation reports examine Wiki Loves Monuments and other photo events
- First 2015 Wikimedia Programs Evaluations: Wiki Loves Monuments and photo events
What is the impact of Wiki Loves Monuments and other photo events? The first two Wikimedia Programs Evaluation reports show that these programs contributed 14% of the media uploaded to Commons from September 2013 to September 2014, and that their photos were used in articles at five times the rate of other uploads in that period.
Wikimedia Programs Evaluation reports show that photos from Wiki Loves Monuments and other photo events were six times as likely to be rated as featured pictures, such as this one. Main du Juif à Tikjda, 5th prize in Wiki Loves Earth Algeria 2014. Photo by Chettouh Nabil, freely licensed under CC-BY-SA 3.0.
The first two Wikimedia Programs Evaluation Reports in 2015 have been released. The reports are a collaborative effort of program leaders and the Learning and Evaluation team at the Wikimedia Foundation. They focus on Wiki Loves Monuments and other photo events, providing a snapshot of their purpose and impact, as well as the resources and efforts that go into their implementation.
Highlights of the first reports show that media uploaded to Wikimedia Commons as part of Wiki Loves Monuments and other photo events represent 14% of the media uploaded to Commons from September 2013 to September 2014. These uploads are used in articles at five times the rate of Commons uploads overall, and are more than six times as likely to be rated as featured pictures, compared to other media uploaded by registered users during the same period.
Our approach to evaluation
The Wikimedia Foundation’s Program Evaluation and Design initiative started in April 2013, with a small team and a community call to discuss program evaluation. The goal was to explore what programs existed, what mattered to program leaders, and what they were measuring. In the first few months, the team worked to identify the most popular Wikimedia programs and collaborated with a first set of program leaders to map program goals and potential metrics.
Informed by initial survey results from August 2013, we launched the first round of data collection in September 2013 and completed our first Evaluation Report (beta). This high-level analysis began to answer many of the questions raised by movement leaders about key programs and their impact. The report was well received by our communities and generated many discussions about the focus of these programs, their diverse designs, and the data they collected. But it still left room for improvement. Since the launch of the beta reports, the team has hosted 11 in-person meet-ups and 24 virtual events (recorded and shared), and has written 15 blog posts on topics around learning, evaluation, measures, and storytelling, to help program leaders develop their capacity to evaluate and report.
We have completed the data collection phase for the Wikimedia Programs Evaluation Reports 2015, and the first two reports have now been released. In collaboration with grantees and program leaders across the movement, the reports cover over 700 implementations of 10 different types of programs, reported by at least 98 different program leaders from 59 countries. Compared with the first round, this second round of data represents twice as many countries, more than three times as many program leaders reporting, and six times as many program implementations.
The goals of these reports are:
- To develop a clearer understanding of Wikimedia programs and their impact.
- To identify positive examples of programs to explore in more depth, in order to develop best practices and support networks across communities.
- To help Wikimedia community leaders explore methods for improving the data collection and reporting of their programs.
- To highlight key lessons learned that can be applied to data collection and reporting.
The first two reports: Wiki Loves Monuments and other photo events
The first programs evaluation reports released focused on photo events: Wiki Loves Monuments and other photo events, such as Wiki Loves Earth and WikiTakes. Follow the links to find out more on how many new users are introduced to wiki projects through these programs, to learn about user retention, and to see how effective photo events are at expanding and improving content on Wikimedia projects.
Wiki Loves Monuments and other photo events have been successful in adding hundreds of thousands of new images to Wikimedia Commons. The media uploaded to Commons as part of the Wiki Loves Monuments and other photo events captured in the reports represent 14% of the media uploaded to Commons during the reporting period (September 2013 through September 2014).
Media uploaded for Wiki Loves Monuments and other photo events are used in articles at five times the rate of Commons uploads overall. Wiki Loves Monuments uploads are also more than six times as likely to be rated as featured pictures, compared to other media uploaded by registered users during the same period.
The effort and resources invested in each event vary widely and do not always align directly with outcomes and impact; investments sometimes appear to be increasing while the use of images is not. We encourage all Wikimedia community members to use these reports to learn about what others have accomplished. The information can help set appropriate expectations for future events, resource investments, and outcome targets. We also hope the data shared will help connect program leaders across the movement, so they can share practices and help plan, implement, and measure the impact of photo events.
To that end, the program reports include a dedicated section that encourages peer-learning: "How this information can apply to program planning" draws out key advice on planning for program inputs and outputs.
Join the conversation!
As we work to make these reports relevant to different communities, we also need program leaders' views to explore possible next steps. Some possible areas for further investigation of Wiki Loves Monuments and other photo events are:
- Do different types of photo events attract different types of users?
- How can we apply the successes of low-cost, low-scale events to other contexts?
These and other questions are framed for discussion, along with any others you may have, on the talk pages of the Wiki Loves Monuments and other photo events reports. With so much data in hand, we are curious to hear what questions we might approach next and what thoughts the data shared in these reports provoke.
We believe this collection of reports is only the beginning of the conversation. We hope the data we shared will stimulate discussions of the changes that Wikimedians make possible through photo events and other programs, and of how we can best capture and understand these efforts and their impact.
In the coming months we will be releasing reports on additional Wikimedia programs:
- Onwiki Writing Contest
- Editing Workshops
- GLAM Content donations
- Wikipedia Education Program
- Wikipedians in Residence
If you helped organize a Wiki Loves Monuments or other photo event in your country (or even if you only participated in one), please join us in reviewing the findings presented in these reports and share your feedback. See you on the Talk Page!
Ideas for social media messages promoting the published post:
- What is the impact of Wiki Loves Monuments and other photo events? Check out our new reports. (link)
- What is the impact of Wiki Loves Monuments and other photo events? The first two Wikimedia Programs Evaluation reports show that these programs contributed 14% of the media uploaded to Commons from September 2013 to September 2014 -- and their photos were used in articles five times more than other uploads in that period. (link)