This was a draft for a blog post that has since been published at http://blog.wikimedia.org/2014/09/16/global-metrics-for-grants-one-way-of-doing-reporting-and-learning-better/
Global Metrics for Grants: one way of doing, reporting and learning better
Body
The Wikimedia movement is known for its diversity on many levels: individuals, groups and organizations, in different contexts, are invested in achieving the same goal of free knowledge. As community members seeking and executing grants have worked with grants committee members and the Grantmaking team, we have reached a point of shared understanding: we need to do better at learning from each other and at demonstrating our impact.
Starting in September this year, the Grantmaking team is putting into effect a set of Global Metrics that will help us all understand, appreciate and be accountable for some of the work being done by Wikimedia communities worldwide. In particular, we are seeking a shared, aggregate understanding of how successful we are at expanding participation and improving content on our projects. The metrics will take the form of a table template included in the reporting form for future grants, starting with Round 1 of 2014-2015.
These metrics are not meant to replace, but to complement, each grant and grantee’s individual metrics and measures of success, both qualitative and quantitative.
Why Global Metrics and how were they designed?
For the past two years, we have worked with community members to build a funding framework that supports a spectrum of needs, ideas and initiatives from across the movement, from individuals to established organizations. This framework was also supported by a self-evaluation strategy that allowed any community member to build their own metrics and report against their own goals.
Over the past year, we have begun reviewing grant progress and impact reports[1], and amongst many insights, three stand out: people still find it difficult to measure their work in clear ways; the larger the grants, the less proportionate the impact seems to be (and one challenge may be reporting); and we find it difficult to assess the collective impact of the considerable work supported by these grants in any systematic fashion. In particular, as a movement, we are not yet skillful at offering both the stories and the numbers that describe how our offline work positively impacts our online successes.
After two years of observing the goals and measures of various grant projects, a few core metrics stood out as indicators commonly used by community members in different contexts. These measures, however, were not calculated consistently across projects, which made it difficult to convey what we are accomplishing as a movement. Global Metrics, in this sense, provide a shared set of indicators that can be used across projects to report on results. In addition, we did our best to design metrics that can currently be assessed with the support of tools built and used across the movement.
After research and consultation with some grantees and grants committee members, the new Global Metrics focus on participation, content and learning processes:
- Number of active editors involved.
- Number of new registered users.
- Number of individuals involved.
- Number of new images added to Wikimedia articles/pages.
- Number of articles added or improved on Wikimedia projects.
- Number of bytes added to or deleted from Wikimedia projects.
- Learning question: Did your work increase the motivation of contributors and how do you know?
The main challenge these Global Metrics aim to overcome is our limited ability, observed across Wikimedia projects and programs, to sum up inputs, outputs and outcomes in self-evaluation, and thereby to give us all a more cogent sense of the collective impact of our work. We hope that more cohesive reporting will help us celebrate our successes as a global movement, but also point out where we are not making an appreciable difference. We recognize, however, that numbers are not enough.
Numbers do not tell the full story
We are therefore counting on community members to offer both numbers and stories, since numbers only make sense in context. Second, and critically, Global Metrics are not the only measures of success we will learn from: each grantee will continue to define and assess themselves against the measures of success that matter most to them. We do not expect grant reports to focus only on these seven measures. In fact, some key insights that would significantly improve the effectiveness of our work may not be easily measurable, even though we know and understand their impact: volunteer motivation, for instance.
The Global Metrics are also limited in what they can currently measure. As they stand, they do not directly measure quality, retention, or readership. In addition, they may not offer the right metrics for all types of grants. For instance, an individual engagement grant for research on our wiki projects may not directly produce content or recruit new editors. In this case, the grantee might only be able to report the number of individuals and/or active editors involved.
As we implement these metrics, keeping their potential and limitations in mind will help us learn what is useful and what we still need to improve.
Room to grow, work and be successful together
As we pilot this new set of metrics in the movement, the Grantmaking team will be available to provide consultation and support to grantees. We encourage everyone involved in reporting to reach out to us to learn more about what each metric means and how to measure it. We have prepared a set of information resources, available on the Evaluation portal on Meta, that walk through each of the Global Metrics and explain how to gather the relevant data. We will work with community members over the next few months to further develop these resources and to create new ones. Please check Grants:Evaluation/News and follow @WikiEval on Twitter for updates. We also encourage all community members to comment, share concerns and ask questions about Global Metrics. Join the conversation on the talk page and reach out to the team at eval [at] wikimedia [dot] org: come talk to us, let's do better together!
Anasuya Sengupta, Senior Director of Grantmaking, Wikimedia Foundation
María Cruz, Community Coordinator of Program Evaluation & Design, Wikimedia Foundation
Notes
[1] Read the Impact reviews: First batch of FDC reports for Annual Plan Grants: https://commons.wikimedia.org/wiki/File:Learning_and_Evaluation._FDC_Impact_2012-14.pdf
Project and Event Grants 2013-14: https://meta.wikimedia.org/wiki/File:PEG_Impact_learning_series_-_2014_July.pdf
Ideas for social media messages promoting the published post:
Twitter (@wikimedia/@wikipedia):
(Tweet text goes here - max 117 characters) ---------|---------|---------|---------|---------|---------|---------|---------|---------|---------|---------|------/
Facebook/Google+
- ...