Global metrics/Review/Summarized feedback

Executive Summary

  • Global Metrics has been used in many ways beyond its original intention of providing a high-level view of certain grant-related outcomes. See the Appendix for more detailed information on all the ways Global Metrics has been used, or has been attempted to be used.
    • However, there has been a spectrum of success in using the data in these different situations. Feedback from the grantees, Committee members, and WMF staff surveyed or interviewed indicates that the data has been most helpful in understanding the outputs and outcomes of planned activities. The data has been somewhat useful to grantees in helping them plan and refine their activities, and minimally useful to grant Committee members in making comparisons between grants.
  • In the WMF fiscal year 2013-14 (the year before Global Metrics became a requirement), reporting of quantitative or qualitative outcomes was largely optional, and those outcomes were therefore largely under-reported. One of the intended benefits of Global Metrics was to get a more complete view of a few quantitative outcomes, namely the six required metrics. We analyzed the total outcomes reported in all 2013-14 PEG, IEG, and APG grants and compared them to the completed 2014-15 PEG and APG grants (progress reports only for APG). This comparative analysis shows that the 2014-15 achievements have already far surpassed those of 2013-14 (a sketch of the ratio arithmetic appears after this list). So far:
    • 3 times as many participants have already been engaged through grant activities as in 2013-14
    • 1.5 times as many new images have been uploaded as in 2013-14
    • The proportion of new images used in Wikimedia projects is already 3 times that of 2013-14
    • 2 times as many pages / articles have been created or improved as in 2013-14
  • Many issues with Global Metrics were reported (see the Appendix for more detailed information). The issues fall into three main areas:
    • How the metrics are defined, e.g. metrics are defined inconsistently across grantees, and aggregating data across projects dilutes the value of the final number.
    • How the metrics are collected, e.g. collecting the information is very difficult, there are many tools but limited tool support, and the attribution of outcomes is unclear.
    • Outcomes that are missing, e.g. retention, partnerships, and building the credibility of the projects.
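
For a rough illustration of the arithmetic behind this comparison, the sketch below computes year-over-year ratios from per-metric totals aggregated across grant reports. All figures and field names in it are hypothetical placeholders, not the actual 2013-14 or 2014-15 totals; the proportion-of-images-in-use comparison works the same way, except that proportions rather than raw counts are divided.

```python
# Minimal sketch of the year-over-year comparison described above.
# All totals are hypothetical placeholders, NOT the real report figures.

BASELINE_2013_14 = {
    "participants": 10_000,
    "new_images_uploaded": 100_000,
    "pages_created_or_improved": 50_000,
}

CURRENT_2014_15 = {
    "participants": 30_000,
    "new_images_uploaded": 150_000,
    "pages_created_or_improved": 100_000,
}


def year_over_year_ratios(baseline: dict, current: dict) -> dict:
    """Return the current/baseline ratio for every metric present in both years."""
    return {
        metric: current[metric] / baseline[metric]
        for metric in baseline
        if metric in current and baseline[metric] > 0
    }


if __name__ == "__main__":
    for metric, ratio in year_over_year_ratios(BASELINE_2013_14, CURRENT_2014_15).items():
        print(f"{metric}: {ratio:.1f}x the 2013-14 total")
```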

Presentation given at Wikimedia Conference 2016

This summarized feedback was first delivered in a session called Global Metrics Retrospective at the Wikimedia Conference 2016 in Berlin, Germany. The presentation summarized the feedback collected; the following appendix provides additional detail and anonymized examples of each use case and issue.

Appendix

Demographics of those surveyed and interviewed

The first phase of the Global Metrics Review included gathering feedback from grantees, grant Committee members, and relevant WMF staff. The summarized feedback is based on interviews and survey data collected from:

  • 10 WMF staff from two teams: Community Resources and Program Capacity & Learning
  • 22 Grantees
    • IEG: 2
    • PEG: 6
    • APG: 8
    • Both PEG & APG: 3
    • Both IEG & PEG: 1
    • Unknown: 2
  • 8 Grant committee members
    • FDC members: 3
    • GAC members: 3
    • SAPG committee members: 2

Additional detail on the ways the data has been used

Below are the various ways in which the grantees, grant committee members, and WMF staff who were interviewed or surveyed indicated they had used, or attempted to use, Global Metrics.

Global Metrics have been used (with varying levels of success) to:

  • Understand outcomes: "For photographic contests, [we] only used number of new images and images added to pages. [We used] GLAMorous...Found it quite amazing - over 4000 images in this category. Counting them manually is a lot of trouble - it is quite lovely to get them by automatic tool." (A sketch of how such usage counts can be scripted appears after this list.)
  • Report to WMF: "Have used Global Metrics primarily in reporting to the Foundation."
  • Generally review grant proposals: "[As a committee member I] look to the numbers to understand what the grant will result in (articles, people, images). However, [it is] usually very hard for the grantee to predict the future. They are usually too high or too low compared to what actually happens."
  • Plan and refine goals & activities: "Global Metrics are low-level performance indicators to understand which article contests work at all, which work better than others, and also how they work (Is this for already active editors or newcomers? Does this support creation of articles or editing existing ones, etc.?). If Global Metrics were not there, some of the information would still need to be collected and has been prior to Global Metrics."
  • Understand achievement against goals: "Global Metrics is useful in comparing outcomes to goals, to see how they have achieved against those goals. [I] use their metrics to understand the effect of the work in the progress / impact reports."
  • Recognize anomalies: "[Global Metrics is] most useful when they are absent and should be there: If people are doing content-related activities and not giving explanation of what they are doing and they are not showing these metrics, it is useful to see."
  • Set strategy or annual plan: "Using all the data to build the next annual plan and strategy; Global Metrics is taken into account."
  • Allocate resources: "[For] allocating resources. Yes. [We have] stopped conducting one-time editing workshops. The workshops required extensive time and resources from volunteers and the results are quite poor."
  • Try and make comparisons between grants: "On the GAC, you’re reviewing more proposals...so you don't have as much time to delve deeper. It’s also much harder to compare grants, except in rare cases. Unless the metrics are exceptionally good or poor they aren’t as useful for PEG except when we have a set to compare to like Wiki Loves Monuments, or other sets of programs."
  • Understand cost-benefit ratio: "[I] do use the numbers, but give more importance to other results, e.g. motivation, [whether] the language is endangered. For projects with more budget, give more importance to the number; less budget, give more importance to other points. [It's] important to understand the cost-to-benefit [comparison]."
  • Report on work unrelated to a WMF grant: Quote unavailable
  • Communicate with outside partners: "[Global Metrics] represent... a solely Wikimedia-internal view of desirable effects of program activities. Therefore it's nearly impossible to use them as joint metrics to communicate with partners from outside the Wikimedia movement (e.g. in joint projects or if seeking 3rd party funds). Partners simply don't care too much about 'new active Wikipedia editors' and similar metrics."
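
The "Understand outcomes" quote above describes using GLAMorous to count how many images from a category are used on Wikimedia projects. As one possible illustration of how such a count could be scripted directly against the MediaWiki API (via the GlobalUsage extension on Wikimedia Commons), the sketch below tallies the files in a Commons category and how many of them are used on at least one wiki. The category name and helper function are illustrative placeholders; this is an assumed approach, not the tool or workflow the grantees actually used.

```python
# Illustrative sketch: count the files in a Commons category and how many of them
# are used on at least one Wikimedia wiki, via the MediaWiki API (GlobalUsage).
# The category below is a placeholder, not one taken from the grant reports.
import requests

API = "https://commons.wikimedia.org/w/api.php"
CATEGORY = "Category:Example images"  # placeholder category name


def category_usage_counts(category: str) -> tuple[int, int]:
    """Return (total files in the category, files used on at least one wiki)."""
    session = requests.Session()
    session.headers["User-Agent"] = "global-metrics-sketch/0.1 (illustrative example)"
    base = {
        "action": "query",
        "format": "json",
        "generator": "categorymembers",  # iterate over the category's files
        "gcmtitle": category,
        "gcmtype": "file",
        "gcmlimit": "100",
        "prop": "globalusage",           # usage of each file across Wikimedia wikis
        "gulimit": "max",
    }
    used = {}   # file title -> True once any usage has been seen in any batch
    cont = {}   # continuation parameters returned by the API
    while True:
        data = session.get(API, params={**base, **cont}, timeout=30).json()
        for page in data.get("query", {}).get("pages", {}).values():
            title = page["title"]
            used[title] = used.get(title, False) or bool(page.get("globalusage"))
        if "continue" not in data:
            break
        cont = data["continue"]
    return len(used), sum(used.values())


if __name__ == "__main__":
    total, in_use = category_usage_counts(CATEGORY)
    print(f"{in_use} of {total} files in '{CATEGORY}' are used on a Wikimedia project")
```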

Additional detail on all the issues raised

Each issue is listed with the larger issue it was categorized under (where noted) and an example quote:

  • Missing outcomes (categorized under: outcomes that are missing; see the Missing metrics / outcomes section below for more details): "The absence of social impact makes the evaluation very fragmented because the actions performed by affiliates exceed the quantitative framework."
  • Aggregation across different projects and programs (categorized under: how the metrics are defined): "Aggregation of the metrics dilute the meaning of the metric, to the point where people don’t believe in the conclusions drawn from them. We need to tie some meaning to each metric, that is understood and agreed upon with people on the ground."
  • Inconsistent definition of metrics (categorized under: how the metrics are defined): "[For 'Individuals Involved' there are] funny figures for programs because of the summation of estimated numbers (e.g. 500 visitors at a fair) and exact numbers (e.g. 2 individuals from GLAM partners). Whom to count is an issue, at least if you don't want to become completely arbitrary. In our case we have a long list of written and unwritten rules how to deal with this metric. This works for us most of the time, however it makes comparability with others illusive."
  • Time-frame too short to see outcomes (categorized under: how the metrics are defined): "The time-frame of the metrics is too short to sometimes see outcomes; need a longer time-frame question."
  • Outputs not outcomes (categorized under: how the metrics are defined): "Global Metrics are low-level performance indicators to understand which article contests work at all, which work better than others, and also how they work."
  • Hard to collect data (categorized under: how the metrics are collected): "[We] only use and track [Global Metrics] 50% of the time because some of the metrics are hard to track from offline to online."
  • So many tools, with limited support for existing tools (categorized under: how the metrics are collected): "Not being able to collect all the metrics in one place (one has to use Wikimetrics, GLAMorous, Quarry...) is a huge waste of time and energy. The tooling of these mandatory global metrics looks like it was an afterthought and was never quite really finished."
  • Attribution of outcomes / deciding what should be included or excluded (categorized under: how the metrics are collected): "The metrics are open to interpretation, and so [our chapter] has needed strict rules for what should be included or excluded. [This definition] … isn’t the same for all organizations and projects."
  • Burden of collection falls to volunteers or staff (categorized under: how the metrics are collected): "Given a large number of volunteer-led projects per year...Global Metrics do place a substantial burden on volunteers in terms of reporting requirements etc."
  • Wikimetrics (categorized under: how the metrics are collected): "Wikimetrics was helpful for [participants] that came at the beginning of the month, but for those that didn’t, had to do some adjustments and some had to do by hand. Did make me tired after some time, given the large amount of people that came in...I’m glad it wasn’t over 100."
  • Data lacks context (categorized under: how the data is used / requested): "The absence of social impact makes the evaluation very fragmented because the actions performed by affiliates exceed the quantitative framework."
  • Doesn't reflect program design, goals, or learning (categorized under: how the data is used / requested): "Global Metrics wasn't useful because for those metrics you need to prove that because of this [grant], this edit happened."
  • Unclear how data will be used (categorized under: how the data is used / requested): "[There is] concern about how this information is going to be used: WMF 'evaluating' them as grantees, and them evaluating their participants."
  • Internal Wikimedia movement view (categorized under: how the data is used / requested): "Global Metrics represent a solely Wikimedia-internal view of desirable effects of program activities. Therefore it is nearly impossible to use them as joint metrics to communicate with partners from outside the Wikimedia movement."
  • Unclear when applicable to a set of activities (categorized under: how the data is used / requested): "Global Metrics didn't really apply here, instead we measured engagement with our projects."
  • De-emphasizes other metrics & outcomes (categorized under: how the data is used / requested): "Reporting for mandatory global metrics was so time consuming other ways to capture outcomes were abandoned."
  • Improve existing resources: "More tools, for whatever metrics that will be collected"
  • Data doesn't reflect the quality of contribution: "Improvement can be a little correction or 10,000 new bytes. And all counts as one article improved. This metric does not take into account the quality of the improvements."
  • Data doesn't reflect the effort to create an entry in the projects: "Articles make sense for Wikipedia but what is an 'article' for other projects?"
  • Privacy concerns: "Active users who know about WMF are very negative about being reported on by us to the WMF."
  • Low / minimal volunteer awareness of or interest in Global Metrics: "Do not have any volunteers available that will be interested in collecting Global Metrics. So the ED spends a lot of time on counting bytes."


Missing metrics / outcomes

Those interviewed or surveyed identified many metrics that were missing from Global Metrics. Below is a list of those metrics, with a definition, explanation or example quote if needed.

Missing metrics / outcomes mentioned more than once:

Retention
Different types of contribution (e.g. Active and New editors don't capture other types of contribution, preferences in how to contribute, or the degree of involvement of a volunteer)
Partnerships
Building the credibility of the projects
Quality
State of the movement / community
Awareness
Professional development
Leadership development
Media coverage / Communications
Content gaps
Motivation
Advocacy
Program specific metrics (e.g. metrics specific for editathons)
Page views
Number of downloads
Contributions to MediaWiki
General longer-term outcomes
Tool development
Satisfaction

Missing metrics / outcomes mentioned only once:

Metrics specific to a particular wiki project
Environmental context
Finance
Adoption rate (for tool/features developed)
(Measuring a person's) comfort in contributing online
Number of words in an article
Community feedback
Re-use of images in other wikis
Re-use of images beyond Wikimedia
Community building
Qualitative feedback
How engagement changes over time
References added
Types of edits
Edit survival
Scalability of programs
Distribution in types of articles created
Translation of content
Engaging language communities beyond home language
Organizational effectiveness
Gender-based metrics