Wikimedia Conference 2016/Program/31
31: Create your own metrics: examples from Wikimedia France and Wikimedia UK
How to move forward
Daria Cybulska presenting
- What was this session about?
Explaining how Wikimedia France and Wikimedia UK learned to develop their own metrics.
- What are the next steps to be taken?
This session was not intended to produce next steps.
- Who is the person to reach out to?
Daria Cybulska is approachable and happy to help. The Evaluation team at the Wikimedia Foundation can support you as well.
- Photos
- Slides
This session documentation was approved by one of the speakers.
- Original Description
- Wikimedia France and Wikimedia UK will present how they developed their metrics over recent years. They will explain what they measure beyond Global Metrics, also looking at some similarities in the metrics they capture (like volunteer hours). Depending on the audience, the format could be an in-depth discussion about their metrics and how they could be developed, or a more general plenary with WMF involvement.
- Desired Outcome
- Understanding of the possibility of using indicators other than the Global Metrics, and possibly how to set those up. Learning about what other organisations are capturing and what their challenges are. Appreciation that the same set of metrics wouldn’t work for all organisations, and that even if a metric has the same name, it may be capturing something different in different organisations.
- Session Format
- Listening (30 min of presentations from the speakers + 30 min of discussion)
- Speakers
- Daria Cybulska (WMUK) and Mathieu Denel (WMFR)
- Summary of the session
Daria Cybulska (Wikimedia UK) and Mathieu Denel (Wikimedia France) opened the session on how to create your own metrics for your organization. The session started with Daria presenting her slides and talking about the development process for metrics / measures of success at Wikimedia UK. Daria explained that Wikimedia UK developed its own metrics through trial and error, a couple of years before Global Metrics were introduced. This allowed time to experiment and ‘test’ metrics: there were some metrics the chapter agreed on, but discovered in the course of the year that they were too complicated or impractical to record. It is only once you start reporting on a metric that you test whether it works. Through this, WMUK has changed its metrics from year to year, although keeping the same metrics allows for comparisons and better planning, which is the whole point of metrics reporting.
To establish KPIs, Wikimedia UK worked ‘backwards’ (from outcomes to metrics): you need to know what you want to achieve first, and then work out how to capture your progress. She also pointed out that there are different metrics that could be used to measure the same goal; it is best to choose the one that works best and is most meaningful for the organisation.
As an example, Daria took the metrics on volunteers’ work. She explained that “volunteering” is a very broad term, especially in the Wikimedia movement, so it needs to be specifically defined in the context of a given organisation, so that people know what to report and measure.
Specifically, she mentioned metrics for volunteer hours (how many hours did the volunteer work?), “lead” level (is the volunteer in a lead role?), gender (gender balance), chapter satisfaction (how satisfied is the volunteer with the chapter and its work?) and skill development (did the volunteer acquire new skills?).
Daria also explained that they have a metric for newly registered users but not for their retention, as keeping new editors is not a goal in their strategy. “Individuals involved” was a difficult metric with different understandings of what it means, she explained. Currently it includes anyone doing some activity with the chapter (for example participating in or organizing a workshop, but not necessarily editing).
Daria also said that there is a bit of human judgement involved in measuring the number of “leading volunteers”. When logging contributions and time, WMUK decides whether to consider something a lead role or not; the test question is: would this have happened if this volunteer wasn’t involved? Daria advised to define the metrics clearly and to write down how to measure them. She mentioned that getting everyone in an organisation to understand a metric in the same way is actually a challenge and needs to be worked on.
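The session did not show a concrete reporting format for these volunteer metrics. As a purely illustrative sketch, a single contribution record covering the measures mentioned above could look like the following; all field names and values are assumptions for illustration, not WMUK’s actual schema.

```python
# Illustrative sketch of a single volunteer contribution record, covering the
# measures mentioned in the session (hours, lead role, gender, chapter
# satisfaction, skill development). Field names are assumptions, not WMUK's
# actual reporting schema.
from dataclasses import dataclass, field
from typing import List, Optional


@dataclass
class VolunteerContribution:
    volunteer_id: str
    activity: str                       # e.g. "supported a GLAM workshop"
    hours: float                        # volunteer hours spent on the activity
    lead_role: bool                     # would this have happened without them?
    gender: Optional[str] = None        # used for gender-balance reporting
    satisfaction: Optional[int] = None  # e.g. a 1-5 survey score
    new_skills: List[str] = field(default_factory=list)  # newly acquired skills


record = VolunteerContribution(
    volunteer_id="vol-042",
    activity="organised an editathon",
    hours=6.0,
    lead_role=True,
    satisfaction=4,
    new_skills=["event organisation"],
)
print(record.hours, record.lead_role)
```

Writing the metric down in a form like this also answers Daria’s point about shared understanding: the record itself documents what counts as hours, a lead role, and so on.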
She then explained how WMUK measures content contributions, which focuses on reusing the Global Metrics. She mentioned, though, that they have an additional measure, the percentage of image reuse, which is useful as it indicates at a glance how useful the image uploads were.
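The exact formula behind this figure was not spelled out in the session. One straightforward way to compute such a percentage, shown here only as an assumed sketch rather than WMUK’s actual tooling, is the share of chapter-supported uploads that are used in at least one article or page.

```python
# Assumed sketch (not WMUK's actual tooling): the percentage of
# chapter-supported uploads that are reused, i.e. used in at least one page.
def image_reuse_percentage(uploaded_files, files_in_use):
    """uploaded_files: files uploaded through chapter-supported activity.
    files_in_use: files known to be used in at least one article or page."""
    uploaded = set(uploaded_files)
    if not uploaded:
        return 0.0
    reused = uploaded & set(files_in_use)
    return 100.0 * len(reused) / len(uploaded)


# Hypothetical example: 120 of 400 uploads are in use somewhere -> 30.0
uploads = {f"File:Example_{i}.jpg" for i in range(400)}
in_use = {f"File:Example_{i}.jpg" for i in range(120)}
print(image_reuse_percentage(uploads, in_use))  # 30.0
```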
One participant asked how WMUK measured its advocacy work. Daria explained that understanding the UK-specific context was important: in the UK, when the government is preparing a paper, it runs consultations and anyone can submit evidence or an opinion. WMUK counts the number of consultations the chapter participated in. She explained that this wouldn’t take into account whether a submission was successful, but it was more practical to measure.
After Daria’s presentation, Mathieu presented his slides and explained how Wikimedia France developed four additional metrics beyond Global Metrics:
- Press mentions
- Files uploaded with the chapter’s support
- Partners’ satisfaction
- Volunteer hours
Someone asked who the “partners” that WMFR surveys are. Mathieu answered that these are the organizations they work with, for example GLAM and similar institutions. Daria commented that this is an example of a “vague metric” that can be understood in different ways.
Another participant asked what the use of capturing volunteer hours was and why WMFR measured it. Mathieu answered that it was important for recognizing volunteers’ work, and that they also calculated its monetary value using the legal minimum wage. Daria added that it was common in the UK as well to calculate the value of volunteer work that way. A participant added that volunteer hours are a good metric for persuading partners.
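As a worked example of the calculation described here, the value is simply the total of reported volunteer hours multiplied by the legal minimum hourly wage; the figures below are placeholders, not actual WMFR or WMUK numbers.

```python
# Worked example of the calculation mentioned in the session: the monetary
# value of volunteer time is the total of reported hours multiplied by the
# legal minimum hourly wage. Both figures below are placeholders.
MINIMUM_HOURLY_WAGE = 9.67    # placeholder rate, local currency per hour
total_volunteer_hours = 2500  # hypothetical total reported for the year

monetary_value = total_volunteer_hours * MINIMUM_HOURLY_WAGE
print(f"Estimated value of volunteer time: {monetary_value:,.2f}")
# Estimated value of volunteer time: 24,175.00
```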
Mathieu added that at Wikimédia France a group of 30 people worked on the metrics; at WMUK, Daria explained, some people from the board were involved, but there was no broader participatory process.
A participant raised the question of technical difficulties with measurement tools. A WMF staff member explained they were trying to get some engineering time for the tools, but it was a lot of work. Possible contacts at WMF were Ryan Kaldari, Danny Horn, Amanda Bittaker, Alex Stinson, Tighe Flanagan.
Daria and Mathieu closed the session.