Assessing impact and collecting metrics
Any good program keeps track of what it's doing and how it's working so that it can adapt and grow in response to results from projects and feedback from the community.
Each project you do should have some way to answer the question, "Is it working?" Deeper questions matter too, like "How is it working?" and "How is it not working?" A mix of quantitative data and qualitative feedback can give you a good sense of this and help you plan and learn.
Take journal donations as one example:
- Survey editors to find out which sources they want most; tally the !votes and pursue those in highest demand
- Count the total number of partners you have, the total number of accounts they've donated, the total number of people who signed up, and the total number who received access (note that WMF currently does this centrally)
- Check in after 4–6 months to survey editors about the source they are using, how much they are using it, how useful it is to them, etc.
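The first step above is a simple tally. As a minimal sketch (the source names and vote list here are purely illustrative), counting and ranking editor !votes might look like:

```python
from collections import Counter

def rank_sources(votes):
    """Rank requested sources by number of !votes, highest demand first.

    `votes` is a list of source names, one entry per editor !vote.
    """
    return Counter(votes).most_common()

# Hypothetical survey results: one entry per !vote
votes = ["JSTOR", "Newspapers.com", "JSTOR", "Elsevier",
         "JSTOR", "Newspapers.com"]
for source, count in rank_sources(votes):
    print(f"{source}: {count}")
```

In practice the !votes would come from an on-wiki survey page or a form export rather than a hard-coded list, but the tally-and-sort logic is the same.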
Other projects should also be documented, for example:
- Resource Exchange (Share): How many requests came in, how many were successfully delivered
- Visiting Scholars or Interns: How many institutions were involved, how many positions/classes were filled, how many articles were created, how much content (bytes) was added
- Reference Desk: How many questions were asked, how many users visited/edited the page, how many pageviews are there for each month
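Monthly pageview counts like those for the Reference Desk can be pulled from the Wikimedia Pageviews REST API. As a sketch, the per-article endpoint takes a project, access method, agent type, URL-encoded page title, granularity, and a date range (the page title and dates below are just examples); this helper only builds the request URL, leaving the actual fetch and JSON parsing aside:

```python
from urllib.parse import quote

# Base of the Wikimedia Pageviews REST API per-article endpoint
BASE = "https://wikimedia.org/api/rest_v1/metrics/pageviews/per-article"

def pageviews_url(project, article, start, end,
                  access="all-access", agent="user", granularity="monthly"):
    """Build a per-article pageviews request URL.

    `start` and `end` are YYYYMMDD date strings; spaces in the page
    title are converted to underscores and the title is URL-encoded.
    """
    title = quote(article.replace(" ", "_"), safe="")
    return f"{BASE}/{project}/{access}/{agent}/{title}/{granularity}/{start}/{end}"

print(pageviews_url("en.wikipedia.org", "Wikipedia:Reference desk",
                    "20240101", "20240630"))
```

Fetching that URL returns one JSON record per month, which can be copied into whatever tracking sheet the program uses.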
While each situation is different, here is a general approach to working with metrics:
- Identify key metrics
- Identify main tools available for collecting metrics
- Develop tracking documentation
- Regularly update metric tracking materials
- Communicate metrics to key stakeholders
- Analyze the metrics to iterate and improve processes
An important point to note is that there are sometimes metrics that you wish you had or need, but that you can't collect because a tool or survey just can't capture that data. If that's the case, get as close as you can with a "substitute metric" and investigate the possibility of building or improving a tool which can do it.
- People: One or two volunteers should be enough: the English Wikipedia Library branch tracks all of the metrics for its 35+ publisher partnerships through one volunteer operating at 3–6 hours a week. Other metrics for other programs are limited-time activities which are tracked only occasionally over the course of a program.
- Skills: familiarity with tools on- and off-wiki for tracking pageviews, link usage, and contributions; knowledge of basic spreadsheet functionality; ability to interpret and communicate basic changes in metrics
- Time: the time required scales with the number of partnerships and programs. The greatest time intensity is defining and establishing processes for metrics tracking – estimate 4–5 hours of prep per new metric, covering both exploring tools and creating documentation. Established metrics processes require regular maintenance and communication of those metrics (5–10 hours a month for several partnerships, scaling up as partnerships and other programs grow).