Program evaluation basics: the program impact model

This page aims to introduce the reader to the "Program Impact Model" as a basic model for how programs generate impact. It also helps program leaders understand some of the basic terms around program development and evaluation.

The Program Impact Model[edit]

After comparing the efficiency, effectiveness and impact of programs with each other, let's take the next step and look more closely at how impact is created (because, in the end, this is why we're running programs, right?) and how impact relates to the other components of a program.

If you're a program leader, you've most likely come across different types of programmatic impact in the past. Here are a few examples: the number of new contributors recruited through a program, the increase in the percentage of female Wikipedia editors on a specific wiki, the increase in article quality in a specific topic area on Wikipedia, etc. What all those different examples of impact have in common is that they are sustained, long-term changes affecting the Wikimedia projects (as opposed to "program outputs", which are products of program activities, or "program outcomes", which are benefits for the participants of a program; no worries – we'll get back to these terms shortly).

Now, let's take a look at a simple model for how to generate program impact and its different components:

[Diagram: The Program Impact Model]

In order to explain the different components of this model, let's make up a realistic example (yes, I wrote "make up" because – as you might remember – I'm using hypothetical examples in order to clarify concepts). So, let's assume that birdwatchers are an ideal target group for improving the quality of bird articles on Wikipedia. Our theory of change (we'll get back to this term later) is that if we just organized enough Wikipedia beginner workshops with birdwatchers, we could turn those people into long-term Wikipedians who would – in the long run – massively improve the topic area "birds" on Wikipedia. Now, let's take a look at which components would come into play:

  • Inputs are defined as resources dedicated to our program. In our case, that would be volunteer time, workshop rooms, projectors, printed educational materials that the birdwatchers will take home (like "Welcome to Wikipedia" brochures or handouts on how to upload photos to Commons), etc. A program uses inputs to support activities.
  • Activities are what the program does with the input to achieve its goals. In our example, that could be the process of developing a workshop agenda, recruiting Wikipedians as presenters, sending invitation emails to birdwatcher communities, etc. All these activities result in outputs.
  • Outputs are the direct products of program activities. For instance, the number of workshop sessions conducted, educational materials contributed, and participants served. These outputs should produce desired outcomes for the program's participants.
  • Outcomes are the benefits or changes for individuals participating in the program's activities. They may relate to knowledge, skills, behavior, and other attributes. In our example, an ideal outcome would be that all workshop participants learn how to edit articles, how to cite sources, how to upload pictures, and, finally, that every birdwatcher who participated in our program develops a huge appetite for improving Wikipedia's articles on birds. These outcomes should result in sustained impact.
  • Impact in our context is the extent to which program outcomes lead to long-term and sustained changes on Wikimedia projects. Here we go: as a result of our program, every single workshop participant continued to improve articles over the course of several years (this is still hypothetical, remember?), leading to sustained changes on Wikipedia: after five years, 50% of the articles in this specific topic area have improved by 70%. Also, our birdwatcher community started more than 500 new articles about bird species. Well, let's stop here… it's too nice of a dream.
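
To make these components a bit more tangible, here is a purely illustrative sketch in Python (not any official Wikimedia tool); the class name, field names, and example values are all made up for the hypothetical birdwatcher program described above.

  # Purely illustrative: the hypothetical birdwatcher program written down as an
  # inputs -> activities -> outputs -> outcomes -> impact chain.
  from dataclasses import dataclass


  @dataclass
  class ProgramImpactModel:
      inputs: list[str]      # resources dedicated to the program
      activities: list[str]  # what the program does with the inputs
      outputs: list[str]     # direct products of the activities
      outcomes: list[str]    # benefits or changes for participants
      impact: list[str]      # long-term, sustained changes on the Wikimedia projects


  birdwatcher_workshops = ProgramImpactModel(
      inputs=["volunteer time", "workshop rooms", "projectors", "printed materials"],
      activities=["develop workshop agenda", "recruit Wikipedian presenters",
                  "invite birdwatcher communities"],
      outputs=["workshop sessions conducted", "educational materials contributed",
               "participants served"],
      outcomes=["participants can edit, cite sources and upload pictures",
                "participants motivated to improve bird articles"],
      impact=["sustained improvement of the topic area 'birds' over several years"],
  )

  if __name__ == "__main__":
      # Print the chain stage by stage, in the order the model describes it.
      for stage in ("inputs", "activities", "outputs", "outcomes", "impact"):
          print(f"{stage}: {getattr(birdwatcher_workshops, stage)}")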


One of the missing links of program evaluation is lessons learned and how you want to incorporate them in order to improve the enterprise/organisation's capability.

Impact - Questions[edit]

I have three questions.
1. To my understanding, the way impact is currently measured is through statistics and visual representations. To make sure the impact is actually recorded, you therefore need to include specific activities:

  • making sure people register on the Wikimedia projects with a username (so you can trace them)
  • making sure female participants state somewhere that they are female if you want to record statistics related to gender
  • making sure templates are used (and remain there) if you want to monitor what happens to content donations.
  • recording relevant categories and monitoring them at the beginning of the project and after 6 months, 1 year, 5 years and so on (a sketch of this monitoring step follows below).
  • maybe, asking permission to trace and monitor the online work of the workshop participants after the workshops? (this is research on humans; we need to get authorizations and to consider that maybe workshop participants don't really like to be traced and evaluated for years; I personally would not like it, but maybe others like it, don't mind it, don't understand it, or see advantages in it)

In the example provided on this page, the activities I listed above are not mentioned, but they are actually necessary to determine/trace the impact envisioned.
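
As a concrete illustration of the category-monitoring step mentioned in the list above, here is a minimal sketch assuming the standard MediaWiki API and the Python requests library; the category name is hypothetical. Running it at the start of a program and again after 6 months, 1 year, and so on gives comparable snapshot counts.

  # A minimal sketch, assuming the standard MediaWiki API and the `requests`
  # library; the category name below is hypothetical.
  import requests

  API_URL = "https://en.wikipedia.org/w/api.php"  # adjust for the wiki being monitored


  def count_category_articles(category: str) -> int:
      """Return the number of main-namespace pages currently in the category."""
      params = {
          "action": "query",
          "list": "categorymembers",
          "cmtitle": f"Category:{category}",
          "cmnamespace": 0,    # main namespace only, i.e. articles
          "cmlimit": "max",
          "format": "json",
      }
      count = 0
      while True:
          data = requests.get(API_URL, params=params, timeout=30).json()
          count += len(data["query"]["categorymembers"])
          if "continue" not in data:       # no further result pages
              return count
          params.update(data["continue"])  # follow the API continuation


  if __name__ == "__main__":
      print(count_category_articles("Birds of Europe"))  # hypothetical category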

There are other ways to measure impact. You can have experts (academics, but also people who love birdwatching) evaluate the articles before the project and after several years. This approach has two disadvantages:

  1. ICT people and Wikipedians tend to believe more deeply in numbers than in comments (quantitative vs. qualitative);
  2. you cannot truly state that it was your project that made that impact; maybe it was just an extraordinary user who did huge and wonderful work on his/her own.

I thought about this quite a lot for the WikiAfrica project. The solution I came up with was to get an explicit statement from people/institutions: they agreed that what they were doing was done to support the project (we also asked them to comment on the project and say why they thought it was useful/relevant for them). In this way we could associate content, users, and templates with the project. This was not strictly related to the impact (we would have made an impact even if the people/institutions had just done it on their own): we did it to be able to claim the impact.

I think that from the perspective of the Wikimedia projects, claiming impact is contradictory/controversial/wrong. But from the perspective of project management we needed to do it. I proposed an approach that I really hope didn't disturb too much and somehow allowed people/institutions to freely make their decision. For the templates related to institutions we included information related to licenses and OTRS tickets, to make sure the template would also provide some relevant information from the Wikimedia projects' perspective.

My question is: does the Wikimedia Foundation really want to do it this way?
I think the WMF is the only institution that can actually understand that claiming impact on the content and contributors of the Wikimedia projects is contradictory/controversial/wrong. You can treat the fact that a project is not directly responsible for an impact as a risk you accept, because doing so better respects the Wikimedia community, the meaning of what people do on the Wikimedia projects, and the actual beneficiary of their work (the project, or just the vision of Wikimedia?).

2. Can we make sure that the tools we use to document impact actually register a change if our projects do make an impact?
If I actually increase the quality and quantity of African content on Wikipedia and the Wikimedia projects, will this show up in the way we currently monitor impact? Here are some examples:

  • On a map of contributors from all over the world, will it appear? The geography of Africa (with its size and the demographic concentration of people in specific areas) will probably always show Africa as a dark continent. Will the fact that one computer is used by many people affect the representation? I have the impression that this kind of representation will never make Africa look good. Certain countries/territories will be shining and colorful; the African continent will remain dark even if many more users than in Europe are editing the Wikimedia projects.
  • If contributors in Africa cannot directly upload content to Wikimedia Commons because they do not have a sufficiently stable internet connection (which is indeed a reality in many places), is it possible to acknowledge their contribution anyway? Maybe they are providing the authorization to use images. If I make the uploads, I cannot ethically use their account because it is private, and I cannot create a second account to use on their behalf because it would be a sock puppet. Can the author or source of the images also be considered a user if she/he/it actually agrees to contribute to the Wikimedia projects? Can this kind of contribution be acknowledged somewhere?

Please don't tell me "we can use different measurements", because it is not true. If measurements are not shared ones, the reality is that what does not fit will never fit, and it will be considered "B class" ("diverse but equal" is a politically correct way to make sure diversity remains over there, as nicely separated crap).

3. Can we also include the impact we do not want?
I think defining what we do not want to do is very healthy. It helps to create some boundaries and to also consider how we do things, not only what we can claim to have accomplished. These can be quite general issues (for example: we do not want to break Wikipedia rules) or more specific ones (for example: we do not want to pay for edits).
Thanks, --Iopensa (talk) 10:30, 20 June 2013 (UTC)

The model in practice[edit]

To see how Wikimedia UK has used this model in practice to help inform its five-year strategy, see here.