Program evaluation basics: efficiency, effectiveness and impact

From Meta, a Wikimedia project coordination wiki

This page introduces the reader to three basic concepts of program evaluation: efficiency, effectiveness and impact. It defines these terms and illustrates them with real-life examples from the Wikimedia context. After reading this page, you will have a better understanding of each of these terms and of what they mean for doing programmatic work within the Wikimedia movement.

Defining program efficiency, effectiveness, and impact

A great starting point for talking about program evaluation is to get a better understanding of the concepts of "efficiency", "effectiveness" and "impact". Once we agree on how we define and use these terms, we will share a common language for everything that follows, because all three are things that we will want to measure later on.

Program efficiency relates to an analysis of the costs – money, people, time, materials, etc. – that are expended as part of a program in comparison to either their benefits or their effectiveness (Boulmetis / Dutwin 2011, p. 5). What does this mean in our context? Let's consider two Wikipedia editing workshops executed by grantees A and B. Both grantees staged their workshops as one-day events, and both had 30 participants attending. Grantee A asked 5 Wikipedians to teach the participants how to edit Wikipedia, whereas grantee B needed 10 Wikipedians for the same task. Now, let's assume that both workshops had the same outcome: every single participant's knowledge of how to create a user account, start a new article, work in sandboxes and upload a picture to Wikimedia Commons increased significantly, and at the end of the day, every participant was able to contribute to Wikipedia and to Wikimedia Commons (now, I guess you're getting it: this is a hypothetical example; it might never occur in reality, but let's take it as a way to explain the phenomenon of "efficiency"). Determining the efficiency in those two cases is easy: grantee A's workshop was twice as efficient as grantee B's. Grantee A needed only half the number of people to achieve the same result as grantee B, whatever the reason might have been. Maybe grantee A selected Wikipedians who already had some experience in teaching newcomers how to edit, or grantee A's workshop had a better agenda that enabled a smaller number of trainers to cover the same amount of content. Let's not get further into the details here and move on to the next term instead.
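The comparison above can be sketched in a few lines of Python, using the hypothetical numbers from the example (the `efficiency` function is only illustrative, not part of any evaluation toolkit):

```python
def efficiency(outcome, cost):
    """Units of outcome achieved per unit of cost expended."""
    return outcome / cost

# Both grantees trained 30 participants (same outcome),
# but spent a different number of trainer Wikipedians (cost).
eff_a = efficiency(30, 5)   # grantee A: 5 trainers
eff_b = efficiency(30, 10)  # grantee B: 10 trainers

print(eff_a / eff_b)  # 2.0, i.e. grantee A was twice as efficient
```

The same ratio works for any cost measure (money, staff hours, materials), as long as both programs are compared on the same outcome.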

Program effectiveness relates to the degree to which the activities of a program produce the desired effect. Let's consider a grantee C who receives money for creating online training materials for Wikipedia newcomers. Those training materials introduce new editors to the basic concepts of how to contribute. As the materials are freely available online and many people can access them, grantee C has achieved a high level of efficiency. Why's that? Grantee C can reach a much larger audience with her online training than grantees A and B, so the materials prove to be more cost-efficient (let's just assume that C's investment was no bigger than grantee B's). But are the online materials that grantee C created also as effective as the trainings that grantees A and B executed? As it turns out (oh, you're getting it again, right? This is also a hypothetical example; of course there are online trainings that can be as effective as in-person trainings. I'm just making this up to explain the concept of "effectiveness"), after measuring the results, grantee C finds out that her online course was not as effective as she'd hoped. Only 10% of the participants who took the online course learned the basics of how to edit Wikipedia. Thus C's program was not as effective as A's and B's programs. So, when you look at the effectiveness of your program, you are asking whether its activities did what they were supposed to do. Therefore, a program's effectiveness "is measured in terms of substantive changes in knowledge, attitudes, or skills on the part of the program's clients" (Boulmetis / Dutwin 2011, p. 6). Now, let's move on to the last term.
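In the same sketch style, effectiveness is simply the share of participants who showed the desired change. The head counts for grantee C below are illustrative, since the page only gives the percentage:

```python
def effectiveness(achieved, total):
    """Share of a program's participants who showed the desired change."""
    return achieved / total

# Grantees A and B: all 30 workshop attendees learned to edit.
workshop = effectiveness(30, 30)
# Grantee C: only 10% of course takers learned to edit
# (illustrative counts; only the percentage is given above).
online = effectiveness(10, 100)

print(workshop, online)  # 1.0 0.1
```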

Program impact is the extent to which long-term and sustained changes occur in a target population (Boulmetis / Dutwin 2011, p. 7). Now we're really getting into what doing programs is all about. Let's say that one of our strategic goals is to get more people to contribute to Wikipedia and to improve Wikipedia's coverage and quality of content. The impact of our programs can then be measured by looking at how many people actually edit Wikipedia after going through one of our programs and at how much these people's work improved Wikipedia. To explain this further, let's get back to our two prior examples. Half a year after executing their programs, grantees A, B and C decide to measure their programs' impact. They look at the participants who went through their workshops (A, B) or used their online materials (C) and count the number of people who actively started editing. They also measure the amount of content that those people contributed to Wikipedia. As it turns out (and this is still hypothetical, I almost don't dare to mention it again), out of the 60 people who attended grantees A's and B's workshops, 20 became active Wikipedians, each improving more than 50 articles. Grantee C had a different outcome: more than 1,000 people took her online course, and 10% of those became active Wikipedians, each of them also improving more than 50 articles. That's more than 100 new Wikipedians and more than 5,000 articles improved for C's program, whereas grantees A's and B's efforts resulted in 20 new Wikipedians who together improved more than 1,000 articles. Although C's program was less effective, it had a bigger long-term impact on Wikipedia.
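The impact comparison works out like this (a minimal sketch with the hypothetical numbers from the example; the `impact` function is again only illustrative):

```python
def impact(new_active_editors, articles_each):
    """Long-term change: sustained new editors and the articles they improved."""
    return new_active_editors, new_active_editors * articles_each

# Grantees A and B combined: 20 of the 60 workshop attendees stayed active.
workshops = impact(20, 50)
# Grantee C: 10% of 1,000 online-course takers stayed active.
online = impact(100, 50)

print(workshops)  # (20, 1000)
print(online)     # (100, 5000)
```

The conversion rate per participant is lower for the online course, but the much larger reach gives it the bigger total long-term impact.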

