This was a draft for a blog post that has since been published at http://blog.wikimedia.org/2014/08/29/evaluation-portal-on-meta-a-redesigned-space-for-learning/
Evaluation Portal on Meta: A Redesigned Space for Learning
Just over one year ago, the Wikimedia Foundation started talking about evaluating programs like Wiki Loves Monuments and the Wikipedia Education Program. The goal was to help program leaders evaluate their activities and outcomes, so they could learn from the results and improve the effectiveness of their programs.
As we have engaged in this work, the collection of evaluation resources has grown significantly. In order to better support program leaders and the broader community in learning about evaluation, we had to reimagine our pages on meta. We are happy to introduce you to the newly redesigned evaluation portal!
Improved organization
The new portal has four main sections with evaluation resources: Study, Plan, Measure, and Share. Two other sections, Connect and News & Events, are spaces for networking within the evaluation community through talk pages and through online and face-to-face events. We’d like to take a moment to explain these sections and how they may be useful for anyone who wants to evaluate their programs.
- Study. Program evaluation is an academic field with its own language and theory. The Study section offers resources to help new evaluators learn the vocabulary, theory, and strategies of evaluation in the Wikimedia movement.
The Glossary is one of the most valuable pages: it defines key terms used in conversations about program evaluation. Explanations of phrases like program leader or program implementation are found here. With evaluation, it often helps to read what others have done; browse examples of how evaluation fits within the movement in Evaluation in the Wikimedia Movement: Case studies. Step-by-step guides called Learning modules walk through resources and tools for evaluating a program, covering topics such as writing surveys and using Wikimetrics.
- Plan. Evaluating a program requires planning in advance. This section of the portal covers the key steps in planning an evaluation: identifying goals, choosing targets for those goals, and deciding which metrics to use to measure those targets.
Choosing Goals and Measures provides guidance for setting outcome targets. Once you identify your goal (or goals), you might review Program Resources, a basic guide to best practices and associated program goals and metrics. If your program is slightly different, or if you are creating a new program, the Logic Model is a great process for mapping your program’s or project’s vision. Explore Learning Patterns related to implementation, such as how to collect usernames, how to ask about gender, or how to advertise a program.
- Measure. In order to evaluate a program, you must know what you will measure and how you will measure progress toward your goals. The Measure section can help: it provides strategies for collecting and keeping track of data.
Tracking and monitoring capture the data needed to tell the story of a program: how the program is working and where improvements might be needed. The Reporting and Tracking Toolkit offers guidance and templates for tracking a program, from the inputs (like hours or money), to the outputs (like the number of participants), to the outcomes (like the number of editors retained). Wikimetrics is a useful tool for easily measuring user contributions on Wikimedia projects. Meanwhile, surveys can measure participants’ attributes (e.g. gender, hometown), attitudes (e.g. motivation to edit), or behaviors (e.g. how many times they edit per week). The Survey Question Bank is a repository of questions searchable by program goal or survey goal, and Qualtrics, an online survey platform, is a tool program leaders may access for large-scale online surveys.
- Share. A key aspect of learning and evaluation is sharing what you know. This section is the portal’s space for the entire community to share the results of activities and evaluation related to Wikimedia programs.
Writing and sharing reports can be very helpful for learning from one another about evaluation strategies. Evaluation Reports (beta) is an initial collection of program impact reports that provides many details on the process and on ways to analyze data. Program leaders can also read or post Case Studies to showcase the work they have done. In addition to sharing reports, it is great to share tips or solutions to problems you have found along the way; creating or endorsing Learning Patterns is a great way to reflect and share with your peers.
Better spaces for communication
- Connect is a space for the evaluation community to talk about evaluation, metrics, and programs, and to meet one another.
If you are involved in planning, implementing, or evaluating Wikimedia projects and programs, add your photo to the Community section and share which programs you have been involved in. If you want to ask a question about evaluation, this is the place to post it on-wiki.
- News & Events is where the Learning and Evaluation team posts upcoming events related to Wikimedia learning and evaluation, whether hosted by us or heard about from community members.
We frequently host Virtual Meet-ups and training events to build our shared knowledge around programs, measurement, and evaluation. Follow this page to keep up with upcoming events and learning opportunities!
Visit the Portal @ meta:Grants:Evaluation
While the sections and resources in the portal will continue to develop, we hope the new organization will help all of us better navigate the useful content held there. Please visit the portal and let us know how it can help you! Feel free to post any feedback about the site’s organization or content.
As always, email eval@wikimedia.org if you have any questions!
Notes
Tags:
Evaluation, Program Evaluation and Design, Learning and Evaluation, Grantmaking
Category:
Program Evaluation and Design