User:EGalvez (WMF)/Evaluation/Navigation bar

From Meta, a Wikimedia project coordination wiki

Let's Talk! - Program Evaluation and Metrics
[Image: Group visioning session, group one]

Summary

The Program Evaluation and Design team focuses on peer learning, practice sharing, and the adoption of a shared language and approach to program evaluation across the movement, so that high-impact programs become easier to discover and build on. Importantly, the team is charged with developing learning resources and tools that help program leaders self-evaluate their programs' efforts and impacts.

The first year

This past year, the Program Evaluation and Design team initiated a handful of strategies to help program leaders evaluate the impact of their work and share best practices. These capacity-building efforts are part of an initiative intended to support program leaders in designing and running effective programs, and to support grantmakers in making good decisions about how and where resources may be invested to achieve impact.

In brief, goals for program leader self-evaluation and sharing have included:
(a) supporting program leaders in learning to use evaluation to improve their program design; and
(b) working in collaboration with program leaders to develop and share a high-level understanding of which programs and activities reach different outcomes and levels of impact.

The year ahead

In the second year of the program evaluation capacity building initiative, the team aims to:

  • Continue to work with the grants officers to ensure continuous learning opportunities across the movement, including periodic hangouts, IRC sessions, one-on-one conversations, and face-to-face capacity-building workshops at larger Wikimedia events.
  • Develop and provide the evaluation resources most needed by program leaders and community organizers - data, tools, and information - using baseline information gathered through meet-ups, dialogues, surveys, and reporting.
  • Revisit and report back on the current year's seven program reports, and grow the depth and number of programs reviewed and reported on by at least three new program areas.
  • Work with successful program leaders throughout the movement to develop "program toolkits": blueprints for program components and processes that have proven to support impact. These toolkits will draw on case studies and on collaboration with program leaders interested in pursuing deeper inquiry through pilot program evaluation strategies.

Key questions

Program evaluation

  1. How should the program evaluation team evaluate and report on the program evaluation and design capacity-building initiative? What measure of success would you find most useful for assessing progress toward your team goals? (Discuss here)
  2. Examining the evaluation learning opportunities and resources made available so far, (a) what have you found most useful in supporting program evaluation and design, and (b) in what resource areas do you feel least supported? (Discuss here)
  3. Examining the evaluation reports (beta), please share the (a) strengths and (b) weaknesses of the metrics piloted. (Discuss here)

WMF grantmaking overall

  4. How should WMF grantmaking evaluate its various grantmaking initiatives to groups and organizations (e.g., annual plan grants, project and event grants)? What strategies and/or metrics would you recommend for assessing grantmaking to groups and organizations? (Discuss here)
  5. How should WMF grantmaking evaluate its various grantmaking initiatives to individuals (e.g., travel scholarships, individual engagement grants)? What strategies and/or metrics would you recommend for assessing grantmaking to individuals? (Discuss here)

PARTICIPATE!

If you are interested in providing thoughts and suggestions, please contribute! Each of the questions above has a space on the dialogue page in the evaluation portal, and other topics will inevitably be proposed and hashed out there as well. Sub-pages compiling suggestions and recommendations are also encouraged. :)

See also

Interested in reading the evaluation reports (beta)? Access the overview and all seven reports here.
Interested in sharing more in terms of specific measures and metrics? Participate in the metrics brainstorm!