User:HaithamS (WMF)/Learning & Evaluation/About

The Wikimedia Foundation is increasingly focused on effective use of movement resources, and Grantmaking is the primary channel for this. Grants, though, are only useful if we learn from them and invest in activities that drive our movement towards its goals! The Grantmaking Learning & Evaluation (L&E) team is designed to

(a) assist the grants officers in executing their programs effectively and efficiently by providing frameworks, tools, and research,
(b) provide a hub of learning and evaluation materials for our grantee partners, and
(c) execute and reflectively apply research and evaluation work to inform WMF’s overall grantmaking strategy.

Learning & Evaluation is the connecting piece of the Grantmaking work, guiding resources to our grantee partners and enabling a better understanding of the impact of our overall grantmaking work. L&E collects and uses inputs from various experts across the movement -- e.g., grantees, grants committee members, volunteers, and WMF -- to better understand the strategic approach we should take to grantmaking. For example, is it sustainable for us to fund movement partners to grow at 30% rates? What type of organization or organizational context is required to build an excellent GLAM program? Within WMF, the L&E team will work closely with the Program Evaluation & Design, Finance, Analytics, and Community Advocacy teams to help answer these questions.

The work done by the L&E team will focus on answering two meta-questions for the WMF Grantmaking team:

  1. How can the movement partners be best poised to achieve impact?
  2. What can WMF Grantmaking provide to our partners to help them achieve impact?

While these questions are big and daunting, and certainly won’t be answered quickly, the L&E team is preparing to tackle just a few pieces of them over its first two years (2013-14). Specifically, the team is focusing on the following components of the grantmaking process:

Areas of focus and example activities/projects:

  1. Designing good measures for organizational trajectories of growth and effectiveness for our movement partners, based on both Wikimedia movement observation and external research
    • Commission external researchers to help us adapt an Organizational Effectiveness index
    • Work with the community to better understand the importance of organizational effectiveness in the context of our movement
  2. Developing systems and tools to enable learning and development internally and externally
    • Internal: Create a cross-grant programs framework for structured data collection and analysis
    • External: Create and maintain a set of technical tools and other online resources to support research and project management by movement partners
  3. Mapping of communities and programs to build baselines and identify strategic areas, including careful incorporation of the Global South strategies
    • Community maps of priority Global South networks
    • Databases for the languages we are supporting the most
    • Database of the funds and timing of investments in our priority target groups
  4. Fostering a collaborative learning agenda across the grantee partners (i.e., grantees, grant committees, grantmaking staff)
  5. Analysis of databases, reports, and research across the movement to create best practices and inform overall grantmaking processes
    • Quarterly reports
    • “ROI” for grants programs
    • Identifying under-penetrated and over-penetrated areas

Ongoing evaluation of Grantmaking

We have built into our grantmaking processes a series of self-evaluation methods, to ensure that we are doing our grants in the best way possible! Collecting feedback and self-evaluating is a critical component of the Wikimedia Grantmaking process. The radically participatory and transparent nature of our grantmaking process has little precedent in the broader grantmaking field, resulting in an even stronger need for review of, and reflection on, our processes and impact. Such reflection will help us better assess what may need to be improved or better coordinated.

Programs and Stakeholders

All components of WMF Grantmaking are reviewed on a regular basis. This includes:

  1. Individual Engagement Grants
  2. Project and Event Grants
  3. Annual Planning Grants
  4. Travel and Participation Support Grants

These will also be rolled up to review the Grantmaking work overall.

The perspectives of the different stakeholders in the grantmaking ecosystem have to be taken into account as well. We are looking specifically for inputs from the following groups: all applicants, grantees, grants committee members, WMF staff, and the broader community.

Tools in place

Process evaluation

Our primary goal for these surveys is to get feedback on our grantmaking processes from their primary stakeholders: those who submit and/or complete grant-funded projects, and members of the volunteer committees that select grants for funding. Surveys are generally deployed at two particular points in the grantmaking process: after funding decisions have been made, and after a grant project has been completed. These surveys provide us with data to help us improve the process.

We are currently in the process of deploying post-decision and post-project surveys for all four grantmaking programs. They are unlikely to be a permanent fixture of grantmaking: once we feel our grantmaking processes are in good shape, we will no longer need to ask all submitters/grantees/committee members to fill out additional surveys.

If you have general questions or comments about WMF surveys, beyond a particular survey, please ask them at the Evaluation portal Q&A page or the primary L&E talk page.


"Stakeholder" Individual Engagement Annual Plans Project & Events Travel & Participation Overall Grantmaking
All Applicants *Aggregate feedback (one time, for now)
Grantees
i.e., those applicants selected to receive funds
Post-grant survey Post-decision process survey
Committee Members Process survey (in development)
WMF Staff involved
Larger community members
  • Editor survey (TBD)

Full program evaluation

Individual Engagement Grants

Following each round of funding, we will do a thorough review of the grantees to better understand their impact, experience, and ongoing contributions. We will collect this information primarily through the synthesis of the completed final reports, supplemented with a post-grant survey.

Annual Plan Grants

See "Feedback and continuous improvement of the FDC Process" for more information
As the Annual Plan Grants are currently on a two year "trial" period, we are very carefully assessing the work being done here. There are four main mechanisms for collecting feedback and continuous improvement:

  1. The ongoing feedback collected by the Ombudsperson
  2. The FDC recommendation complaints process
  3. The annual assessment of the FDC process, which will include:
    • A survey for involved parties after each funding round
    • Feedback gathered in the annual editor survey
    • A summary report from the Ombudsperson from all feedback collected over the course of the cycle
    • An assessment of the time and financial costs of the FDC
    • An objective assessment of the effectiveness and outcomes of funds allocated in the cycle
    • A self-assessment of the FDC's performance by FDC members
  4. An annual FDC report summarizing learnings and reflections on past grants

Project & Event Grants

Largely TBD. So far we have done only one outsourced Grants retrospective, capturing history rather than fully assessing impact. We are in the process of doing a comprehensive impact review of the Project & Event Grants, and will then do bi-annual reviews of PEG using grant reports and post-grant input surveys.
Travel & Participation Support

The Travel & Participation Support program is going through a period of self-reflection that includes a survey of program participants and an analysis of grant proposal data. Results will be posted here as soon as they are ready.

Ongoing evaluation of the Grantmaking pipeline

We want to fund projects and groups that can drive change in the Wikimedia world. To do this, we are focusing on two channels for increasing the pipeline of potential grantees: (a) encouraging innovative ideas/projects (IdeaLab) and (b) bringing in new grantees from areas of the world that are under-represented on the Wikimedia projects (the "Global South" strategy).

IdeaLab

We are focusing on the following questions in FY2013-14 regarding the IdeaLab:

  1. What is an effective way to source and incubate a variety of new project ideas as a pipeline for grantmaking?
  2. Does IdeaLab result in more good grant proposals to choose from?
  3. Does it result in more diverse grant proposals (Global South and gender) and a wider range of grantees to choose from?
  4. If so, which elements/pathways within the IdeaLab experience contribute most to these outcomes?

Tools in place for assessment

  • Some questions on the Post-funding decision survey and post-grant survey for IEG
  • Focus groups with select IdeaLab participants, committee members, and relevant WMF staff
  • Tools monitoring activity within the IdeaLab, to gauge whether or not it is actually being used

Global South

Wikimedia Grantmaking intends to develop and execute an innovative grantmaking and community growth strategy for supporting under-resourced and emerging regions, languages, and communities in our movement, particularly in the Global South (GS), that will build upon the learnings from the catalyst projects. While continuing to be global, and supporting other Global South and emerging communities, we will have a specific focus for the next two years on 8-10 geographies and languages with high potential, and build community and content through grants as well as proactive community and leadership development. The questions we are hoping to answer throughout FY 2013-14 are:

  1. What can our funds do where no core active editing community exists? (i.e. not just Congo, but also Kenya)
  2. What constitutes waste in GS spending? How much do we put in before we declare failure?
  3. How can we leverage partnerships in the GS to expand our mission work? What are the profiles of such partners we should look for in the GS?

Tools in place for assessment

  • Global South Dashboard
  • Mapping of different communities (TBD)

Gender Gap

TBD

Overall grantmaking review

We take the inputs from all of the above to do a full assessment of our grantmaking program, which happens quarterly as part of the WMF Quarterly Reviews with the Executive Director.

Tools in place for assessment

  • All tools listed above
  • Quarterly Reviews with the WMF Executive Director
  • Quarterly metrics on meta
  • Ongoing monitoring of grant spending via the Fluxx tool (results in quarterly review) (TBD)

Other projects

  • Research on external movement organizational strategies (TBC)
  • Benchmarking research on other grantmakers (TBC)
  • Research: Wikimedia First Employee Study
  • Background research on legal and financial restrictions in grantmaking worldwide