Grants:APG/Staff proposal assessment form

From Meta, a Wikimedia project coordination wiki

The staff proposal assessment is one of the inputs into the Funds Dissemination Committee (FDC) proposal review process, and is reviewed by the FDC in preparation for and during the in-person deliberations each round. The purpose of the staff proposal assessment is to offer the FDC an overview of the expert opinions of the FDC staff team about this annual plan grant proposal. Members of the FDC review each proposal in its entirety, including relevant background documents, inputs from WMF grants officers, the learning and evaluation team, and finance staff. They also analyze the detailed discussions surrounding each proposal that include questions and comments from community members. FDC staff do not participate in the actual decision-making process of the FDC; their primary role is to identify key strengths and concerns for the FDC to consider while making funding recommendations.

This staff proposal assessment, which was refined after the first two years of the FDC process, consists of three parts: (1) Overview, including a financial summary and a summary of strengths and concerns; (2) Staff proposal assessment narrative, including a nuanced analysis of the programs, context, and the feasibility of each plan; (3) Staff proposal assessment rubric, assessing each proposal across three dimensions by identifying areas of strength and concern. Read more about the methodology behind the Staff proposal assessment rubric on this page.

Staff proposal assessments do not represent the views of any one staff person, since all FDC staff (unless one of them recuses) work together to complete each staff proposal assessment, and may consult with other key WMF staff and other experts as needed. While assessments will not be changed after they are published except to correct significant factual errors, organizations are welcome to share responses on the discussion page of each staff proposal assessment if they would like to provide the FDC with additional information or perspectives on the assessments.

Overview for all assessments[edit]

As in Round 1 of 2014-2015, proposals in 2014-2015 Round 2 are being assessed in the context of some preliminary impact data for grants and programs over the last two years, and our current and future approach is being guided by this analysis. In particular, we have been rigorous with our assessments of the Wikimedia organizations with the largest budgets and requests, as we expect them to be delivering outcomes to scale.

For example, we were able to access data from the first round of impact reports for grantees entering the APG process in 2012-2013 Round 2 (i.e. Wikimedia Norge and Wikimédia France), as well as progress reports for the current grants (i.e. Q1 and Q2 reports) and project grant reports for all organizations but Wikimedia Italia, which is new to the APG program and did not receive significant project funding in the past. This allowed for a more detailed assessment of proposed goals against progress in the current year, and of the current year’s progress against the current year’s goals. In some cases, we were also able to compare the current year’s goals and performance with past achievements. A move toward the inclusion of more easily comparable metrics in reports also assisted with this analysis.

There are a few overarching themes we noticed in this year’s proposals that we want to highlight. Please note that any examples used are meant to be illustrative, not exhaustive.

Train the Trainer approaches[edit]

In this round, we noticed a focus on Train the Trainer (TTT) approaches, which builds on the interest from Round 1. With a TTT approach, organizations target and train a group of participants on specific topic areas so they may in turn train others. If designed and implemented well, this program approach may have potential to expand reach in the movement and improve relevant skills beyond an organization’s direct sphere of influence.

This approach, which takes significant resources and organizational focus, requires more clearly articulated strategies, documented and centralized resources, and measurement and evaluation of results. Most organizations are not yet evaluating the skills developed, the results of trainings after the original trainings, or attitudinal shifts, and so it is difficult to understand the impact of these programs. Furthermore, we are concerned that we do not see plans for sustaining engagement with trainees after the trainings.

We encourage organizations to document the results of these programs and the resources invested in them. We also encourage organizations to develop engagement plans before new training programs are implemented. Finally, we encourage organizations to start small before deciding to invest more significantly in these programs.

Metrics and evaluation[edit]

As in Round 1, we realize that all Wikimedia organizations in this round are improving their measures of success and attempting to demonstrate impact. This work is not easy; we recognize the effort. For example, we are pleased to see some (including Wikimedia Armenia, Wikimedia Norge, and CIS) offer baselines and projected outputs and outcomes. This is a significant improvement from 2013-2014 Round 2 proposals.

However, not all programs include metrics, particularly around outcomes. We are very concerned, particularly in the case of the larger organizations, about the lack of baselines and of projected outcomes beyond outputs. Quantitative and qualitative evaluation methods are both important. The Wikimedia Foundation will provide continued support in developing evaluation capacity to organizations that request it. On the quantitative side, we encourage organizations to establish systems to track the global metrics they will need to report on, in order to understand their work in a broader context, and to continue to carefully consider which metrics will be most relevant in their own contexts.

Diversification of funds and resources[edit]

Especially for organizations with larger budgets, we are monitoring reliance on APG funding. Organizational reliance on APG funds (expressed as a percentage of total revenue) may be increasing for some, and in other cases may not be decreasing at a sufficient pace. In some cases, large organizations are not meeting fundraising targets or cultivating sustainable, longer-term institutional partnerships. We are particularly impressed with organizations in the global south that have been able to cultivate donors.

All organizations in this round are proposing significant total budget growth. Each organization’s growth must be examined in context, but in some cases this rapid growth does raise capacity concerns.

As we noted in Round 1, the diversification of resources is not only needed in order to maintain the sustainability of each individual organization and the global movement, it is also a way to build awareness of the Wikimedia mission.

Lobbying and political advocacy[edit]

In Round 1, we saw organizations express significant interest in policy, advocacy, and awareness.

In Round 2, Wikimedia Italia has proposed to engage a lobbyist to address legal threats to its program. Any lobbying work requires some oversight from the Wikimedia Foundation. We will monitor this work over time.

Increase in program activities and organizational capacity[edit]

In many cases, organizations are planning a large number of programs, each with many activities. This raises concerns around organizational capacity. Beyond the activity and staff expenses for each program, programs represent a significant investment of organizational focus. We encourage organizations to group or prioritize programs appropriately, both to present a clear picture of what they are trying to achieve and why, and to reduce the strain on volunteer boards, staff, online contributors, and offline volunteers that can result from too many dispersed activities. We encourage organizations to focus on their strengths rather than experiment in every area. In this way, organizations can experiment with thoughtful adaptations of existing approaches based on local context and the organization’s expertise, and with small-scale pilots with potential to scale up.

As we noted in Round 1 around proactive and responsive community engagement strategies, we realize it is not easy to achieve a good balance of responding to community needs and leading strategic work for the movement. Refining focus will be an ongoing process.

Diversity and work on projects other than Wikipedia and Commons[edit]

As in Round 1, an interesting development is the increasing focus on Wikimedia projects other than Wikipedia. For example, we have seen CIS and Wikimedia Armenia integrate work on Wikisource and Wiktionary into key programs.

As in Round 1, we are also assessing proposals across the dimension of diversity, and we have seen Round 2 organizations focusing on this in their contexts through both a regional and a gender-focused lens. Some organizations have diversity at the core of their missions (e.g. CIS, Wikimedia Armenia), some are approaching this work through a cross-cutting approach (e.g. Wikimedia Italia) and others have designed specific programs to address diversity (e.g. Wikimedia Norge).

Overview[edit]

Overall assessment and eligibility[edit]

Organization

{{{name}}}

Eligibility

Summary[edit]

Current (projected)

Upcoming (proposed)

Proposed change (as a +/- percentage)

FDC or PEG funding

Budget

Staff

Overview of strengths and concerns[edit]

This section summarizes the key strengths and concerns identified by staff, which are explained in the detailed narrative section of this assessment.

Strengths[edit]

Concerns[edit]

Staff proposal assessment narrative[edit]

This section takes an in-depth look at the strategy behind this proposal, this organization's broader context, and the feasibility and risks of this plan.

Context and potential for impact[edit]

This section takes a close look at this organization's context, since there will be unique factors specific to this organization or to its environment that enable this plan to have impact or that make this plan less likely to have impact. Here are some questions we will consider:

Environment[edit]

How does this organization's environment (including its local context and its community) enable this plan to succeed and have impact? Are there risks and opportunities presented by this organization's environment? Are there extraordinary events or other factors that need to be considered as part of this organization's context?


Past performance[edit]

Will this organization's past performance with program implementation enable this plan to succeed? Does it raise any concerns?


Organizational effectiveness[edit]

Will this organization's overall effectiveness enable this plan to succeed? Does it raise any concerns?


Strategy, program design, and program implementation[edit]

This section takes a close look at this organization's plans for its programs, including its past performance with program implementation. In this section, we may also identify specific programs that seem to have particularly high or low potential for impact. Here are some questions we will consider:

Strategy[edit]

Does this organization have a high-quality strategic plan in place, and are programs well-aligned with this strategic plan? Is this organization proposing significant changes to its strategy, or are the strategic plan and annual plan consistent with past strategies? Are this organization's strategy and programs aligned with the Strategic Priorities of increasing participation, improving quality, or increasing reach?


Program design[edit]

Do proposed programs have specific, measurable, time-bound objectives with clear targets, and are program activities and results logically linked?


Specific programs[edit]

Based on proposed plans and past work, which programs may have the highest potential for impact corresponding to the amount of funding requested, and which programs may not have impact corresponding to the funds requested?


Budget[edit]

Is this plan and budget focused on the programs with the highest potential for online impact?


Feasibility and risks[edit]

How likely is it that this organization can implement this plan successfully if funded? Is this plan likely to achieve impact corresponding to the funds requested?



Staff proposal assessment framework[edit]

This framework assesses annual plan grant proposals across the three dimensions of (1) Program design and strategic alignment, (2) Organizational capacity and effectiveness, and (3) Budgeting, by identifying strengths and concerns in each of these areas.

In each assessment, an organization’s context is taken into account, and so criteria may be applied differently in different contexts. For example, a smaller organization with less experience will not be expected to have the formal learning and evaluation program that a larger, more experienced organization might have, and so what may constitute a major strength for a smaller organization in a given category may not constitute a major strength for a larger organization. With this approach, the criteria listed here are not weighted equally, and so an “overall assessment” cannot be determined simply by adding up the numbers of strengths and concerns identified for each dimension. Also note that the scale here is not linear, and so we do not assign any numerical values to the assessments given for each criterion. More information about each criterion is shared in the assessment. We believe a framework like this is helpful, as it ensures that each proposal is assessed consistently, even as the criteria are applied with contextual nuance.

The dimension “Program design and strategic alignment” addresses many high-level elements, focusing on how likely programs are to achieve progress toward some of the Wikimedia movement's strategic priorities. Programs need to show clear links between strategic priorities and program objectives, program objectives and targets, and targets and program activities. In addition, programs should focus on areas that will impact participation, reach, and quality on the Wikimedia projects, with an emphasis on approaches and programs that increase diversity.

Meanwhile, the dimension “Organizational capacity and effectiveness” looks at the ability of an organization to achieve its plan based on both past experience (as an indicator of future success) and current practice. An effective organization both learns from past experience and contributes to movement learning in the process. This dimension also includes effective and stable leadership that models good practices, and a rational approach to growth that is supported by the organization’s strategy and context.

Finally, we analyze the organization’s past experience with and current plan for budgeting. Here, we look at how the organization has performed against its plans and budgets in the past.

To complete the assessment, we assess each of the criteria in each category according to the following scale, which is not necessarily linear:

  • Major strength
  • Strength
  • Neither a strength nor a concern
  • Concern
  • Major concern
Criterion Assessment Description
Program design and strategic alignment
P1. Strategy and alignment
P2. Targets and logic models
P3. Needs assessments
P4. Potential for impact at scale
P5. Evaluation methods
P6. New ideas and approaches, or thoughtful adaptations
P7. Diversity {{{P7}}} {{{P7 description}}}
Organizational capacity and effectiveness
O1. Past results
O2. Stability and leadership
O3. Learning from the past
O4. Improving movement practices
O5. Community engagement
O6. Capacity
Budget
B1. Past budgeting and spending
B2. Proposed budget is realistic
B3. Budget is focused on programmatic impact


This staff proposal assessment is the work of FDC staff and is submitted by:

Staff proposal assessment rubric description[edit]

This rubric assesses annual plan grant proposals across the three dimensions of (1) Program design and strategic alignment, (2) Organizational capacity and effectiveness, and (3) Budgeting, by identifying strengths and concerns in each of these areas.

In each assessment, an organization’s context is taken into account, and so criteria may be applied differently in different contexts. For example, a smaller organization with less experience will not be expected to have the formal learning and evaluation program that a larger, more experienced organization might have, and so what may constitute a major strength for a smaller organization in a given category may not constitute a major strength for a larger organization. With this approach, the criteria listed here are not weighted equally, and so an “overall assessment” cannot be determined simply by adding up the numbers of strengths and concerns identified for each dimension. Also note that the scale here is not linear, and so we will not assign any numerical values to the assessments given for each criterion. More information about each criterion is shared in the assessment. We still believe a rubric like this is helpful, as it ensures that staff analysis of each proposal is based on the same criteria, even as the criteria are applied in a nuanced way.

The dimension “Program design and strategic alignment” addresses many high-level elements, focusing on how likely programs are to achieve progress toward some of the Wikimedia movement's strategic priorities. Programs need to show clear links between strategic priorities and program objectives, program objectives and targets, and targets and program activities. In addition, programs should focus on areas that will impact participation, reach, and quality on the Wikimedia projects, with an emphasis on approaches and programs that increase diversity.

Meanwhile, the dimension “Organizational capacity and effectiveness” looks at the ability of an organization to achieve its plan based on both past experience (as an indicator of future success) and current practice. An effective organization both learns from past experience and contributes to movement learning in the process. This dimension also includes effective and stable leadership that models good practices, and a rational approach to growth that is supported by the organization’s strategy and context.

Finally, we analyze the organization’s past experience with and current plan for budgeting. Here, we look at how the organization has performed against its plans and budgets in the past.

To complete the assessment, we assess each of the criteria in each category according to the following scale, which is not necessarily linear:

  • Major strength
  • Strength
  • Neither a strength nor a concern
  • Concern
  • Major concern

Criterion

Description

Program design and strategic alignment

P1. Strategy and alignment

Program objectives are strongly aligned with the Wikimedia strategic priorities of increasing reach, improving quality, or increasing participation. The organization has a strategic plan in place.

P2. Targets and logic models

Programs have specific, measurable, time-bound objectives with clear targets, and programs present clear links between planned activities and desired results.

P3. Needs assessments

Program objectives are based on needs assessed in consultation with stakeholders, including the communities and volunteers that are working together with this organization to achieve its program objectives.

P4. Potential for impact at scale

Programs could lead to significant impact on the Wikimedia strategic priorities of increasing reach, improving quality, or increasing participation, at scale and corresponding to the amount of funds requested.

P5. Evaluation methods

Programs include a plan for measuring results and ensuring learning, and employ effective evaluation tools and systems.

P6. New ideas and approaches, or thoughtful adaptations

Programs will test new ideas and approaches or will thoughtfully adapt ideas and approaches that may work in this organization’s context.

P7. Diversity

Programs will expand the participation in and reach of the Wikimedia movement, especially in parts of the world or among groups that are not currently well-served.

Organizational capacity and effectiveness

O1. Past results

This organization has had success with similar programs or approaches in the past, and has effectively measured and documented the results of its past work.

O2. Stability and leadership

Stable, effective leadership and good governance will enable this plan to succeed. Any leadership transitions are managed effectively.

O3. Learning from the past

This organization is addressing risks and challenges effectively, is learning from and documenting past experiences, and is applying learning to improve its current and planned programs.

O4. Improving movement practices

This organization effectively shares learning about its work with the broader movement and beyond.

O5. Community engagement

This organization effectively engages key communities and volunteers in the planning and implementation of its work.

O6. Capacity

This organization has the resources and ability (for example, expertise, staff, experience managing funds) to do the plan proposed.

Budget

B1. Past budgeting and spending

This organization has a history of budgeting realistically and managing funds effectively.

B2. Proposed budget is realistic

This proposal includes a budget that is comprehensive and clear, and that corresponds to the activities proposed.

B3. Budget is focused on programmatic impact

Based on past performance and current plans, funds are allocated to programs and activities with corresponding potential for programmatic impact.