APG Round 1 2012-13: Impact report



  • Abridged public presentation (Google Hangout) from June 2014
  • Full report for the FDC Advisory Group in their two-year review of the FDC process, 2012-14
  • Impact assessment across grants programs, 2013-14

This report looks at the results of the first round of FDC funding: Round 1, 2012-13. The final reports summarizing the funded organizations' work from fiscal year 2013 came in April 2014.

Nine organizations received over US$4M in funding to make progress towards the Wikimedia movement's strategic goals of reach, participation, and quality content in 2013. When examining the reports, two key questions were in play:

  1. What progress are the organizations making against our movement's strategic goals (reach, participation, quality content)?
  2. Are organizations learning and becoming more effective?

After sifting through the organizations' reports, we found that organizations were doing a great deal of offline work that could be translated into online impact, though this work was not necessarily commensurate with the amount of money distributed to the organizations. Moreover, we saw that organizations were investing a lot of money in their organizational structures, systems, and processes, but it remains unclear which characteristics make for the most effective Wikimedia organizations.

Coming out of this first round of reports, we continue to applaud the work being done on the ground by so many volunteers and the organizations that support them, while asking ourselves: what is the right level of funding for the outcomes being seen?

Round 1, 2012-13: $4M in funds distributed[edit]

Stats:

  • Median grant: $342K
  • Min grant: $64K
  • Max grant: $1,790K
  • % of funds requested received: 87% (Range: 58-100%)
Wikimedia France received $94K for 6 months of "bridge funding"; they applied for a full 12 months of funding in round 2 (and received $525K).

Key takeaways[edit]

  • Chapters are doing a lot of work, and seem well suited to focus on projects that improve the quality of content on Wikimedia projects. This includes GLAM, Wiki Loves projects, and "technology pools" (see table below for more details).
  • Chapters are investing heavily in themselves as organizations, yet it is not yet clear that these investments will lead to more programmatic impact and effectiveness
    • ~25% of total costs were explicitly administrative, and ~50% were staff-related. It is unclear whether this is an appropriate overhead ratio for the different Wikimedia organizations
    • When funding received was less than requested, organizations made all of the budget cuts from planned program expenditures
  • Learning is taking place, but little of it has been systematically documented in learning patterns:
    • To scale the impact of the programs occurring, we need clearer systems of documenting and sharing learning
  • Chapters responded relatively well to specific reporting requirements (e.g., budget, hiring) but struggled with self-reporting on the more open-ended program questions
    • There was no clear tracking back to the original SMART goals laid out in the proposals
    • Only limited metrics around outcomes were included (e.g., number of images used on Wikipedia projects)
  • Support is needed to develop tools and guidelines for better measuring both programmatic impact and organizational effectiveness
Summary of key content projects[edit]
GLAM initiatives bring important documents online, but successful partnerships can require significant investment of staff or volunteer time.
  • 58 GLAM partnerships
  • 1,809 documents/photos/files uploaded to Commons per partnership
Wiki Loves photo contests are popular with volunteers, engage many new users and generate considerable content.
  • 107,364 photos added to Commons through Wiki Loves Monuments (WLM) and other photo contests
  • 1,788 participants reported in 15 contests (from 5 chapters)
  • 1% of photos are recognized as ‘Quality Images’
Technology Pools make high-quality equipment available to the community to document important national events.
  • 583 users supported by 5 chapters
  • 73,098 photos, videos and sound recordings

Major challenges[edit]

Programmatic Challenges[edit]

Community engagement: Organizations faced some difficulty establishing relationships with local editors; small grants have been harder to distribute than anticipated. Some reports indicated that hiring Community Liaisons has been an effective way to mitigate these issues.

Prioritizing partnerships: Many organizations report receiving more requests for education programs and GLAM partnerships than they are able to take on. Frameworks for identifying goals and prioritizing opportunities are needed.

Program development: Organizations found they could not carry out the full slate of programs described in grant proposals. Targeting specific groups or focusing on a smaller set of goals when developing programs could lead to better outcomes.

Volunteer management: Chapters struggle to recruit and manage volunteers, and experience volunteer burnout. Chapters may benefit from guidance on recruiting new volunteers, volunteer-friendly reporting tools, and volunteer-staff working relationships.

Organizational challenges[edit]

Reporting: It is clear that some entities have more facility with English and narrative reporting than others. Recommending the use of a translator may save time and lead to clearer reports.

Estimating Budgets: Several newer entities reported budget variances due to difficulty estimating costs for office space and hiring, VAT, and auditing expenses. It may be helpful to provide guidelines for entities who are establishing offices for the first time.

Hiring and onboarding: Several entities reported that hiring employees has been a challenge. Learning patterns with best practices for recruiting, interviews and new employee onboarding would be useful for many entities.

Governance: Issues with board governance led to higher administrative costs. Organizations have made progress in providing guidelines for recruiting and managing effective boards, such as the board training workshop hosted by WMUK.

Learning Patterns[edit]

The following learning patterns were created or modified by the FDC grantees working through 2012; in other words, these are the lessons they considered most important to share with the broader movement. We're excited to see organizations continue to share their learning more broadly.

Recommendations and next steps[edit]

For WMF[edit]

Upon review of the survey, we have the following recommendations for the WMF staff working on activities around the FDC process and the FDC-funded organizations:

  • Reports: FDC staff should consider revising reporting requirements so that organizations can focus on the quality of the reports they do complete
  • Learning patterns: improve functionality and searchability of the learning pattern library, so that orgs can share learning in this codified method throughout the duration of programs
  • Organizational Effectiveness Indicators: engage in research on what it means to be effective as a Wikimedia organization, and develop a tool enabling organizations to self-monitor this
  • Global metrics: continue to familiarize volunteers with the tracking and reporting toolkits; establish a few key metrics to be tracked globally by organizations and groups working on programs

For Funded Entities[edit]

  • Share learning: capture the lessons being learned into learning patterns, to help demonstrate how you are reflecting and improving. This will also help others not "reinvent the wheel," and learn from the experiments already conducted.
  • Capture data systematically: know what you want to measure before you start a program, and report these metrics systematically. For ideas on what to track and how to collect the information, see the Tracking and Reporting Toolkit.
  • Compare results to original (or modified) goals: One way of demonstrating your progress towards a strategic plan is by tracking performance against these goals. See examples from the Project & Event Grants program, or Wikimedia UK's new 2014 tracking system.

See also[edit]