Grants:PEG/Learning/2013-14

PEG 2013-14 impact

36 grants

Project focuses:

  • 8 general support (37% of funds)
  • 2 tools (1% of funds)
  • 5 outreach (12% of funds)
  • 7 content (12% of funds)
  • 10 Wiki Loves Monuments (28% of funds)

19 different countries

41% of grant funding to the Global South

$352,508 total grant spending reported (USD)

180 days average length of grant

Projects covered:

  • Commons
  • MediaWiki
  • Wikisource
  • Wikiquote
  • Wikivoyage
  • 26 language Wikipedias

Smallest grant spending: $420
Largest grant: $41,723

Mix of grantee types:

  • 59% chapters
  • 8% external organizations
  • 17% groups
  • 17% individuals

Overview background

Abridged public presentation (Google Hangout) from July 2014
Full report on PEG reports, 2013-14

The Project and Event Grants (PEG) program has existed in its current form since 2012, and though it has undergone multiple process-oriented studies, the outcomes of the work being funded had never been systematically evaluated... until now! Unlike the Individual Engagement Grants and Annual Plan Grants programs, PEG operates on a rolling cycle: grant applications come in throughout the year, and the duration of individual grants varies widely. As a result, to summarize and compare the work done through PEG, we examined all final reports submitted during FY2013-14. These were primarily grants funded during 2013-14 (n=22), but also included grants approved in FY2012-13 (n=12) and FY2010-11 (n=2).

In total, 36 grants and roughly US$350K in funding were reported on by 27 different individuals, groups, and organizations, who executed projects and events in an effort to make progress towards the Wikimedia movement's strategic goals of reach, participation, and quality content. In examining the reports, we focused on three key questions:

  1. What progress did the grants as a whole make against our movement's strategic goals (reach, participation, quality content)?
  2. What characteristics stood out in the highest performing grants?
  3. How were the grants managed and reported?

We are thrilled to see the work being done across a mix of organizations, groups, and individuals, and are excited to continue supporting projects and events that work towards our strategic goals. We see the need for more experimentation on projects that have clear linkages to the Wikimedia projects themselves, for example on-wiki interventions and writing contests.

Key takeaways

  • Most PEG grantees are very focused on specific output results, such as images or articles, which are clear additions to Wikimedia content
    • There are limited proxies for quality of content, which is an area we need to explore further
  • PEG grantees demonstrated the ability to reach people, but with an average cost of $35 per person reached, it is critical that grantees improve the depth/quality of interactions and/or the scale of interactions
    • Best practices for improving interactions and following up with participants are needed
  • Online writing contests produced the highest amounts of direct article contribution, and they were inexpensive to run
  • Groups/organizations with an integrated strategy throughout the year performed best
  • The larger grants (>$10K) have high rates of significant underspending; chapters in particular struggled to spend the larger amounts awarded as general support grants
  • Learning is taking place, but little of it has been systematically documented in learning patterns:
    • To scale the impact of the programs occurring, we need clearer systems of documenting and sharing learning

Key gaps

  • We have limited cost data by program; we need a more detailed breakdown of inputs by program
  • We have limited data on quality of participation and quality of content: we need more information about participants' level of engagement and the quality of the content created

Major learnings

Focus

Related Learning Patterns:

On-wiki contests

Related Learning Patterns:

Complementary activities

Related Learning Patterns:

Learning Patterns

The following learning patterns were created by PEG grantees reporting during 2013-14. Creating learning patterns was not a requirement for these grantees, and we greatly appreciate the extra work grantees put in to share their most important learnings with the broader movement. We're excited to see grantees and community members continue to share their learning more broadly:

  • Photographing your local buildings
  • Improving your building photography
  • Use an expense tracking software

Recommendations

For grantees

Recommendations for Grantees: program planning, executing, and reporting
  • Grantees should link to Commons categories for all projects
  • Final reports should reflect outcomes for the full grant period in an accessible way; simply linking to monthly reports is not sufficient
  • Refer back to your original goals, and assess whether or not you met those goals and why
  • Tool grantees should plan to count usage and/or view counts
  • Be sure to track content generated through GLAM projects!
  • Report budgeted and actual spending by program
  • Create learning patterns throughout the project and afterwards!

For Grants Advisory Committee

Recommendations for GAC
Proposal evaluation
  • Increase understanding of larger grants
  • If applicable, look at the past performance of a returning applicant
  • Look for applications with activities that focus on developing quality content in a specific area
  • Ensure there are clear measures of success and a plan for how to monitor those metrics
  • Consider systematic proposal evaluation: there was a lot of variance in the quality and depth of feedback received by grantees
Project type advice
  • External organizations may have access to high-quality media, but they should demonstrate a link to Wikimedia and plan to upload content to Commons
  • Outreach project applications should include clear plans for following up with and tracking participants; consider the quality of the contact
  • Focus funding on education projects that work with the same group of students over several weeks or months
  • Content projects that focus on photographing important national events or public figures have a high ratio of photos in use with significant page views
  • Look for on-wiki contests, which are low-cost and generate significant content

For WMF

Recommendations for WMF
Program strategy
  • Encourage more short, specific writing and photography contests; also encourage edit-a-thons following a photo event, to put content into the projects
  • Build best practices around on-wiki competitions: these work, and are inexpensive
  • Provide more mentorship to grantees and applicants in developing, executing, and measuring their ideas; potentially have tighter integration with IdeaLab
Assessment
  • Explicitly request that grantees create (and report) a category to track all media content from their project
  • In future reports, collect key metrics in a table in the same location and format
  • Simplify forms as much as possible, and incorporate the most important elements of reporting: learning patterns to facilitate sharing; standardized metrics to be filled in by each grantee; and cost by program to increase understanding of larger grants

Future areas for study

  • Market analysis: this report (and others) does not take into account the amount of participation or content relative to the overall context of a given language, geography, or topic area
  • Measuring quality: we have limited proxies for quality; we need a better understanding of how to estimate it
  • Measuring important offline interventions: we have limited understanding of the impact of specific types of events on developing free knowledge

See also

Grants data by program type

Content projects (7 grants; average cost $4,889)
  • Typical activities: photo and writing contests, GLAM programs, Wikiexpeditions
  • Summative outcomes: 48 events, 477 people, ~31k photos, 4,358 articles

Wiki Loves Monuments (10 grants; average cost $5,518)
  • Typical activities: photography workshops, upload events, awards ceremonies
  • Summative outcomes: 25 events, ~1,850 people, ~80k photos, 8% of photos in use

Outreach (5 grants; average cost $3,521)
  • Typical activities: education programs, info sessions, Offline Wikipedia
  • Summative outcomes: 119 events, ~4,000 people, ~450 articles

Conferences (4 grants; average cost $14,036)
  • Typical activities: local editor conferences, regional leadership meeting, global Wikipedia research
  • Summative outcomes: 4 events, 335 people, 15 articles

Tools (2 grants; average cost $1,009)
  • Typical activities: instructional videos, translation tools

General Support (8 grants; average cost $13,131)
  • Typical activities: overhead costs; content programs, WLM, and outreach programs
  • Summative outcomes: 141 events, ~2,200 people, ~80k photos, 16% of photos in use, 4,386 articles
