Grants:IEG/Learning/Round 1 2013/Impact
- 4 projects focused on online community organizing (50%)
- 3 projects on building tools (one of which, Replay Edits, spent $0)
- 1 project on offline partnerships and outreach
Languages represented: Javanese, English, Catalan, Chinese, Italian
Wikimedia projects targeted:
- English Wikipedia (x3)
- Javanese Wikipedia
- Catalan Wikipedia
- Chinese Wikipedia
- Wikispecies
- Commons
- Wikisource
- MediaWiki
Overall, we believe this first batch of projects indicates that supporting individuals and small groups through the IEG model can have a direct impact on the strategic goals of the Wikimedia movement: increasing the quality of online content, increasing the number of people participating in knowledge creation, and increasing readership. Individuals who have specific ideas for experimental, scalable programs seem to thrive in the IEG program, which provides high levels of resources beyond money.[3]
All the grants had useful elements, but from our analysis, the best IEGs seem to be the ones that build some sort of platform -- a social media group, a curriculum, a library, a strategy -- and have demonstrated the possibility for that platform to have impact at a smaller (beta) level. These projects were explicitly designed to meet an expressed need within the community: the grantee had heard this need (e.g., through past experience or surveys) and designed a creative solution to close the gap. The next step, ideally, was for the grantees to see whether the designed solution did in fact affect the desired outcomes.

The impact of the Round 1 Individual Engagement Grants is yet to be determined: a certain amount of time must pass before we will truly be able to see the changes happening on the targeted wikis, or the programs reaching scale. That said, based on initial results, we found that the IEG projects focused on online community development have demonstrated the greatest potential for results, whereas the projects creating tools were not yet able to demonstrate the desired outcomes or impact.
The grants we recommend prioritizing for longer-term impact analysis are the following:
- The Wikipedia Library -- the project has already more than paid for itself, and great potential still exists for scaling the online library across languages: a $7,500 grant yielded $279,000 in donated sources - a 37.2x return on investment. Moreover, editors are demonstrating demand for and utilizing the resources: references to the donated sources on Wikipedia have grown 400-600% since free access became available.
- Wikisource Strategy -- demonstrated impact with organized events: November 2013 saw the highest number of new Wikisourcers (171) since July 2011.[4]
- wikiArS -- a program demonstrating high-quality content donations, and extending the scope of Commons by including the rating of drawings, rather than just photos: 30 images marked with a high quality designation; 211 Wikipedia pages using images from the program; 100% retention rate among participating schools.
- The Wikipedia Adventure -- a game built to provide more coaching to newcomers on English Wikipedia has shown the possibility of creating more productive, longer-term editors: players were 1.2x more likely to edit than those who did not play.
- Publicity in China -- the low-cost social media experiment with Weibo has already demonstrated the hunger for information in China, with 10,000 followers (25% of whom are women), and top posted articles seem to drive readers to Wikipedia: overall, there was a 252% increase in page views for the articles with the most reposts in the three days following the post.[5]
1. IEG grantees are contributing towards our strategic goals
By and large, the IEG grantees are contributing towards the Wikimedia movement's overall strategic goals: the program sponsors specific ideas anchored in the voices of individual community members. The IEG Round 1 projects are able to contribute to our movement's vision of free knowledge for all humanity[6] in a unique way because they tackle problems across different Wikimedia projects, and because they experiment with strategic challenges that the Wikimedia Foundation does not otherwise address.
Some of the projects were focused specifically on one language or project with no intention of scaling further beyond that market - for example, the social media outreach campaign User:燃玉 pioneered in China via the social networking platform Weibo, with no plans for expansion to other languages. But it is important to underscore that most projects[7] were designed in such a way that, if the proof of concept worked, they would be able to scale beyond the target language or project specified in their grant.
Diversification of activities
The projects sponsored through IEG focused on high-priority Wikimedia projects that WMF and other grants programs are not currently addressing (Javanese projects, Wikisource, Chinese Wikipedia), as well as on types of problems that are not being addressed (e.g., difficulty acquiring sources to write articles). We believe this first round of grantees indicates that small investments can jumpstart big thinking and great ideas, and that sponsoring individuals allows the views from various projects to be represented and addressed in a more direct manner.
Focus on strategic goals
This batch of grantees was most focused on projects that would address quality gaps on the projects[8], but two projects[9] focused primarily on participation, and one focused on raising awareness to increase readership[10].
The unique thing about these projects was their specific theories on HOW they would effect change on one of the highest-level strategic goals. For example, the Wikisource Strategy project wanted to increase participation on Wikisource, and because its theory was that lack of motivation was a key factor in retention, it focused heavily on community development. See the image below for more information.
Reach (number of readers)
Participation (number of editors)
Quality of Content
Returns on investment (focus on direct online work)
As has been noted, IEGs are selected "based on an assessment of the grant's potential for impact, ability to execute, and capacity to foster learning and innovation."[11] In particular, the IEG committee assesses the proposals based on their potential to have ongoing online impact. The majority of Round 1's projects were designed to beta test a potential solution for a strategic problem, and then build the base for that solution to scale. Thus, in general, return will only be measurable in the longer term.
wikiArS worked on building a model for classroom engagement, pairing universities with on-wiki groups (Wikiprojects); this included such ongoing pieces as the wikiArS portal and documentation, a new model for helping students, and new resources for participants. While they consolidated and built these support areas, which would promote the scaling of the program, they were also executing a sample program with eight schools in Catalonia. Through this test, they were able to demonstrate the potential for needed, high-quality images to be contributed through the program. But more time is needed before we'll be able to calculate the full return on this potential.
As can be seen through the case study, a calculation of return for the wikiArS project is not meaningful as of now: the funds were spent mostly to enable the building of a platform for future sustainable growth. The project appears to have taken the appropriate steps for that, but time is needed before we see the uptick in program adoption/adaptation by other universities. As this adoption occurs, we can continue to evaluate (a) what the overall program effectiveness of wikiArS is[12], and (b) what the return on this specific investment is (as well as on others, as they may occur). Until then, though, a full ROI cannot be determined (or it would not be of much meaning).
As a result, here we only highlight what we see as the initial steps towards returns. We used the information provided by each grantee in their midpoint and final reports to come up with a list of metrics designed to track activities, outputs, outcomes, and ultimate impact. Unfortunately, there is a lot we still do not know about the "Impact" column. That said, there is one grant for which we are already able to calculate an initial return on investment: The Wikipedia Library, which currently boasts a 37x financial return. See the case study below for clarification, though again, note that most ROIs will be calculated in terms of the amount of content added or the number of contributors retained. For example, with The Wikipedia Library, if no one is utilizing the library sources, the dollar amount of donations is not very important.
The Wikipedia Library is a unique case in that it was actually working to acquire and distribute accounts for online databases that have a dollar value. As a result, it is possible to calculate an actual financial return on investment for TWL as it currently stands: and it is big! If you take the estimated dollar value of the accounts donated ($279,000) and divide it by the total amount spent on the grant ($7,500), we see a roughly 3,700% return on the investment![13]
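For reference, the arithmetic behind that headline figure is a simple ratio of the two numbers reported above (a worked restatement, not an additional data point):

```latex
\[
\mathrm{ROI} \;=\; \frac{\text{estimated value of donated accounts}}{\text{grant spend}}
\;=\; \frac{\$279{,}000}{\$7{,}500} \;\approx\; 37.2\times \;\approx\; 3{,}700\%
\]
```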
ROI table
In the table below, we attempt to provide a comprehensive mapping of the inputs, activities/outputs, outcomes, and impact of the different grants. We also suggest a few metrics for outcomes and impact that either would have been useful to dig into further, or which should be looked at going forward.
It is important to offer a big caveat to this table: not all projects necessarily have direct ties to what is considered of "strategic impact" for the Wikimedia movement. This does not necessarily mean a project is not valuable: it might be a small part of a comprehensive strategy. For example, while the Javanese Script project did not produce measurable impact, Indonesia is a priority area for the Wikimedia Foundation's Global South strategy, and this may be just one of several funded projects aimed at learning about the community and bolstering overall motivation and excitement. That said, we force-fit the funded projects into the "Inputs->Impact" framework here, in order to give ourselves some understanding of the linkages between the projects funded and the returns the movement sees from these investments. As more comprehensive, movement-wide impact metrics and strategies are established, we can cross-map initiatives into these efforts. In addition, we hope to develop a better understanding of what types of outcome and impact metrics are most valuable for different types of projects.
Note: cells highlighted in coral are those which have not been measured, but should be for further analysis over time. For full definitions of terms, see the Evaluation Glossary.
Inputs -> Impact Map, Round 1 IEG

Grantee | Inputs | Activities / Outputs | Outcomes | Impact
---|---|---|---|---
wikiArS (Offline project) | Grant spend: $9,499 | | |
The Wikipedia Library (Online project) | Grant spend: $7,432 | | |
Publicity PRChina (Online project) | Grant spend: $168 | | Note: these are high-level trends that are hard or impossible to attribute to this project. Nonetheless, they are the top-level numbers of most importance for this project, and should therefore be monitored. |
The Wikipedia Adventure (Online project) | Grant spend: $11,110 | | |
Wikisource Strategy (Online project) | Grant spend: $11,627 (USD) | | |
Replay Edits (Tool project) | Grant spend: $0[19] | | (To be calculated upon 3 months of completion; see Final Report - Targets) |
Mediawiki Data Browser (Tool project) | Grant spend: $15,000 (USD) | | |
Javanese Script (Tool project) | Grant spend: TBD, but given $3,000 | (Final Report incomplete, so this may not represent all activities) | (Final Report incomplete, so this may not represent all activities) |
Additional spending data
Across all grantees[21], the majority of spend went towards human capital (83%), whether to grantees themselves for project management or to other areas of expertise (design, translation, technical). We are curious to continue tracking our grantees' spend data to better understand what types of financial support are needed.
2. Indicators of outliers
Projects that were successfully completed, deployed, and poised for impact demonstrated some comparable patterns of behavior. We found the following factors were true across high-performing IEG grantees:
- (A) Project had clear links to strategic priorities
- (B) Project was designed to meet the specific needs of end-users
- (C) Feedback loops were built into the project, giving voice to the people being served
- (D) Community engagement in the development of the plan, process, and execution
- (E) Significant expert support in developing and executing on a strategy
Related Learning Patterns:
Projects that were most linked to impact were those that focused on a specific method for affecting the strategic priorities. They had clear theories on how to get overall impact: whether improving the quality of the content, increasing the number of editors, or increasing the number of readers.
Case study:
For example, the highest-level strategic goal of increasing the number of active editors on our projects (“Participation”) can be achieved by working on one of three components: increasing the number of new editors, converting new editors into longer-term editors, or retaining existing editors. The Wikipedia Adventure focused on the middle section of the editor chain: converting new editors into longer-term contributors. It did this under the specific hypothesis that new editors would be more likely to become expert editors if they had the appropriate mentoring and help[22] and also had motivating incentives to contribute.[23] The Wikipedia Adventure addresses the challenge of mentoring and motivating new editors by gamifying the initial editing experiences. Because grantee Ocaasi had a clear definition of the change he was hoping to effect, he was able to identify clear metrics for evaluation at the highest levels (average contributions of new users), as well as more nuanced motivation and confidence measures (satisfaction rates; confidence rates).
Related Learning Patterns:
As a required field in the application form, “Target audience” was identified from the outset by all the IEG projects. Upon post-grant reflection, we found that the more specific the identified audience, the better the grantee was able to demonstrate effectiveness. Often, this was because it was clear whom to ask for feedback (see section C below) and which community to engage (see section D).
Case study:
The Wikipedia Library (TWL) started as a direct demand from active editors: free sources from which to draw encyclopedic references are a real need! After a few years of discussions with the community, Jake, an English Wikipedia editor and administrator, decided it was high time to intentionally combine existing resources and solicit new ones from reference databases. Because Jake knew specifically for whom the initial Wikipedia Library should be designed - active editors on English Wikipedia - he was able to identify which areas of his project to prioritize and how to surface the information most desired by the participating editors. He also then knew exactly how to track the usage of the library efforts and build appropriate feedback loops via a survey sent to over 2,000 English Wikipedia editors. The results of the careful targeting? TWL distributed 1,500 accounts to editors for free resource access, increasing the usage of those references in Wikipedia by 400-600% - and the project has a clear direction for its next phases.
Related Learning Patterns:
While it proved crucial for an IEG grantee to define (and consult with) the target audience during the planning phase of the project, effective projects also engaged their stakeholders throughout the design and/or execution phases. This enabled reflective modifications of the projects as they proceeded.
Case study:
Wikisource Strategic Vision grantees Aubrey (Italian Wikisource) and Micru (Catalan Wikisource) planned from the outset of the project what their stages for community engagement would be. Their theory was that the only way to ignite the Wikisource project was to build connections across the community and engage people in a process of identifying motivations for contributing and envisioning the potential of the project. They set up stages through which the community could contribute (see image above), including two particularly important components: (1) an in-person meet-up at Wikimania and (2) a Wikisource community survey. The combination of face-to-face and anonymous online feedback gave the project leaders access to many voices in the community. In particular, the anonymous survey provided a forum for Wikisourcers who are more independent contributors, and/or who do not generally participate in public-facing strategy discussions, to give feedback. The results? The survey - distributed in eleven languages - helped guide the next phases of development for the project, and the Wikisourcers went a step further to lock in avenues for feedback and ownership by the broader community: they helped to create an organized Wikisource User Group, with 46 members as of this writing!
Related Learning Patterns:
Wikimedia projects have hundreds of thousands of contributors across different languages: a powerful force, but not always easy to engage. IEG projects which were able to carve out a niche in this community and find clear points of engagement seemed to fare the best.
Case study:
Grantee David Gomez wanted the design students participating in the wikiArS pilot project to create work of maximum value for Wikipedia. While of course all contributions are welcome on Commons and can be of some value, he wanted to pair the students' unique skills with specific gaps in article coverage. To do this, he engaged directly with Wikipedians on Wikiprojects[24], soliciting their specific needs and engaging them in the creation of classroom assignments that would be directly beneficial to the work that community was attempting to do. Consequently, David was able to build awareness of the project and of the students' work, which also resulted in better mentoring of students by community members, adaptation of the Wikimedia Commons image ranking criteria, and rapid uptake of images onto the main pages of Wikipedias across many languages (231 pages across 36 Wikipedias, and 66 pages across 7 other projects).
Related Learning Patterns:
Project experiments from Round 1 were strengthened when individuals came together to use their complementary skills in developing the ideas and executing on the projects. The IEG grantees identified specific gaps in their own execution skills, and found people who could fill these gaps in their projects.
Case study:
One of the more interesting examples is the Wikisource Strategy project's use of Google Summer of Code developers to improve some tools for Wikisource which were of particular interest to that community. The grantees listened to the specific needs of the Wikisource community, and then prioritized a set of tools that would match the resources available through GSoC. This resulted in a variety of tool improvements aimed at helping Wikisource.
3. Conditions for success
The IEG grantees rose to the occasion of developing and executing high-value and innovative projects, but the IEG program itself offered a lot of opportunity for these creative solutions to grow and flourish. We noticed a few things that were consistently cited as creating an environment enabling success. In particular:
- Experimental mentality
- Mentoring support
- Wikimania (maybe ... )
Experimental mentality
In contrast to other WMF grants programs, the IEG reports jump out as brutally honest and transparent. Many reasons may contribute to this, but more than anything it appears that the IEG community is a space clearly separated out and designated as experimental. Most of the recipients (88%) had never been WMF grantees previously, which may have helped with their transparency.
The need for innovation is baked into the IEG model from the beginning. A major enabler of the safe-to-fail environment is the very scorecard and selection criteria laid out by the WMF Grantmaking staff and the IEG Committee, which is tasked with recommending IEG projects to WMF for funding. Called out explicitly in their selection criteria:
"Innovative approach to solving critical issues in the movement, and clear measures of success to learn and evaluate impact of the approach."
Note that a specific criterion around innovation is unique to the IEG program, in contrast to other WMF grantmaking.[25]
"I also liked that in Final and Midterm reports we... were free to express our insights."
"I have been given free rein and flexibility to develop this library in any meaningful direction I think will be best, and I take that as a great trust that I hope I have honored. It has been exhilarating..."
The safe-to-fail environment seems to have been underscored throughout the course of the grants as well. Through our review, we found that the reports completed via the IEG program were simpler and more focused on lessons learned than other grant reports. Part of this seems to be the design of, and coaching on, the reports themselves. Only two primary reports were required - a midpoint and a final report - and these were framed in such a way as to simply relay back the lessons being learned. Interestingly, one grantee self-assigned the word "Failure" to many of his planned goals in the progress report. He provides clarifications and explanations, but it is nonetheless a noticeable difference from past grantee reports. Overall, the weight of the IEG reports rested more in the "Learnings" section of the midpoint report and the "Outcomes and impact" section of the final report.[26]
Mentoring support
The second enabling feature is WMF staff support - in particular, from the IEG program officer, Siko Bouterse. IEG grantees had monthly calls with Siko and received additional support as necessary. On average, 19.5 hours of program officer staff time went towards supporting and communicating with the grantees each month.
We have reason to believe these check-ins extended beyond formalities and governance, and were of real value to the development of the projects. Interestingly, five of the eight projects (>60%) explicitly mentioned in their midterm and/or final reports that the support of the IEG team was of great value to their projects. In particular, the most valuable components were encouragement, advice in areas outside the grantee's expertise (e.g., promotion), and connections to tools.
Beyond Siko, other WMF resources were highlighted as helpful, in particular design and communications. The help received from these two teams, though, was not spread equally (which is probably a good thing!) across the different grantees, and the process through which this help was arranged is unclear. That said, in the post-grant survey, respondents were "Very Satisfied" or "Satisfied" with the support they received across Administration, Research, and Communication.[27]
Wikimania
Wikimania seems to be an enabler for a subset of the grantees. An important commonality across this subset of grants was that they were the projects most focused on community mobilization and engagement: Wikimania provided a great platform to introduce the grantee project to the global community[28], collaborate with relevant community members[29], learn about new tools[30], or simply feel more a part of the IEG and Grantmaking community[31].
That said, the cost for seven[32] IEG grantees to attend Wikimania was ~US$16,000 - over 20% of the total IEG grant spend (for several grants, the cost of attending Wikimania was more than the grant itself). Though there are some indications of value, most grantees did not make much mention of Wikimania in their reports. In the post-grant survey, however, several did mention their appreciation of attending Wikimania, which is described in the IEG 2013 Program review, though again the actual benefit of the investment is not yet determined. Digging further into the value-add of Wikimania for grantees would be a useful future experiment, though initially it looks like it was a big benefit for the types of projects that required lots of community collaboration.
4. Scalability and longer-term implications
As mentioned previously, fully understanding and increasing the impact of the IEG program requires additional support in data, tools, and scale.
Tools for execution
Many IEG projects were resourceful in the different tools that they utilized in executing their grants. The accessibility of these tools was essential to executing their projects effectively.
That said, it is clear there are gaps in the tools available for this particular sample of projects, and we are interested in seeing some development in these areas. We document them here and hope that future grant rounds might help identify which of these are the most essential:
- Tools used
Tool | Type of Usage | Further materials |
---|---|---|
Qualtrics | Used for surveys to collect information regarding the projects.[33] | Qualtrics overview in Evaluation Portal |
Guided Tours | Utilized the extension to develop a customized gaming experience[34] | Overview of extension "Guided tour" |
Snuggle | Leveraged Snuggle's desirability algorithm to identify good-faith contributors[35] | Overview of Snuggle |
Commons Upload Wizard (modified for project) | Modified version of the upload tool on Commons | Overview on Mediawiki |
- Tools desired / needed
Type of Tool | Notes |
---|---|
Translation Extension | The Translate feature currently only works from English to other languages; desire for the ability to translate from any language to any language |
Guided Tours | Desire to be more easily accessible |
Tools for measuring impact
Many tools were used to better understand the effects of the work being done, but again there were big gaps in this knowledge. We saw bigger gaps here than with the execution tools: an inability to execute a project without certain supports counts against a proposal in the IEG selection criteria, resulting in fewer funded projects that require unavailable technical tools. Evaluation capability, on the other hand, is not a selection criterion.
- Tools used to measure outcomes/impact
Tool | Type of Usage | Further materials |
---|---|---|
GLAM category tool | Used to track the total number of files and file usages of a Commons category (for wikiArS) | See "GLAMorous" tool |
"Linksearch" | Used to track the change in usage of reference sources provided in The Wikipedia Library | See Linksearch |
http://stats.grok.se/ | Page view aggregator; used by L&E team to see the effects of Weibo posts | See stats page |
http://stats.wikimedia.org/ | Hosts a variety of statistics, but used by the L&E team for the total new editor number on Wikisource |
- Metrics that need to be accessible
Type of Data | User story |
---|---|
Usage statistics of tools hosted on WMFLabs | Jeph, the grantee executing "Replay Edits", needed to see the usage of his tool once it is developed and hosted on WMFLabs, so that he can make appropriate modifications and report back about the outcomes of his grant. (Note: he did not host his tool on WMFLabs because he is unable to track usage if it is hosted there!) |
Aggregate page view statistics | Addis from the Publicity to PRChina project did a significant amount of outreach around specific pages on Wikipedia; he needed a way to upload a series of articles in bulk, along with specific dates, to understand the effect of his posting work on page views (see the sketch after this table). |
Tracking cohorts of users | User:Ocaasi needs to track cohorts of users playing his game "The Wikipedia Adventure." Given that these users arrive at different times, usage stats cannot be run through a single cohort in Wikimetrics. (Note that this is possible via the Event Logging tool, but that tool is only available to WMF staff with permissions, and is not accessible to grantees. We recommend finding a way to extend the features of this tool to community members working on important experiments which are not amenable to Wikimetrics.) |
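To make the aggregate page-view user story above more concrete, here is a minimal sketch of the before/after comparison described in note [5]: summing an article's views for the three days before and after a post and reporting the percent change. The `fetch_daily_views` helper, the article titles, and the post dates are placeholders (assumptions for illustration only); in practice the helper would query whatever page-view source is available (e.g., a stats.grok.se-style aggregator).

```python
from datetime import date, timedelta

def fetch_daily_views(article, start, end):
    """Hypothetical helper: daily page views for `article` as a {date: views} dict.
    In practice this would query a page-view source (e.g., a stats.grok.se-style
    aggregator); here it returns synthetic placeholder numbers so the sketch runs."""
    days = (end - start).days + 1
    return {start + timedelta(d): 100 + 10 * d for d in range(days)}  # placeholder data

def window_sum(views, start, days):
    """Sum views over `days` consecutive days beginning at `start`."""
    return sum(views.get(start + timedelta(d), 0) for d in range(days))

def before_after_change(article, post_date, days=3):
    """Percent change in views: the `days` days after the post vs. the `days` days before."""
    views = fetch_daily_views(article, post_date - timedelta(days), post_date + timedelta(days))
    before = window_sum(views, post_date - timedelta(days), days)
    after = window_sum(views, post_date + timedelta(1), days)
    return float("inf") if before == 0 else 100.0 * (after - before) / before

# Example run with placeholder article titles and post dates:
posts = [("Example_article_1", date(2013, 10, 1)),
         ("Example_article_2", date(2013, 11, 15))]
for title, posted in posts:
    print(title, f"{before_after_change(title, posted):+.1f}%")
```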
Mentoring
“Being officially part of a project has made it easier for me to interact with a lot of wikipedia users.”
"We think that one of the core features of being a grantee (thus, one of the main reason to apply for a Individual Engagement Grant) is the fact that you get an official approval for a specific project, which has previously been evaluated. You are not a single, rogue Wikisource user who is speaking for himself only: you carry on a project for the WMF, which has a budget, an evaluation and an audit. We think it is very important to highlight how this recognition acts on people and users, who actually listen to you and allow you and your project to make an impact."
Grantees consistently brought up the value outside of money that helped them execute their projects, such as mentorship from the IEG Program Officer (Siko) and even official “endorsement.”
It is of utmost importance for the IEG program to continue developing its mentorship model. Mentorship appears to be the limiting resource for scale (compared to, for example, the amount of funds available). In this past round, the IEG program officer spent approximately 19.5 hours per month working directly with the eight grantees, which could be difficult to scale. Moreover, some projects also benefited from design help, communication support, or evaluation support.
5. Specific recommendations and open questions
Recommendations
IEG Selection Criteria
The IEG committee should pay particular attention to clear demonstration of the indicators listed above as key to success in IEGs. This may require modifications to the review framework. In particular, we recommend:
- Giving particular weight to a few criteria already embedded in the rubric:
- “(A) The project fits with the Wikimedia movement’s strategic priorities.”
- “(E) The project has demonstrated interest from a community it aims to serve”
- “(I) The project has innovative potential to add new strategies and knowledge for solving important issues in the movement.”
- Considering expanding the criteria to encompass:
- Target market: for example, “The project has a specific community identified in which to experiment and execute the project.” (in “Potential for impact”)
- Feedback loops: for example, “The project has clear community engagement pathways” (in “Ability to execute”)
Areas of specific mentorship for grantees
- Impact metrics to track
When taking a step back to look at “impact” across all grantees, we found that there were many gaps in data collection and reporting. While some projects (e.g., wikiArS) compiled very comprehensive sets of information, other projects were not able to identify or collect metrics so clearly. One reason is that data did not exist (e.g., Jeph could not track the usage of his tool “Replay Edits” if hosted on WMFLabs), but there were several gaps in the measurements that could have been highlighted through some brainstorming. For example:
- tracking page views of articles posted on Weibo[36]
- number of new members or active editors of Wikisource[37].
This also involves highlighting tools for measurement, as listed above, or more clearly laying out how to use certain tools. It is important to note that this should be a responsibility of lots of different people and groups in the WMF and the movement!
- Promotion of projects
A common theme in the project reports was not having enough promotion amongst the broader community. When appropriate, the IEG program officer may want to work with grantees on developing a specific promotional strategy.
IdeaLab development
Given the importance of leveraging different skill sets in the development of a comprehensive strategy, we recommend the further expansion of the IdeaLab as a tool for connecting ideas with the appropriate people to accomplish them. This could be both in the pre-phase (Ideas) and throughout the execution of an idea (e.g., the IEG phase). There is currently a gap in the ability for people to easily identify what areas of expertise an idea needs, and for an idea to find individuals with those specific areas of expertise!
Open Questions
[edit]At the conclusion of this report, we are still left with the main question:
“What is the ultimate impact of the IEG program?”
While we do see signs of success, we also acknowledge that not enough time has elapsed between the completion of many of these projects and this report to fully understand their sustainability and scale.
The main questions which we (the Grantmaking Learning & Evaluation team) will continue to dig into alongside the IEG Program Officer (Siko) and (hopefully) the broader IEG community:
- How can the platform for success scale?
- What is the appropriate level of risk to take for an Individual Engagement Grant?
- How should the projects be evaluated longer term?
- Is IdeaLab well-suited to serve as a connection place between individuals with complementary skillsets to execute against ideas?
- What happens to the IEG grantees over time? Are they still engaged in the movement? Do they take on further leadership roles and responsibilities?
There is also the question of what types of projects IEG is well-suited to fund. For example, two of the three tools projects did not meet their goals, and the usage results for the third (Replay Edits) are not yet known. Grantee Yaron provides some interesting reflections on the topic in his final report on the MediaWiki Data Browser project, including on the artificial time constraint of six months. We'll be watching closely to see how the next batch of projects focused on developing tools fares!
Sources / Notes
[edit]Learning patterns
We developed a set of six learning patterns based on our analysis of the first round of Individual Engagement Grants. These patterns are intended to provide actionable guidance to grantees, committee members, program staff, and community members involved in future IEGs, as well as other mission-aligned projects.
You worked hard on a project you believe in, but no one seemed interested in what you made or wants to carry your work forward.
You don't have all the skills or knowledge needed to accomplish your project alone.
Projects that require a lot of preliminary work before they can start, or depend on many external factors, easily get bogged down and may fail to achieve their goals.
You have an awesome idea for a project or activity, but it will only succeed if enough of the right people participate.
Even promising ideas with detailed project plans and broad appeal can fail to achieve their goals if what they produce is not useful and relevant to the right people.
Even impactful projects may look like failures if the project team sets unrealistic goals, or doesn't develop some way to measure their progress ahead of time.
References
[edit]- ↑ Bouterse, Siko (17 January 2013). "New grants available from the Wikimedia Foundation for individual Wikimedians". Wikimedia Foundation. Retrieved 27 January 2014.
- ↑ Bouterse, Siko (29 March 2013). "Announcing the first Wikimedia Individual Engagement Grantees". Wikimedia Foundation. Retrieved 27 January 2014.
- ↑ For example, this Round 1 group had ~2 hours of IEG staff support per month, a trip to Wikimania, and easy access to WMF blog channels. Some IEG grantees also utilized research support from the Grantmaking L&E team (The Wikipedia Adventure) and blog writing support (Publicity in PRChina).
- ↑ See Stats on Wikisource ("New Wikilibrarians" are defined as having reached 10+ edits after registering).
- ↑ Compares the change from the three days before the post to the three days after. This was only done for a subset of articles that were provided to WMF. Note that this type of analysis would best be conducted on ALL postings throughout the duration of the grant, in order to draw more comparisons between the exact times of postings and the number of repostings. See the spread of the sample size in the bar chart.
- ↑ See "Wikimedia Foundation's Vision"
- ↑ wikiArS, The Wikipedia Adventure, The Wikipedia Library, Replay Edits, Mediawiki Data Browser
- ↑ wikiArS, The Wikipedia Library, Javanese Script, Replay Edits
- ↑ The Wikipedia Adventure and Wikisource Strategy
- ↑ PRChina and Social Media
- ↑ see IEG "project selection criteria"
- ↑ For more information on the Program Evaluation initiatives see "Overview of Program Evaluation"
- ↑ Including the price to attend Wikimania, the return is 3000%
- ↑ Could be measured through Wikimetrics if list of students is available
- ↑ Includes 1/2 of grantee's Wikimania costs, since he was also a part of The Wikipedia Adventure
- ↑ Note: While this was not measured, it could have been by using the [1] database, and comparing page view trends 3 days before and 3 days after the Weibo post.
- ↑ Possibly measurable via Wikimetrics, if a list of usernames from those who attended a meet-up is available.
- ↑ Includes 1/2 of grantee's Wikimania costs, since he was also a part of TWL
- ↑ It's true! Returned all the money!
- ↑ See page stats on Commons
- ↑ With the exception of Javanese Script project, which has not completed the final report
- ↑ This lever for change was demonstrated to be meaningful in the analysis of the WP:EN Teahouse project, which demonstrated that experiencing Wikipedia as a community and perceiving that one has a role in it is a powerful incentive. See Research from Phase 2 of the Teahouse project.
- ↑ See research paper “Socialization in an Open Source Software Community: A Socio-Technical Analysis,” which finds that participating in community processes makes editors more invested in the project and can also help them build credibility among other members.
- ↑ See Wikipedia:Wikiprojects for more information on the role of Wikiprojects on Wikipedia.
- ↑ See Grantmaking homepage for more information on the other grants programs.
- ↑ Note that we (Grantmaking L&E) are going to take these observations to other WMF Grantmaking programs to consider how they might encourage similar transparency!
- ↑ Three of the seven grants have responded (Javanese script has not yet completed the grant to receive the survey).
- ↑ Like the campaign run by Addis and the Chinese Wikipedians
- ↑ Like Wikisource, bouncing ideas off of different Wikisource contributors
- ↑ Like wikiArS learning about GuidedTours
- ↑ As mentioned by Benny in his midpoint review
- ↑ One grantee could not attend, one grantee was a duplicate (i.e., had two grants), and one project had two members.
- ↑ Used by The Wikipedia Library, Wikisource Strategy, and The Wikipedia Adventure
- ↑ The Wikipedia Adventure
- ↑ The Wikipedia Adventure
- ↑ Publicity in PRChina
- ↑ Wikisource Strategic Vision