Grants:APG/Proposals/2020-2021 round 1/Wiki Education Foundation/Impact report form

Purpose of the report

This form is for organizations receiving Annual Plan Grants to report on their results to date. For progress reports, the time period for this report will be the first 6 months of each grant (e.g. 1 January - 30 June of the current year). For impact reports, the time period will be the full 12 months of the grant, including the period already covered in the progress report (e.g. 1 January - 31 December of the current year). This form includes four sections, addressing global metrics, program stories, financial information, and compliance. Please contact APG/FDC staff if you have questions about this form or concerns about submitting it by the deadline. After submitting the form, organizations will also meet with APG staff to discuss their progress.

Global metrics overview - all programs

We are trying to understand the overall outcomes of the work being funded across our grantees' programs. Please use the table below to let us know how your programs contributed to the Global Metrics. We understand not all Global Metrics will be relevant for all programs, so feel free to put "0" where necessary. For each program, include the following table and:

  1. Next to each required metric, list the outcome achieved for all of your programs included in your proposal.
  2. Where necessary, explain the context behind your outcome.
  3. In addition to the Global Metrics as measures of success for your programs, there is another table format in which you may report on any OTHER relevant measures of your program's success.

For more information and a sample, see Global Metrics.

Overall

Metric | Achieved outcome | Explanation
1. Number of total participants | 14,131 | We significantly exceeded our goal of 10,685.
2. Number of newly registered users | 10,766 | We significantly exceeded our goal of 8,100.
3. Number of content pages created or improved, across all Wikimedia projects | 17,055 | We significantly exceeded our goal of 13,350.
4. Quantity: words added to the article namespace | 9,984,134 | We significantly exceeded our goal of 5.6 million.
5. Quality: number of articles increasing by at least 10 points on the ORES scale | 7,625 | We significantly exceeded our goal of 1,430.

Telling your program stories - all programs

This chart shows the data from WMF's survey, compared to Wiki Education's 2021 survey data of our participants.

In 2021, we significantly exceeded our goals across all metrics. When we were setting these goals in late 2020, we were unsure what ongoing effects the COVID-19 pandemic would have on our programs. While the pandemic created instability across higher education and cultural institutions, we continued to support participants in their efforts to edit Wikipedia. Our all-online programs were effective at bringing a great deal of high-quality, diverse content, as well as diverse contributors, to Wikipedia and Wikidata.

As mentioned in our midterm report, the new availability of demographic data on the race and ethnicity of U.S. contributors from the Community Insights survey enables us, for the first time, to quantify our role in diversifying Wikipedia's contributors. According to that data:

  • Only 22% of Northern American editors identify as women
  • Only 5.2% of U.S. Wikipedia editors identify as Hispanic or Latino/a/x
  • Only 0.5% of U.S. Wikipedia editors identify as Black or African American
  • Only 0.1% of U.S. Wikipedia editors identify as Native American

Contrast that to Wiki Education's program participant demographics:

  • 64% identify as women, and another 4% identify as nonbinary or another gender identity
  • 12.8% identify as Hispanic or Latino/a/x
  • 10.4% identify as Black or African American
  • 1.8% identify as Native American

Our programs are significantly diversifying Wikipedia's editor base in the United States and Canada — while also adding high-quality content on a diverse range of topic areas to Wikipedia.

Wikipedia Student Program

Quantitative targets

Measure of success | Goal (2021) | Total | % completed | Notes
Total Participants | 9,000 | 12,403 | 138% | We exceeded our goal.
Newly Registered | 8,000 | 10,588 | 132% | We exceeded our goal.
Content Pages Improved | 13,000 | 12,743 | 98% | We barely missed this goal; students edited fewer articles than expected but added more content per article than expected.
Quantity | 5,500,000 | 9,632,716 | 175% | We exceeded our goal.
Quality Articles[1] | 1,000 | 3,105 | 311% | We significantly exceeded our goal.

[1] Number of articles that have at least a 10-point improvement in an ORES-based quality prediction score, which indicates significant improvement of the "structural completeness" of an article.
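
To make the metric concrete, here is a minimal sketch of how such a count can be computed from before-and-after quality predictions. The article titles and scores below are illustrative only, not output from the actual ORES pipeline.

    # Count articles whose predicted quality score rose by at least 10 points.
    # Illustrative sketch: the titles and scores below are made up.
    THRESHOLD = 10

    def quality_articles(before, after, threshold=THRESHOLD):
        """Return how many articles improved by at least `threshold` points."""
        return sum(
            1
            for title, start in before.items()
            if after.get(title, start) - start >= threshold
        )

    before = {"Article A": 32.0, "Article B": 18.5}
    after = {"Article A": 47.2, "Article B": 24.1}

    print(quality_articles(before, after))  # -> 1 (only Article A gained 10+ points)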

What we did

In the brief period when our staff had all recently been fully vaccinated against COVID-19, but before the Delta surge, we felt safe enough to gather in person.

We supported more than 300 courses in both the spring 2021 and fall 2021 academic terms, continuing our trend of the past year. While 2021 was obviously challenging in higher education due to the COVID-19 pandemic, we were thrilled with the outcomes. The diverse contributors we brought to Wikipedia this year edited a wide array of topics. In addition to the improvements mentioned in our midterm report, our student editors took on projects like the following.

As part of a Simmons University course, a student improved the article on the Cape Ann Museum — including traveling to the museum, taking photos of all the buildings, uploading them to Commons, and adding them to the article.

Students in other classes improved whole topic areas, including African archaeology, Black women's history, deafness in various countries, Franco-American culture, environmental science, LGBTQ reproductive health, lizards, medical racism, minerals, music of the world, poetry, protests and policing, and psychology, just to name a few. These examples demonstrate the depth and breadth of topics Wikipedia Student Program participants improve on Wikipedia — and you'll note many intersect with the broader movement goal of increasing knowledge equity. Students in the Franco-American culture course, for example, focused on addressing racism in articles. The work Wiki Education does to directly recruit instructors who teach in knowledge equity subject areas is supplemented by our efforts to support courses that add an element of equity to their work. Some courses do this through the information they add, while others, for example, encourage students to cite more women and authors of color.

We often emphasize our impact on Wikipedia, but it's important to acknowledge how transformative this program is for students and their instructors. We ask instructors to share anecdotes about their students' experiences, and their responses show how Wikipedia assignments influence students' lives.

  • "The student who worked on her Wikipedia project this semester (told me) that this was her favorite Honors project she has ever completed as a part of her degree."
  • "My favorite comment was 'this assignment was harder than a regular research paper because you couldn’t just go to Wikipedia to do your research!'"
  • "The big surprise was the bias against Wikipedia being accurate that was driven into them. Most were shocked at the gatekeeping and the high quality of most articles when they looked at them from the perspective of editors."
  • "A senior indicated that it was the most difficult research project they ever had to do, but also the most satisfying."
  • "The biggest 'achievement' from a media literacy perspective is that student frustration about not finding reliable or in-depth sources led to (some of) them recognizing that writing in our field is often repetitive and promotional, even in well-recognized news sites. It led them to want to go deeper and to recognize that there is more original work to do, even if it might not be able to be done through Wikipedia."

Instructors are also particularly grateful for the support Wiki Education provides them as part of this program:

  • "Thank you to Wiki Education especially. You are making my students work relevant. I cannot count the incredible number of times that my students talk about their work mattering and enduring. You are making my students relevant."
  • "Thank you for everything you do! I rely heavily on the inspiration and support provided by this wonderful community."
  • "Thank you so much for providing this resource! I was completely lost about how to do a Wikipedia assignment until I discovered Wiki Education and it made everything so much easier."
  • "Thank you for this important service --- the digital librarian and myself would not have been able to do this project very well, if at all, without this platform!"
  • "Thank you again. You're providing an exceptional resource to the world. Students can learn they can be contributors, too."
  • "Thanks for all you do! I used to be a 'rogue' teacher of Wikipedia, but now I feel part of the community of Wikipedia educators."
  • "I've incorporated Wikipedia for almost 9 years now (on and off). It's amazing to see a cumulative portrait. I love running it at the community college level as it helps support diversifying who contributes and what articles/perspectives are contributed to."

What worked well

Throughout 2021, we continued to employ a new application process for instructors interested in teaching with Wikipedia. We support most courses that enroll after the deadline, but encouraging instructors to request our support as early as possible allows us to plan better. By identifying trends earlier in the term, we can adapt as needed. We're quite pleased with the results and will continue implementing the application process.

Prior to 2020, we had a robust outreach program, attracting new faculty to teach with Wikipedia through a series of events and referrals. The pandemic eliminated in-person events, so we relied on contacts we'd made prior to 2020 for new growth. It typically takes anywhere from 6 to 12 months, or longer, from when we first engage with someone to when they actually begin teaching with Wikipedia, so we missed a few terms of building this pipeline. In the second half of 2021, we cautiously recruited new faculty to the program for the first time since the start of the pandemic: we recruited through email listservs and social media groups on Facebook and Twitter, facilitated publications about teaching with Wikipedia, and hosted a webinar for our existing instructors to share their experiences. We look forward to continued outreach opportunities in 2022.

We also took advantage of people's vastly expanded comfort with video platforms like Zoom to host weekly office hours for instructors running Wikipedia assignments and ran an intro video session at the beginning of the term. Both provided instructors with opportunities to engage with Wiki Education staff as well as others running Wikipedia assignments. We believe both efforts resulted in greater engagement by instructors and promoted community building among faculty running Wikipedia projects.

What didn't work well

We reported on our primary learning from 2021 in our midterm report. Reposting from there:

In our proposal for 2021, we proclaimed excitement over a new idea to have some students stay in sandboxes, which we'd then review after the end of the term with the intent of moving the good content live. The idea did not pan out as we had hoped. At the end of the fall 2020 term, staff spent an inordinately large number of hours evaluating sandbox content, and we moved very little of it live. We quickly came to the conclusion that having students actually put the work live in Wikipedia's mainspace results in better quality drafts. Our belief is that knowing their work will be read by other people motivates students to put extra effort into their research, follow Wikipedia's rules, and create good content. The sandbox relaxed that motivation enough that there was a substantial quality drop in content left in sandboxes, and we don't think the staff time spent on reviewing them was worth it. We will not be pursuing this strategy in the future.

Scholars & Scientists Program

Quantitative targets

Wikipedia Scholars & Scientists Program Goal (2021) Total % completed Notes
Total Participants 140 330 236% We significantly exceeded our goal.
Newly Registered 80 158 198% We significantly exceeded our goal.
Content Pages Improved 125 735 588% We significantly exceeded our goal.
Quantity 91,000 338,573 372% We significantly exceeded our goal.
Quality Articles[1] 30 120 400% We significantly exceeded our goal.

[1] Number of articles that have at least a 10-point improvement in an ORES-based quality prediction score, which indicates significant improvement of the "structural completeness" of an article.

Wikidata Scholars & Scientists Program Goal (2021) Total % completed Notes
Total Participants 45 113 251% We significantly exceeded our goal.
Newly Registered 20 20 100% We met our goal.
Content Pages Improved 225 3,577 1590% Several participants used mass editing tools, causing us to significantly exceed our goal.
Number of statements improved 1,000 12,845 1285% Several participants used mass editing tools, causing us to significantly exceed our goal.
Number of references added to items 400 4,400 1100% Several participants used mass editing tools, causing us to significantly exceed our goal.

What we did

OER Wiki Scholars Group Photo. All our courses were held via Zoom.

We supported 19 Wikipedia courses and 8 Wikidata courses in 2021, far outpacing our goals in every category of impact on Wikimedia projects. This program continues to be extremely successful in training subject matter experts to add high-quality information to Wikipedia and Wikidata.

In a collaboration with ReThink Media in the months prior to the 20th anniversary of 9/11, we brought peace and security studies experts to Wikipedia to improve articles connected to September 11th, the War on Terror, and related topics. While Wikipedia's extremely active WikiProject Military History had led to extensive coverage of the specifics of war in these articles, our experts were able to identify and fill content gaps around the context and humanitarian implications of war. Articles our scholars improved received more than 7 million page views.

This photo of Edith Renfrow Smith was uploaded by a Grinnell College staff member who participated in one of our Smithsonian courses. Having access to source material like this image makes these GLAM professionals particularly helpful contributors to Wikipedia.

In four Wiki Scholars courses, museum professionals who work at one of the Smithsonian's nearly 200 Affiliates collaborated with each other and our team to add and expand biographies of notable American women on Wikipedia. All told, we trained 74 museum professionals, representing 53 different Smithsonian Affiliate museums, in how to edit Wikipedia, and they improved more than 160 articles. By embedding Wikipedia know-how within these institutions, the Smithsonian has developed a network of new Wikipedians who can continue this important work, both through their own editing and through organizing local projects.

We trained a diverse group of academics, scholars, and university students how to contribute content to Wikipedia and better represent influential LGBTQ+ figures to the public. This course was unique in that we were able to work with participants outside of higher education, like a high school U.S. History and AP Government teacher. The course created six new biographies and one new Wikidata item, and participants edited many more. In particular, one participant created four new articles about trans artists, an underdeveloped topic area on Wikipedia.

In partnership with the National Science Policy Network, science policy experts edited numerous articles. One example is neuroscience postdoc Tristan Fehr, who created a new article on the effects of early-life anesthesia exposure on the brain. Articles like these require subject matter expertise to add to Wikipedia, which is why this program is so valuable. Many of these participants also speak a second language; participant Carmen Fernández Fisac was inspired by the course to improve both English and Spanish Wikipedia articles.

Thanks to one of our Wikidata courses, there's a new property created by course participants: Dumbarton Oaks object ID. This new property allows the collection of the Dumbarton Oaks organization to be batch uploaded, maintained, and shared with all of Wikidata. Participants in our Wikidata courses continued making significant improvements to Wikidata items, as evidenced by how significantly we exceeded our goals.
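
For readers unfamiliar with how such a property is used once the data is live, items carrying it can be listed with a SPARQL query against Wikidata's public endpoint. The sketch below is generic: P9999 is a placeholder, not the actual property number assigned to Dumbarton Oaks object ID.

    # Sketch: list items that carry a given external-ID property on Wikidata.
    # P9999 is a placeholder property number, used here only for illustration.
    import requests

    ENDPOINT = "https://query.wikidata.org/sparql"
    QUERY = """
    SELECT ?item ?itemLabel ?objectId WHERE {
      ?item wdt:P9999 ?objectId .
      SERVICE wikibase:label { bd:serviceParam wikibase:language "en". }
    }
    LIMIT 10
    """

    response = requests.get(
        ENDPOINT,
        params={"query": QUERY, "format": "json"},
        headers={"User-Agent": "wikidata-property-demo/0.1"},
    )
    response.raise_for_status()
    for row in response.json()["results"]["bindings"]:
        print(row["itemLabel"]["value"], "->", row["objectId"]["value"])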

We also spread the word about our program extensively in virtual conferences.

  • At Wikimania, Wikidata Program Manager Will Kent and Ian Gill from the San Francisco Museum of Modern Art presented on how Ian was able to make SFMOMA's data more collaborative after taking one of Will's Wikidata Institute courses.
  • Also at Wikimania, Will and Hilary Thorsen, Susan Radovsky, and Sarah Kasten from the LD4 Wikidata Affinity Group and the Program for Cooperative Cataloging (PCC) Wikidata pilot discussed how they've run their projects to the betterment of Wikidata.
  • We had three Wikidata course alumni present at the LD4 2021 conference: Paul Jason Perez presented about Open Heritage in Philippine Museums, and Richard Naples and Jackie Shieh presented about Modeling Smithsonian Collections Data.
  • Will and Director of Partnerships Jami Mathewson presented on survey results from these courses at WikiConference North America.
  • Will presented on how to teach Wikidata at WikiConference North America.
  • Jami, Will, and Senior Wikipedia Expert Ian Ramjohn presented about our experiences running COVID-19 courses described in previous reports at WikiConference North America.
  • Chief Programs Officer LiAnna Davis and Smithsonian Wikimedian in Residence Kelly Doyle co-presented about our courses at WikiConference North America.
  • Will and LiAnna, along with Shani Evenstein Sigalov, served as co-curators of the Education & Science Track at WikidataCon. Will and LiAnna also presented as part of group panels at WikidataCon.
  • We had three Wikidata course alumni present at WikidataCon 2021: Amanda Rust provided an update on her class' public art in Boston Wikidata work, Jeannette Ho discussed how Texas A&M is using Wikidata to Enhance Discovery for Dissertations, and Jennie Choi presented on the Met's work with structured data on Wikidata and Wikimedia Commons.

What worked well

By the end of 2021, we had worked with the American Physical Society (APS) to run four Wiki Scientists courses to add biographies of physicists and physics information to Wikipedia, and they had committed to two additional courses in 2022. This ongoing partnership has proven fruitful and shows that academic institutions will invest their resources into providing this type of programming to their members.

In 2020, we offered at least two distinct Wikidata training courses throughout the year, recruiting individual participants depending on which course level met their needs. In 2021, we exclusively offered an introductory Wikidata training course, the Wikidata Institute, that we run every few months. Developing a course curriculum that is replicable has reduced staff time and allowed us to execute with a quick turnaround. Additionally, the marketing and recruitment work we do for these courses has a snowball effect, giving interested professionals several opportunities a year, depending on their needs and availability, to join our movement and learn about Wikidata.

In September, we partnered with the New York Data Carpentries Library Consortium to run the first Wikidata Institute fully sponsored for a specific institution and community, in contrast to our standard Wikidata courses, in which individual participants enroll. A fully sponsored course reduced the staff time spent finding participants and led to meaningful collaborations among participants in the same region and with similar job descriptions. Running this successful course gave us a useful case study to pitch to other prospective partners.

What didn't work well

We hoped to significantly grow the revenue for this program in 2021, from the $185,450 we brought in during 2020 to a goal of $275,200. We planned to hire an additional salesperson so we could scale up the number of courses we brought in. We found, however, that it was difficult to find the right person for the role: instead of a January start date as planned, we did not fill the position until July, and our hire was then unable to meet the goals set for the position. Once we recognized the problem, we took action, but the delay and subsequent challenges meant we could not grow the program's revenue as much as we'd anticipated. Nevertheless, we brought in $246,293, still an increase of 33% year over year.

Programs & Events Dashboard

Quantitative targets

Measure of success | Goal (2021) | Total | % completed | Notes
Total Participants | 1,500 | 1,285 | 86% | While fewer program leaders used the Dashboard than expected, those who did tracked more programs with it.

What we did

Usage of Programs & Events Dashboard has remained fairly steady — but, for the first time, it was down slightly in 2021 versus the previous year. In particular, it was used for 2,226 events — down 5% from 2020 — with 27,669 participants — down 3% — and 1,278 different event organizers, which is down 18%. Many events were cancelled or never scheduled because of COVID. Performance problems, which caused a substantial amount of Dashboard downtime in late 2020 and especially early 2021, may have also contributed to the downtick in usage.

Fortunately, intensive work on solving the performance and capacity constraints of Programs & Events Dashboard was already underway when the problems became much more acute in February. We had engaged Ruby on Rails performance consultant Nate Berkopec to analyze and improve the system, and had begun identifying key bottlenecks. Over the next two months, we made major infrastructure changes. With support from the Wikimedia Cloud Services team, we moved from running the entire Dashboard on a single server to a distributed system where different parts of the system are spread across several servers. The Programs & Events Dashboard has been highly stable since that work. We had no significant downtime from April onward, and update lag improved dramatically as well.
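
The heart of that change is a familiar pattern: keep slow, bursty work (like course-stats updates) off the machine that answers web requests, and hand it to dedicated workers instead. The Dashboard itself is a Ruby on Rails application with separate job servers; the toy Python sketch below only illustrates the shape of the queue-and-worker split, not the production setup.

    # Toy queue-and-worker split: the "web" side enqueues slow update jobs
    # and returns immediately; a worker (a separate server in production)
    # drains the queue in the background.
    import queue
    import threading
    import time

    update_jobs = queue.Queue()

    def handle_web_request(course):
        """Fast path: enqueue the slow update instead of running it inline."""
        update_jobs.put(course)
        return f"update for {course} queued"

    def update_worker():
        """Slow path: stand-in for a dedicated job server."""
        while True:
            course = update_jobs.get()
            time.sleep(0.1)  # placeholder for fetching revisions, recomputing stats
            print(f"finished updating {course}")
            update_jobs.task_done()

    threading.Thread(target=update_worker, daemon=True).start()
    print(handle_web_request("Example course"))
    update_jobs.join()  # wait for the background work to finish before exiting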

In order to better support users and potential users, we ran our first survey of Programs & Events Dashboard users in August and September, with responses from a diverse set of global program organizers. The overall picture is that users are eager to see improvements to many aspects of the Dashboard; every possible focus area had a substantial amount of support from survey respondents. Based on the survey, we published our first Programs & Events Dashboard roadmap, and we've begun working on some of the top priorities, including an Outreachy internship project to improve Wikidata support. We also presented about the Dashboard at Wikimania and WikiConference North America, and we began coordinating with the Wikimedia Campaigns product team to plan for integration between the Dashboard and their upcoming Event Center platform.

What worked well

The Programs & Events Dashboard survey was particularly useful, with responses roughly representative of the language diversity of typical Dashboard usage (about 40% English Wikipedians, with the rest spread across many languages). The survey surfaced some clear and concrete needs that the Dashboard can meet with further development effort — along with a set of power users who volunteered through the survey to participate in interviews and related user research.

We also had promising results with using contractors to extend our capacity for technology work, which we will continue doing in 2022.

What didn't work well

Supporting Programs & Events Dashboard at a lower level while focusing primarily on the technical needs of Wiki Education's own programs, as we did in 2020 when our technology team went from 2 people to just 1, was clearly insufficient in retrospect.

In our mid-2021 progress report, we noted plans to complete Programs & Events Dashboard's "transition to a more scalable distributed server architecture", i.e., the infrastructure-as-code system we use on dashboard.wikiedu.org. Getting these infrastructure-as-code tools to work in the relatively constrained Wikimedia Cloud environment proved more challenging than anticipated. We decided instead to maintain and improve the current, more hands-on approach to Programs & Events Dashboard infrastructure.

Similarly, while we completed prerequisite infrastructure work that would let us more easily run multiple independent Dashboard sites, the learning curve for this work was steeper than expected; we will push back a more serious exploration of providing Dashboards-as-a-Service until 2022 or later when we have more resources to devote to it.

Revenues received during this period (6 months for progress report, 12 months for impact report)

Please use the exchange rate in your APG proposal.

Revenue source | Currency | Anticipated | Q1[1] | Q2[1] | Q3 | Q4 | Cumulative | Anticipated ($US)* | Cumulative ($US)* | Explanation of variances from plan
Selling Services | USD | $275,200 | $32,650 | $94,809 | $83,019 | $35,815 | $246,293 | $275,200 | $246,293 | A delay in hiring caused us to only reach 90% of this goal.
Selling Impact | USD | $1,287,000 | $1,000,353 | $125,620 | $6,647 | $148,118 | $1,280,737 | $1,287,000 | $1,280,737 | We nearly perfectly met this goal.
Other Income | USD | $0 | $0 | $0 | $28,293 | $241,451 | $269,744 | $0 | $269,744 | We received unplanned government pandemic relief loans (now forgiven) and grants, as well as some banking rewards.
Total | USD | $1,562,200 | $1,033,003 | $220,429 | $117,959 | $425,384 | $1,796,774 | $1,562,200 | $1,796,774 | Overall, we raised more money than expected in 2021.

[1] Based on feedback from our auditors, we made some adjustments to timing on when revenue was booked, which leads to these numbers being slightly different from our midterm report.

Spending during this period (6 months for progress report, 12 months for impact report)

Please use the exchange rate in your APG proposal.

Expense | Currency | Budgeted | Q1 | Q2 | Q3 | Q4 | Cumulative | Budgeted ($US)* | Cumulative ($US)* | Percentage spent to date | Explanation of variances from plan
Student Program | USD | $261,202 | $78,775 | $72,210 | $85,850 | $86,869 | $323,704 | $261,202 | $323,704 | 124% | Salary adjustments and a shift to enable more staff time for this program led to an increase.
Scholars & Scientists | USD | $472,672 | $95,489 | $88,081 | $124,225 | $123,554 | $431,349 | $472,672 | $431,349 | 91% | A delay in planned hiring and a shift of staff time away from this program caused the under-spending vs. plan.
Technology | USD | $141,664 | $61,301 | $53,768 | $60,598 | $59,399 | $235,066 | $141,664 | $235,066 | 166% | We spent more than planned on external contracting to support our technical projects, including the Programs & Events Dashboard.
General/HR/Finance/Admin/Board/Fundraising | USD | $660,853 | $214,539 | $197,213 | $150,051 | $135,589 | $697,392 | $660,853 | $697,392 | 106% | We only slightly exceeded our budget here.
TOTAL | USD | $1,536,391 | $450,104 | $411,272 | $420,724 | $405,411 | $1,687,511 | $1,536,391 | $1,687,511 | 110% | Overall, salary adjustments and additional tech contracting work led to a small increase in our spending.
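
The derived columns in the two tables above are simple arithmetic: each cumulative figure is the sum of the four quarters, and each percentage divides the cumulative actual by the plan. A quick sketch using the Technology row as input (figures copied from the table above):

    # Derived columns: cumulative = Q1 + Q2 + Q3 + Q4;
    # percentage spent = cumulative / budgeted.
    budgeted = 141_664
    quarters = [61_301, 53_768, 60_598, 59_399]

    cumulative = sum(quarters)
    pct_spent = round(100 * cumulative / budgeted)

    print(f"Cumulative: ${cumulative:,}")             # Cumulative: $235,066
    print(f"Percentage spent to date: {pct_spent}%")  # Percentage spent to date: 166%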

Compliance

Is your organization compliant with the terms outlined in the grant agreement?

As required in the grant agreement, please report any deviations from your grant proposal here. Note that, among other things, any changes must be consistent with the WMF mission, must be for charitable purposes as defined in the grant agreement, and must otherwise comply with the grant agreement.

  • There were no major deviations.

Are you in compliance with all applicable laws and regulations as outlined in the grant agreement? Please answer "Yes" or "No".

  • Yes.

Are you in compliance with provisions of the United States Internal Revenue Code (“Code”), and with relevant tax laws and regulations restricting the use of the Grant funds as outlined in the grant agreement? Please answer "Yes" or "No".

  • Yes.

Signature

Once complete, please sign below with the usual four tildes.

Resources

Resources to plan for measurement

Resources for storytelling
