Grants:APG/Proposals/2020-2021 round 1/Wiki Education Foundation/Progress report form

Purpose of the report

This form is for organizations receiving Annual Plan Grants to report on their progress after completing the first 6 months of their grants. The time period covered in this form will be the first 6 months of each grant (e.g. 1 January - 30 June of the current year). This form includes four sections, addressing grant metrics, program stories, financial information, and compliance. Please contact APG/FDC staff if you have questions about this form, or concerns about submitting it by the deadline. After submitting the form, organizations will also meet with APG staff to discuss their progress.

Metrics and results overview - all programs

We are trying to understand the overall outcomes of the work being funded across our grantees' programs. Please use the table below to let us know how your programs contributed to the Grant Metrics. We understand not all Grant or grantee-defined Metrics will be relevant for all programs, so feel free to put "0" where necessary. For each program, include the following table and:

  1. Next to each required metric, list the outcome/results achieved for all of your programs included in your proposal.
  2. Where necessary, explain the context behind your outcome.
  3. In addition to the Global Metrics as measures of success for your programs, there is another table format in which you may report on any other relevant measures of your programs' success.

For more information and a sample, see Grant Metrics.

Metric | Achieved outcome | Explanation
1. number of total participants | 7,269 | We are on track to meet our goal of 10,685.
2. number of newly registered users | 5,183 | We are on track to meet our goal of 8,100.
3. number of content pages created or improved, across all Wikimedia projects | 9,667 | We are on track to meet our goal of 13,350.
4. Quantity[1] | 4,788,771 | We are on track to meet our goal of 5,592,000.
5. Quality[2] | 1,615[3] | We have already exceeded our goal of 1,430.

[1] Number of words added to the article namespace for Wikipedia programs, and number of statements improved for Wikidata.
[2] For Wikipedia programs, number of articles that have at least a 10-point improvement in an ORES-based quality prediction score, which indicates significant improvement of the "structural completeness" of an article. (A sketch of one way such a score can be computed appears below.)
[3] We had planned to include number of references added to Wikidata as the quality indicator, but that number is already at 3,400, vastly exceeding our goal. We chose not to include it in the sum total here (it appears only in the Wikidata table below) so as not to overshadow the more meaningful Wikipedia numbers.
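
To make this metric more concrete, here is a minimal sketch of one way a "structural completeness" score, and the 10-point improvement check, could be computed from the ORES articlequality model. It is illustrative only: it calls the public ORES scores endpoint, and the class weights, function names, and response handling are assumptions for this example, not necessarily the exact values or code the Wiki Education Dashboard uses.

```python
import requests

# Hypothetical illustration: collapse the ORES articlequality class
# probabilities for a revision into a single 0-100 score, then check
# whether an article gained at least 10 points between two revisions.
# The weights below are an assumption for this sketch; Wiki Education's
# Dashboard may weight the quality classes differently.
ORES_URL = "https://ores.wikimedia.org/v3/scores/enwiki/"
WEIGHTS = {"Stub": 0, "Start": 20, "C": 40, "B": 60, "GA": 80, "FA": 100}


def structural_completeness(rev_id: int) -> float:
    """Return a 0-100 quality estimate for one revision of an article."""
    resp = requests.get(
        ORES_URL, params={"models": "articlequality", "revids": rev_id}, timeout=30
    )
    resp.raise_for_status()
    score = resp.json()["enwiki"]["scores"][str(rev_id)]["articlequality"]["score"]
    # Weighted sum of class probabilities collapses the prediction to one number.
    return sum(WEIGHTS[cls] * prob for cls, prob in score["probability"].items())


def is_quality_article(first_rev_id: int, last_rev_id: int) -> bool:
    """True if the article improved by at least 10 points over the term."""
    return structural_completeness(last_rev_id) - structural_completeness(first_rev_id) >= 10
```

In practice, scores for many revisions can be requested in one call (the endpoint accepts multiple revision IDs), so a real implementation would batch these requests rather than score one revision at a time.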

Telling your program stories - all programs

In October 2020, when we put together our 2021 proposal, the only thing we really knew about 2021 was it would be hard to predict. A global pandemic was raging against the backdrop of a contentious presidential election in the United States. Our proposal reflected this uncertainty with cautious numeric targets for 2021. Six months into the year, we're happy to report that we're well on track to significantly exceed our goals — and in some cases have already met the annual goals, in only half the time. While the pandemic presents ongoing challenges to all of our programmatic efforts, we've continued to operate our programs with significant success. We began to lay the foundation for additional growth in our impact as we globally emerge from the pandemic, adding new staff in our Advancement department to help grow our fundraising and earned income revenue streams.

The work we did in the first half of 2021 aligns well with Wiki Education's own strategic plan as well as the broader Wikimedia 2030 movement strategy. Of particular note for us was the 2021 release of the Wikimedia Foundation's Community Insights survey, with demographic data on contributors. For the first time, this report gave us concrete, geographically based data with which to compare our program participants to the existing editing population in our region, both for gender and for race and ethnicity.

The gender ratio of our program participants has remained steady over the years: 30% men, 67% women, and 3% either nonbinary or another gender. Contrast that with the Community Insights Report's finding that only 22% of Northern America contributors are women, and the impact of Wiki Education's programs in bringing women and nonbinary contributors to Wikipedia, at scale, is clear.

A version of the 2021 Community Insights Report chart, with Wiki Education's demographics for comparison.

For the first time, the Community Insights report asked about race and ethnicity of U.S. contributors. The results showed significant under-representation of Latino/a/x and Black editors, compared to the U.S. population. While Wiki Education's program participant demographics don't precisely match the US population data, they are significantly closer than the current Wikipedia editor demographics in the U.S.:

  • 5.2% of U.S. Wikipedia editors identify as Hispanic or Latino/a/x, while 12% of our program participants do (compared to a U.S. population percentage of 18%).
  • 0.5% of U.S. Wikipedia editors identify as Black or African American, while 8% of our program participants do (compared to a U.S. population percentage of 13%).
  • 0.1% of U.S. Wikipedia editors identify as Native American, while 1% of our program participants do (compared to a U.S. population percentage of 0.9%).

While we obviously have room to improve before our program participants fully reflect the U.S. population, we're pleased that our programs actively bring diverse newcomers to Wikipedia. Our movement strives toward knowledge equity, and achieving this goal requires that more people have a voice in building our content and projects. Wiki Education's programs are helping make that happen in our region.

Wikipedia Student Program

Wikipedia Student Program | Annual goal | Jan – June | Percentage
1. number of total participants | 9,000 | 6,106 | 68%
2. number of newly registered users | 8,000 | 5,072 | 63%
3. number of content pages created or improved, across all Wikimedia projects | 13,000 | 6,544 | 50%
4. Words added to article namespace | 5,500,000 | 4,573,328 | 83%
5. Quality Articles[1] | 1,000 | 1,543 | 154%

[1] Number of articles that have at least a 10-point improvement in an ORES-based quality prediction score, which indicates significant improvement of the "structural completeness" of an article.

Typically we include a picture of student editors in this report, but given the pandemic and for everyone's safety, we did not venture into classrooms to take pictures. So we've included an image of student editors from 2018 instead.
What's worked well

The first half of the year overlaps closely with the spring academic term in the U.S. and Canada, enabling us to report on the outcomes easily. We're pleased to report we are on track to meet all of our goals. More than 6,000 student editors in courses across numerous disciplines improved thousands of articles in the first half of the year, adding more than 4.5 million words to Wikipedia.

In 2020, we introduced a new application process for instructors who wanted to participate in our program, asking them to create their course pages by October. In prior spring terms, we allowed instructors to create and submit course pages on their own schedule, leading to course pages arriving throughout the term, which made it difficult for us to plan. The application process made the onboarding of new courses go dramatically more smoothly: the majority of course pages were approved before the term began, instructors were prepared for their Wikipedia assignments, and we were better able to plan our staff support for the term. This process was so effective that we are planning to continue it in coming terms.

To continue providing high quality support for our participants, we revised some of our support processes. In particular, we created a series of email messages with time-sensitive tips that are automatically sent to instructors teaching with Wikipedia, based on what their course page says they're up to that week. We also revised the templates we use to guide on-wiki article evaluations and peer reviews to encourage higher-quality answers from student editors. Both of these seemed to provide positive outcomes. We expanded our pilot mentorship program, where we pair an experienced instructor who has previously taught with Wikipedia with a new instructor. We are evaluating the results of this pilot.

A theatre history student from George Mason University created a new article on the Arlington Cinema 'N' Drafthouse, taking and uploading this photo to Wikimedia Commons to illustrate the article.

Throughout this work, Wiki Education student editors continued to improve articles in important topic areas, especially in relation to knowledge equity, a movement priority. Examples include:

  • A Georgetown University student improved the article on postpartum depression, adding risk factors, the role of stigma, and cultural factors. The student expanded the epidemiology section, which previously only focused on the U.S., to include information about Canada, South America, Asia, Europe, and Africa.
  • A Rice University student improved the article on politics in the United States, adding sections on unincorporated territories, gerrymandering, and a large section on concerns about political representation by race and ethnicity, gender, and sexuality.
  • A Carleton University student editor expanded the article on biodiversity loss, adding sections on air and noise pollution, invasive species, and overexploitation, all exacerbated by climate change.
  • A Rice University student editor expanded the article on Tassili n'Ajjer, a UNESCO World Heritage Site in the Sahara Desert in southeastern Algeria known for its Neolithic rock art. The student editor added substantial information about the rock art at the site, along with coverage of its archaeological history, context, and significance.
  • As non-fungible tokens exploded in popularity in early 2021, millions of people turned to Wikipedia to figure out just what an NFT was. A student editor from the University of Southern California rewrote the lead section and made other improvements that helped readers understand.
  • A McGill University student created the article on The Feminist Five, a group of Chinese feminist activists who were arrested for planning a protest against sexual harassment on public transportation.
  • More than 200,000 people have viewed the improved article on chess player Alexandra Botez, expanded by a student editor in our program.

Hers wasn't the only biography improved by students in our program this spring that helped address the English Wikipedia's equity gaps in biographies.

Biographies aren't the only equity-related content students added, however. Student editors from a Carroll College course, Monsters, Criminals, and the Other, added missing postcolonial and decolonial critiques of the Bertha Mason character to the article on Jane Eyre, the racial coding of "the Creature" in Frankenstein, and the recurring trope of the Orient as a signifier of contagion within the Sherlock Holmes canon. Students from Wellesley created and improved articles related to women and LGBTQ+ characters in Dante's Divine Comedy, on the 700th anniversary of its completion. A West Virginia University student noticed the article on Appalachian music lacked any reference to African-Americans' influence on the style; he added that section and others to the article. Through work like this, students in our program work to correct the English Wikipedia's biases.

We continue to be thrilled with the improvements in the quality and quantity of content coming from the Wikipedia Student Program. While the COVID-19 pandemic has made the program less predictable (student contributions used to follow a very predictable pattern, but they've been all over the place in the last three terms), we are still impressed with the scale of work being contributed to Wikipedia by student editors in this program. In our instructor survey, we asked about the pandemic's influence on this term. Many reported that the online nature of this program made it a huge win during a challenging time, although they noted that students' mental health struggles resulted in greater variation in levels of completion than they'd seen in terms prior to the start of the pandemic. One instructor wrote: "the Wiki assignment was a lifesaver because traditional research papers were harder to do during the pandemic."

What hasn't worked well

In our proposal for 2021, we proclaimed excitement over a new idea to have some students keep their work in sandboxes, which we'd then review after the end of the term with the intent of moving the good content live. The idea did not pan out as we had hoped. At the end of the fall 2020 term, staff spent an inordinate number of hours evaluating sandbox content, and we moved very little of it live. We quickly came to the conclusion that having students put their work live in Wikipedia's mainspace results in better quality content. Our belief is that knowing their work will be read by other people motivates students to put extra effort into their research, follow Wikipedia's rules, and create good content. The sandbox relaxed that motivation enough that there was a substantial quality drop in content left in sandboxes, and we don't think the staff time spent on reviewing it was worth it. We will not be pursuing this strategy in the future.

The next six months

As we move into the second half of 2021, we are focused on recruitment for our fall term, as well as supporting 21 summer courses. We are again running an application process for courses, which we've found encourages instructors to create course pages far in advance of the term and makes it easier on our staff. We are cautiously beginning to recruit new instructors to teach with Wikipedia again, although we expect that to show up numerically in our impact numbers in 2022; the lead time from when someone hears about our program to when they first teach with Wikipedia is usually at least six months, meaning we are starting recruitment now for 2022. Since we have done no active recruitment in the last year, it will take some effort to build our numbers back up. We are focusing our recruitment efforts on our two key content campaign areas, Communicating Science and Knowledge Equity.

Scholars & Scientists Program

Wikipedia Scholars & Scientists Program | Annual goal | Jan – June | Percentage
1. number of total participants | 140 | 207 | 148%
2. number of newly registered users | 80 | 101 | 126%
3. number of content pages created or improved, across all Wikimedia projects | 125 | 418 | 334%
4. Words added to article namespace | 91,000 | 207,117 | 228%
5. Quality Articles[1] | 30 | 72 | 240%

[1] Number of articles that have at least a 10-point improvement in an ORES-based quality prediction score, which indicates significant improvement of the "structural completeness" of an article.

Wikidata Scholars & Scientists Program | Annual goal | Jan – June | Percentage
1. number of total participants | 45 | 60 | 133%
2. number of newly registered users | 20 | 10 | 50%
3. number of content pages created or improved, across all Wikimedia projects | 225 | 2,705 | 1,202%
4. Statements improved | 1,000 | 8,326 | 833%
5. References added | 400 | 3,400 | 850%
What's worked well

We are only halfway through the year, but we've already significantly exceeded our annual goals in nearly every metric. This outcome is the result of exceptional interest in the program that we've been able to foster, already running 12 Wikipedia and 4 Wikidata courses this year, far more than we anticipated. This work also generated $76,123 in revenue for our organization in the first six months of the year, another key marker of success.

Through the Wikidata branch of our program, we were able to improve more than 8,000 statements on 583 items, far exceeding our goals. Wikidata goal-setting remains challenging, as we've mentioned in past reports, and we are eager to participate in movement-wide discussions of how to define success for Wikidata programs.

University of Virginia researchers Jonathan A. Gómez, Thomas Hartka, Binyong Liang, and Gavin Wiehl, with Wiki Education's Will Kent.

Aside from the coursework, we participated in two other Wikidata-related activities. We provided ongoing mentorship for a group of graduate students from the University of Virginia who were seeking to use machine learning to help answer the question of how to define things on Wikidata. Our Wikidata Program Manager, Will Kent, served as a mentor for these students, helping them along the way. More about their research can be found in Will's blog post.

We helped support the U.S. Library of Congress Program for Cooperative Cataloging Wikidata Pilot. Specifically, we led a strategic planning session to explore the most effective way to track progress on this project, as well as an orientation to the Programs & Events Dashboard, which the group selected to track their progress. The project explores Wikidata as a linked data space where libraries can both produce linked data and download it to enrich their own collection data. It could have wide-reaching effects on library metadata production, and we're pleased at the supportive role we could play.

In addition to our work with Wikidata, we ran 12 Wikipedia editing courses. The content added by the participants had huge impacts on Wikipedia's readers and on equity. In early 2021, we ran our final course focused on improving information related to the COVID-19 pandemic, with participants adding significant content to articles like COVID-19 pandemic in Pennsylvania (adding sections on its impact on the economy, communities, and voting, as well as information on the vaccine rollout), COVID-19 pandemic in Louisiana (adding sections to the timeline, the state response, and the impact to the economy), and media coverage of the COVID-19 pandemic (adding sections on Germany and Sweden). At the conclusion of the course, we engaged in a deep program evaluation of the full suite of COVID-19 courses, publishing a COVID-19 Project Evaluation report on Meta so others in the Wikimedia community could share in our learnings.

Nobel Prize-winning physicist Dr. William D. Phillips learned to edit Wikipedia through one of our courses in 2021.

This program provides an opportunity to teach experts to contribute to Wikipedia. We were pleased to collaborate with the American Physical Society to offer a course on improving physics topics. Among the participants in this course was Nobel Laureate Bill Phillips, who won the 1997 prize in physics. We documented his experience in our course in a post on Wiki Education's blog, which was reprinted in the Signpost. Participants in this course improved high-traffic physics articles like nitrogen-vacancy center, mathematical formulation of quantum mechanics, and a new article on atom localization. We supported two courses targeted at improving content related to open educational resources, with participants improving articles like virtual exchange, open textbook, open-access mandate, open educational practices, and a new article on open thesis.

We tackled equity work with this program, running two courses for Black History Month, three courses in collaboration with the Smithsonian's American Women's History Initiative, and one course with 500 Women Scientists. All six courses were focused on improving biographies of people who have been historically excluded from Wikipedia, and participants created or improved many such biographies.

What hasn't worked well

Near the end of this six-month period, we began to notice a drop-off in engagement with this program. This slide occurred in all aspects of the program: recruiting participants for courses, course session attendance, and actual contributions to Wikipedia. Our belief is that most of this can be attributed to the easing of COVID-19 restrictions and Zoom fatigue. We suspect that as people are able to travel, see friends in person, and otherwise engage in activities that have been restricted for more than a year, the draw of a weekly Zoom course, plus hours spent editing articles outside of class, is diminishing. As the second quarter of this grant period ended, we began discussing ways to better engage participants.

The next six months

Addressing this drop-off in engagement remains an important element of our work in the remainder of 2021. Our goal is to scale up our impact for the Scholars & Scientists Program in the next year; key to that is hiring a second sales role. Our new Director of Sales, Nanette James, started July 1; we expect to see an increase in revenue and impact from this program in the next six months and into the future. Having an earned income model, we believe, will help ensure the financial stability of our organization.

Programs & Events Dashboard

Programs & Events Dashboard | Annual goal | Jan – June | Percentage
1. number of total participants (number of unique global Wikimedia program leaders who track an event with the Dashboard) | 1,500 | 896 | 60%
A visualization of the new server cluster powering dashboard.wikiedu.org. This same system, built with the open source tool Nomad, will be rolled out to Programs & Events Dashboard soon.
What's worked well

Our plan to evaluate and improve the server architecture and hosting configuration of Programs & Events Dashboard was very fortuitous in its timing. We hired Ruby on Rails performance expert Nate Berkopec to work on the project at the beginning of March, and while we were focused on measuring the key performance and capacity bottlenecks of the Dashboard, the system began failing in major ways because the infrastructure was stretched too thin. (The acute cause was that a few users were trying to track groups of editors so large and active that the system would run out of memory each time it attempted to update statistics, which would sometimes bring down the whole server.) With our focus already on the root causes of these problems, we were able to fix the most important bottlenecks, begin the transition to a distributed architecture, and stabilize the system. We haven't had any downtime since late April. (We summarized this work in a recent blog post, "A reliable future for Programs & Events Dashboard".)
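
As a purely illustrative sketch of the kind of change that keeps a statistics update's memory footprint bounded, the snippet below folds a very large stream of revisions into running totals one fixed-size batch at a time, instead of loading an entire cohort's activity into memory at once. It is written in Python rather than the Dashboard's actual Ruby code, the field names and batch size are hypothetical, and it shows the general idea rather than the specific fixes made to the Dashboard.

```python
from itertools import islice
from typing import Iterable, Iterator

BATCH_SIZE = 1_000  # hypothetical; tuned to keep each update's memory use small


def batched(revisions: Iterable[dict], size: int) -> Iterator[list[dict]]:
    """Yield fixed-size batches from a (possibly enormous) stream of revisions."""
    iterator = iter(revisions)
    while batch := list(islice(iterator, size)):
        yield batch


def update_statistics(revisions: Iterable[dict]) -> dict:
    """Accumulate cohort statistics without holding every revision in memory."""
    revision_count = 0
    words_added = 0
    pages_edited = set()
    for batch in batched(revisions, BATCH_SIZE):
        for rev in batch:
            revision_count += 1
            words_added += max(rev.get("words_added", 0), 0)  # count only positive additions
            pages_edited.add(rev["page_id"])
    return {
        "revisions": revision_count,
        "words_added": words_added,
        "pages_edited": len(pages_edited),
    }
```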

What hasn't worked well

The Dashboard had an unacceptable amount of downtime and slow updates in the first months of the year, particularly March and April. While we were aware that Programs & Events Dashboard was nearing the limits of its capacity, we did not have appropriate monitoring in place to indicate just how close to failure the system was between January and March, or to pinpoint the causes of system failure immediately. We've improved our monitoring system, and we expect the system to remain stable until we complete planned capacity improvements in the next quarter.

The next six months

We're increasing our focus on Programs & Events Dashboard in the second half of 2021, both in terms of improving the infrastructure and conducting user research to build a feature roadmap for future improvements. We'll complete the transition to a more scalable distributed server architecture — which has already been deployed for dashboard.wikiedu.org but isn't quite ready to work in the Wikimedia Cloud environment — in the third quarter. We are also preparing a survey of Dashboard users to help us prioritize new features and user experience improvements, which we will follow up with user interviews to better understand the needs of program leaders. The distributed architecture work will also pave the way for an initiative we plan to begin late this year: exploring the feasibility of "Dashboards-as-a-service". With this model, we will launch and maintain custom-configured and branded Dashboard websites for individual Wikimedia movement organizations, unlocking many of the advanced capabilities that Wiki Education uses but that aren't compatible with the shared, open-for-all environment of Programs & Events Dashboard. We plan to pilot this with two partner organizations in 2022.

Revenues received during this six-month period

Please use the exchange rate in your APG proposal.

  • Important note
    • The anticipated column may list revenues anticipated for the whole year instead of only the 6 months. Please make sure that the time period is clear in the table.
    • In the explanation column, always mention relevant information about the numbers: what period they refer to, etc.

Table 2 Please report all revenues in the currency of your grant unless US$ is requested.

  • Please also include any in-kind contributions or resources that you have received in this revenues table. This might include donated office space, services, prizes, food, etc. If you are able to provide a monetary equivalent (e.g. $500 for food from Organization X for service Y), please include it in this table. Otherwise, please highlight the contribution, as well as the name of the partner, in the notes section.
Revenue source | Currency | Anticipated | Q1 | Q2 | Q3 | Q4 | Cumulative | Anticipated ($US)* | Cumulative ($US)* | Explanation of variances from plan
Selling Services | USD | $275,200 | $32,650 | $43,473 | | | $76,123 | $275,200 | $76,123 | A delay in hiring our second sales person delayed the increase in this revenue source.
Selling Impact | USD | $1,287,000 | $1,100,353 | $2,020 | | | $1,102,373 | $1,287,000 | $1,102,373 | We are on track to meet this goal.
Total | USD | $1,562,200 | $1,133,003 | $45,493 | | | $1,178,496 | $1,562,200 | $1,178,496 | We are on track to meet our revenue goals for 2021.

* Provide estimates in US Dollars


Spending during this six-month period

Please use the exchange rate in your APG proposal.

  • Important note
    • The budget can be either the budget for the whole year (in which case the percentage spent should be around 50% at the half-year mark) or the budget for the half year (in which case the percentage should be around 100%). Please make that clear in the table.
    • In the explanation column, always mention relevant information about the numbers: what period they refer to.

Table 3 Please report all spending in the currency of your grant unless US$ is requested.

(The "budgeted" amount is the total planned for the year as submitted in your proposal form or your revised plan, and the "cumulative" column refers to the total spent to date this year. The "percentage spent to date" is the ratio of the cumulative amount spent over the budgeted amount.)
Expense | Currency | Budgeted | Q1 | Q2 | Q3 | Q4 | Cumulative | Budgeted ($US)* | Cumulative ($US)* | Percentage spent to date | Explanation of variances from plan
Student Program | USD | $261,202 | $78,775 | $72,210 | | | $150,985 | $261,202 | $150,985 | 58% | Salary adjustments caused a slight increase over plan.
Scholars & Scientists Program | USD | $472,672 | $95,489 | $88,081 | | | $183,570 | $472,672 | $183,570 | 39% | A delay in planned hiring caused this under-spending vs. plan.
Technology | USD | $141,664 | $61,301 | $53,768 | | | $115,069 | $141,664 | $115,069 | 81% | The additional contracting we did to shore up Programs & Events Dashboard caused this over-spending.
General/HR/Finance/Admin/Board/Fundraising | USD | $660,853 | $214,539 | $197,213 | | | $411,752 | $660,853 | $411,752 | 62% | Salary adjustments caused a slight increase over plan.
TOTAL | USD | $1,536,391 | $450,104 | $411,272 | | | $861,376 | $1,536,391 | $861,376 | 56% | Overall, we are on track.

* Provide estimates in US Dollars


Compliance

Is your organization compliant with the terms outlined in the grant agreement?

As required in the grant agreement, please report any deviations from your grant proposal here. Note that, among other things, any changes must be consistent with the WMF mission, must be for charitable purposes as defined in the grant agreement, and must otherwise comply with the grant agreement.

  • None.

Are you in compliance with all applicable laws and regulations as outlined in the grant agreement? Please answer "Yes" or "No".

  • Yes.

Are you in compliance with provisions of the United States Internal Revenue Code (“Code”), and with relevant tax laws and regulations restricting the use of the Grant funds as outlined in the grant agreement? Please answer "Yes" or "No".

  • Yes.

Signature

Once complete, please sign below with the usual four tildes.

Resources

Resources to plan for measurement

Resources for storytelling