A total of 105 articles were created or improved across the 52 workshops (29%) that reported this metric, and 55 (52%) of these were newly created articles. The number of articles created or improved per workshop ranged from 0 to 23, and the typical workshop did not create or improve any articles.
As part of the 42 events reporting (30%), 216 pages of text were added and 40 pages were removed. The number of text pages added or removed per workshop ranged from 0.0 to 38.1, and the average workshop added or removed 0.7 pages of text.
Text pages are a way to frame how much content is added to articles. For this report, one page of text was defined as 1,500 characters in any language. From the 42 workshops reporting bytes added, the average event resulted in about 0.7 total text pages. Counting pages of text added and removed during workshops, a total of 215.5 pages of text were added, while 40.1 pages of text were removed. The average number of pages added was 0.6; the average number of pages removed was 0.0.
Only 14 workshops reported information on budget, number of participants, and text pages added or removed. Among these, budget per week showed no significant relationship with the number of participants or with text pages added or removed; likewise, the number of participants showed no significant relationship with text pages.
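The pairwise checks above can be sketched with a simple correlation coefficient. This is a minimal illustration only: `pearson_r` is a helper name chosen here, and the budget and text-page numbers below are hypothetical placeholders, not the 14 workshops' actual data.

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation coefficient for two equal-length samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical values for 14 reporting workshops (placeholders only).
budget_per_week = [100, 250, 80, 400, 150, 90, 220, 310, 60, 180, 130, 275, 95, 340]
text_pages_chg  = [0.5, 0.0, 1.2, 0.3, 0.7, 0.9, 0.1, 0.4, 1.0, 0.6, 0.8, 0.2, 1.1, 0.0]

r = pearson_r(budget_per_week, text_pages_chg)
print(round(r, 2))  # a value near 0 would indicate no linear relationship
```

In practice a significance test (e.g. a p-value against a 0.05 threshold) would accompany the coefficient; with only 14 observations, even moderate correlations can fail to reach significance.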
For each workshop, we assessed whether users edited Wikimedia projects during three post-event time windows: the first, third, and sixth months following the event start date. Data was available for 38% of events.
Six-month retention data was available for 42 workshops. Of the 78 new users involved in these events, 3 (4%) were still active users 6 months after their workshop. Of the 168 existing editors involved, 113 (67%) were retained as active users 6 months after the workshop in which they participated.
A few things to remember with this section:
A follow-up window is a 30-day window a certain distance from the event start date. For editing workshop retention, one-, three-, and six-month follow-up windows were assessed, meaning user activity was examined during the first, third, and sixth months after the event start date, respectively.
An active editor is a user who made 5 or more edits during the follow-up period. We examined editing activity both in terms of edits made to any Wikimedia project and in terms of edits made only to the main project of the event, but in no case did these two metrics differ.
A survived editor is defined as a user who made at least one edit during the follow-up period. Again, we looked at editing activity both for any project and for the event's main project specifically, and there were no differences.
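The two retention definitions above can be expressed as a small classification rule: within a follow-up window, a survived editor made at least one edit and an active editor made five or more. The function name `classify_editor` is chosen for this sketch and is not from the report.

```python
# Sketch of the retention definitions: within a 30-day follow-up window,
# "survived" means 1+ edits and "active" means 5+ edits.

def classify_editor(edit_count):
    """Return retention labels for one editor's edit count in a window."""
    return {
        "survived": edit_count >= 1,
        "active": edit_count >= 5,
    }

print(classify_editor(0))  # neither survived nor active
print(classify_editor(3))  # survived, but not active
print(classify_editor(7))  # both survived and active
```

Note that every active editor is also a survived editor by construction, which is why survival counts are always at least as large as active counts at each follow-up point.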
For the one-month and three-month windows, data was available on 112 events, involving a total of 97 new users and 182 existing users. For the six-month window, data was available for 97 events, involving a total of 78 new users and 168 existing users.
The dots in the green and blue columns of the chart below represent the number of editors who were active (black dots) or survived (white dots) at each follow-up point.
Information on replication and shared learning was submitted for 14% of events (19 in total), and 95% of these were run by experienced program leaders. Furthermore, 63% (11) reported using blogs or other online resources to tell others about the event; 16% (3) produced guides or instructions on how to run a similar event; and no events created brochures or other printed materials. (See graph b)
Resources generated for replication and shared learning.
↑Although we list content production as an outcome, it can be an output or an outcome, depending on the logic model of the workshop organizer.
↑The text page metric was used in a previous report for the Wikipedia Education Program, and we use it here as an intuitive way to illustrate the amount of content produced during workshops. We are able to obtain the number of bytes added to or removed from an article and then convert that to characters added or removed. One byte does not equal one character in all languages: for example, while Latin and Cyrillic characters are about 1 byte per character, Arabic or Armenian characters are about 2 bytes per character. We count the number of characters to determine the number of text pages. In English, about 1,500 characters equals one Letter-size page with double-spaced text, including any wiki markup.
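The bytes-to-pages conversion in this footnote can be sketched as follows. The `text_pages` function and the per-script ratio table are illustrative names for this sketch; the ratios follow the approximations stated in the footnote rather than exact encoding rules.

```python
# Sketch of the footnote's conversion: bytes -> characters (using an
# approximate bytes-per-character ratio for the script) -> text pages
# at 1,500 characters per page.

CHARS_PER_PAGE = 1500
BYTES_PER_CHAR = {   # approximate ratios, per the footnote
    "latin": 1,
    "cyrillic": 1,
    "arabic": 2,
    "armenian": 2,
}

def text_pages(bytes_changed, script="latin"):
    """Convert a byte delta to an approximate number of text pages."""
    chars = bytes_changed / BYTES_PER_CHAR[script]
    return chars / CHARS_PER_PAGE

print(round(text_pages(3000, "latin"), 1))   # 2.0 pages
print(round(text_pages(3000, "arabic"), 1))  # 1.0 page
```

The same byte delta thus yields different page counts depending on the script, which is why character counts, not raw bytes, are used to compute text pages.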