Editing workshops are events that focus on educating the public about how Wikipedia and its sister projects work and how to contribute to them.
Unlike edit-a-thons, workshops do not always include hands-on editing. Workshops, which can last from one or two hours to an entire afternoon, are generally hosted in public venues such as universities, libraries, and community centers. They can be held for specific groups, such as academics, students, women, or the elderly, and may also take place at major conferences, or even online through webinars and online conferences.
Workshops are often facilitated by experienced Wikipedians, who may discuss the history of Wikipedia and its mission, present the basics of how to edit, explain key policies, and perhaps share additional information about the culture and community. Pamphlets may be handed out to serve as references for participants, and some workshops even award certificates.
Editing workshops are one of the oldest outreach events in the Wikimedia community. In 2004, Italian community members hosted workshops to educate and inspire the public to participate in Wikipedia and related projects. After 2004, editing workshops quickly expanded to other countries. Today, Wikipedians around the world invite the public to learn why Wikipedia matters and how to contribute to it.
Response rates, data quality, report limitations 
In total, 168 unique editing workshop events run between September 2013 and November 2014 were identified for inclusion in this report. However, insufficient data could be obtained for 27 of these events, so data on a total of 141 events were included in this report.
Data on workshops were collected from three sources:
(1) directly from the program leaders;
(2) from publicly available information on organizer websites and on-wiki reports; and
(3) through WMF Labs tools such as Wikimetrics, Quarry and Catscan.
The data obtained included: number of participants, event start and end times, number of bytes added and removed, number of pages created, number of pages improved, information on the goals of each workshop, and information on inputs to each workshop. Only a minority of events reported key inputs such as budget, staff hours, and volunteer hours, and this information cannot be mined from other sources. Thus, while we explore these inputs, we cannot draw many strong conclusions about program scale or about how these inputs affect program success.
In addition, the data for workshops are not normally distributed; the distributions are, for the most part, skewed. This is due partly to small sample sizes and partly to natural variation, and it does not allow for comparison of means or for analyses that require normal distributions. Instead, we present the ''median'' and ranges of metrics, and we use the term ''average'' to refer to the median, since the median is a more ''statistically robust'' average than the ''arithmetic mean''. To give a complete picture of the distribution of the data, we also include means and standard deviations as references.
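The effect of skew on these two averages can be sketched with Python's standard library. The participant counts below are hypothetical, chosen only to illustrate how a single large event pulls the arithmetic mean upward while leaving the median unchanged:

```python
import statistics

# Hypothetical participant counts for seven workshops:
# most events are small, but one large event skews the distribution.
participants = [8, 10, 12, 12, 15, 18, 120]

mean = statistics.mean(participants)      # pulled upward by the outlier
median = statistics.median(participants)  # robust to the extreme value

print(f"mean:   {mean:.1f}")  # 27.9
print(f"median: {median}")    # 12
```

Here the mean (27.9) suggests a typical workshop nearly twice the size of any but the largest event, while the median (12) better reflects a typical event, which is why the median is reported throughout.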
To see the summary statistics of data reported and mined, including counts, sums, arithmetic means, and ''standard deviations'', see the appendix.
Program leaders reported a total of 20 priority goals. The most frequent goal was building and engaging community and the second was increasing awareness of Wikimedia projects.
We asked program leaders to select their priority goals for workshops. Priority goals were reported for 24% of editing workshop events (35 in total), with the number of goals per event ranging from 5 to 17, with an average of 8.
As shown in Table 1 below, over 50% of events selected the same 8 goals. The two most common goals were building and engaging community (91%) and increasing awareness of Wikimedia projects (91%). The top five goals also included: increasing participant diversity (selected by 83% of workshop program leaders), increasing volunteer motivation and commitment (71%), and increasing skills for editing/contributing (66%).
↑Many thanks to the builders of these tools, especially: Magnus Manske, who created Catscan; YuviPanda, who created Quarry; and WMF Analytics and data researchers for Wikimetrics.
↑We provided a list of 19 priority goals, identified at the 2013 Budapest training, with a 20th option to write in other goals. Program leaders could select as many or as few as they saw fit.