Community Engagement Insights 2016-17 Report: Survey design and process


Collaborative survey question selection

Question selection

Each team at the Wikimedia Foundation was allowed at most 20 questions and up to two audiences to survey. Teams were asked to submit not only their questions, but also their future goals and how they intended to use the survey results. This process ensured that every question asked would serve a specific purpose for the staff at the Foundation who need to make decisions based on community input and community needs. In the end, 13 teams submitted questions to the survey. In addition, the Affiliations Committee had been planning to survey affiliates, so its questions were incorporated with those of the Learning & Evaluation team.
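As an illustration of these submission constraints, here is a minimal sketch that checks a team's proposal against the 20-question and two-audience limits. The data structure and field names are hypothetical; the actual intake ran through shared documents, not code.

```python
MAX_QUESTIONS = 20
MAX_AUDIENCES = 2

def validate_submission(team, questions, audiences, goals):
    """Check a team's survey proposal against the intake limits.

    All parameters are hypothetical illustrations of the information
    teams were asked to provide, including a statement of goals.
    """
    problems = []
    if len(questions) > MAX_QUESTIONS:
        problems.append(f"{team}: {len(questions)} questions exceeds {MAX_QUESTIONS}")
    if len(audiences) > MAX_AUDIENCES:
        problems.append(f"{team}: {len(audiences)} audiences exceeds {MAX_AUDIENCES}")
    if not goals:
        problems.append(f"{team}: missing goals / intended use of results")
    return problems

# Example: a submission that stays within both limits returns no problems.
print(validate_submission(
    team="Learning & Evaluation",
    questions=["How satisfied are you with ...?"] * 18,
    audiences=["active editors", "affiliates"],
    goals="Inform next year's community programs",
))  # -> []
```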

Resourcing

This survey was complex. While the Learning & Evaluation team led the effort, the survey would not have been possible without the support of multiple teams from the Foundation as well as volunteers. The expertise for a major survey lies across the Foundation's teams, and no single team has sufficient resources or time to conduct this research on its own. Thus, the process relied on a highly collaborative approach, where teams that wanted data from the survey also donated their time to the effort. To carry out the survey, the Foundation used Qualtrics, primarily for its ability to translate questions and to create the very complex skip patterns that were essential for this survey.
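Skip patterns (also called skip logic or branching) route respondents past questions that do not apply to them, based on their earlier answers. The sketch below is a generic illustration of the idea in Python, not Qualtrics' actual configuration format or API, and the question IDs and wording are invented examples.

```python
# A minimal model of survey skip logic: each question may carry a
# rule that decides, from earlier answers, whether it is shown.
questions = {
    "Q1": "Do you edit Wikimedia projects?",
    "Q2": "Which projects do you edit most?",
    "Q3": "How could we better support readers?",
}

# Skip rules: a question is shown only if its rule holds for the
# answers collected so far. Questions without a rule always show.
skip_rules = {
    "Q2": lambda answers: answers.get("Q1") == "yes",  # editors only
    "Q3": lambda answers: answers.get("Q1") == "no",   # readers only
}

def next_questions(answers):
    """Return the unanswered questions this respondent should see."""
    shown = []
    for qid in questions:
        rule = skip_rules.get(qid)
        if qid not in answers and (rule is None or rule(answers)):
            shown.append(qid)
    return shown

print(next_questions({"Q1": "yes"}))  # -> ['Q2']
```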

Question review, refinement, testing, and translation

The survey questions were reviewed and placed into a database of survey questions, which was shared with all staff for review. Once the review period ended, the surveys were uploaded into the Qualtrics survey software, and 40 participants helped test the survey. For testing, each question was followed by an additional question that asked, "Did you have any difficulty choosing a response or answering the question above?", with an open text field for the response. Testing served to make sure that the questions made sense and that the response options were clear. Once testing ended, translation began, with many volunteers supporting the work. Some volunteers translated the entire survey into one language, a huge undertaking given that the survey had 260 questions. These translations were then uploaded into Qualtrics, after which the survey was ready for distribution.
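With 260 questions and many volunteer translators, it is easy for a language to end up with a few untranslated items. A completeness check along the following lines could catch such gaps before upload. This is a hypothetical sketch with invented data structures; the actual workflow used Qualtrics' built-in translation tooling.

```python
def missing_translations(source_questions, translations):
    """Report which question IDs lack a translation, per language.

    `source_questions` maps question IDs to English text;
    `translations` maps language codes to {question ID: translated text}.
    """
    report = {}
    for lang, translated in translations.items():
        missing = [qid for qid in source_questions
                   if not translated.get(qid, "").strip()]
        if missing:
            report[lang] = missing
    return report

# Illustrative data: 260 source questions, one complete translation
# and one translation missing its last two items.
source = {f"Q{i}": f"English text of question {i}" for i in range(1, 261)}
translations = {
    "fr": {qid: f"Texte de {qid}" for qid in source},               # complete
    "de": {qid: f"Text zu {qid}" for qid in list(source)[:258]},    # two gaps
}
print(missing_translations(source, translations))  # -> {'de': ['Q259', 'Q260']}
```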

Methods for audience distribution

Overall process

Forthcoming

Audience-specific

Forthcoming