Community Engagement Insights/Frequent questions
- 1 About CE Insights
- 1.1 Goals
- 1.2 About the surveys
- 1.3 Evaluation
- 1.4 Frequently Asked Questions
- 1.4.1 Community members don't like taking really long surveys. Are these going to be very long surveys?
- 1.4.2 How will we make sure users respond to questions that are most relevant to them?
- 1.4.3 If our team would like to do more research about a result from a survey, would the Learning & Evaluation team be able to help?
About CE Insights
Community Engagement Insights is a new process for teams at the WMF to gather data from the communities they serve. As the WMF guiding principle of shared power states, the "...Wikimedia Foundation works in partnership with a global community of volunteers made up of article writers, copy-editors, photographers, administrators, page patrollers, quality assessors, translators, wiki-gnomes, help-desk staffers, developers, bot creators, people who do outreach work and many others." CE Insights is a means toward this goal of working in partnership with communities: it measures, year over year, how well we are engaging with our global community in the work that we do and what impact the WMF's work is having.
CE Insights has 4 goals:
- Goal 1: Improve alignment with Wikimedia global communities - To use surveys as a means to improve alignment between the WMF and communities, through needs assessment and project/program evaluation.
- Goal 2: Collect and deliver survey data for annual planning - To gather the most useful year-over-year data from community audiences, especially those who use WMF products and staff support.
- Goal 3: Reduce overhead for individual teams by design through collaboration - To offer centralized support to teams across the WMF and, through collaboration, reduce overhead for teams.
- Goal 4: Avoid survey fatigue - To coordinate data collection efforts across teams in order to reduce survey fatigue for various projects.
In sum, the project works to gather feedback and opinions from communities that can directly affect the work of the Wikimedia Foundation.
About the surveys
Annual WMF Performance Survey
The annual survey is the largest of the three surveys. It focuses heavily on information that can support teams in annual planning, often tied to KPIs. The performance survey will focus only on 5 core audiences, or audiences within these 5 major audiences. Most importantly, the survey employs a collaborative process: if a team at the WMF proposes questions, that team must also support the survey process in some way.
Pulse surveys
Pulse surveys were an idea for shorter surveys offered throughout the year. We do not currently have the capacity to run more than one survey.
Evaluation
Evaluation of CE Insights will be done in two ways. First, we will use the CE Insights surveys, as well as annual WMF staff surveys, to evaluate the project. In addition, we will conduct a process evaluation to determine how useful the process was for the staff and community members involved. The process evaluation and surveys will result in annual recommendations for improvement, which will be published in June or July of each year, during the design of the following year's process.
Goals will be measured by the following two surveys:
- Annual CE Insights staff survey: Goals 1, 2, and 3
- WMF performance survey: Goals 1 and 4
For more information, see the Evaluation section.
Frequently Asked Questions
Community members don't like taking really long surveys. Are these going to be very long surveys?
There are three main ways we will keep these surveys short:
- We will focus on the most important questions we need answered to make decisions as an organization.
- The surveys will be split up by audience, so each survey stays specific to the audience of interest. We will do this by using specific avenues for reaching users.
- We will use screening questions so that users only answer the questions that are most relevant to them. For example, if someone did not engage in the WMF strategy process, they will not be asked any questions about the strategy process.
How will we make sure users respond to questions that are most relevant to them?
There are two ways we will do this. First, there will be separate surveys for each audience, and we will reach these audiences in a specific way. For example, to reach developer/tech collaborators, we can survey them on Labs, the MediaWiki wiki, Phabricator, and other technical spaces.
A second method will be heavy use of "screening questions", so that respondents do not answer questions about everything. We will only ask users to review or give feedback on activities they have participated in. For example, they will indicate whether they have engaged in software processes, tool development, WMF-hosted hackathons, etc. They will likely choose the 3 areas that are most important to them and then receive a set of questions related to those areas.
If our team would like to do more research about a result from a survey, would the Learning & Evaluation team be able to help?
The Learning & Evaluation team would be happy to help you design and plan any follow-up studies based on the results of CE Insights. Research projects are a great opportunity for your team to learn how to conduct interviews, focus groups, surveys, or other studies. L&E would maintain an advisory role. Other teams, such as the Research or Design Research teams, would likely be able to help as well.