Grants:Learning patterns/Framing survey questions


A learning pattern for: surveys

Framing survey questions


Problem:
If people misunderstand your survey questions, they may not answer them the way you intended.

Solution: When asking multiple-choice questions, make sure you provide choices for all the major types of answers you expect to receive, at the right level of specificity. Ask open-ended questions in a way that encourages people to elaborate, rather than give one word answers.

endorsed by: J-Mo



created on: 27 August, 2013

What problem does this solve?

When people misunderstand survey questions, they often don't answer them the way the survey creator intended. Unfortunately, by the time the survey creator realizes this, it's usually not feasible to go back and ask again, which means useful information may be lost.

Most people fill out surveys quickly, and they don't always read every question carefully. The survey creator usually only has one chance to ask these questions--most people are not willing to fill out the same survey twice! So it is not only important to ask the right questions, but also to ask them in the right way.

The "right questions" and the "right way" to ask may be different for each survey, but there are some good general principles that you can follow. These priniciples will help you make sure that survey participants understand what you're asking, and respond with the information you're asking for.

What is the solution?

When asking multiple-choice questions, make sure you provide choices for all the major types of answers you expect to receive, at the right level of specificity. Ask open-ended questions in a way that encourages people to elaborate, rather than give one word answers.

Asking multiple-choice questions

When you ask multiple-choice questions, make sure you cover all the reasonable options, but avoid asking for more detail than you actually need or than the participant can reasonably provide. The examples below show some of the ways you can design your questions to make sure that you get the most useful responses.

If you forget to provide the right options, you might not learn as much from the responses as you could have: if you're asking people how much they participate in Wikimedia projects, and some of your survey participants are not yet regular contributors, you might want to include an option for people who don't edit regularly (see example 1b below). Asking for unnecessary detail can be confusing and distracting, and can lead to survey fatigue (see example 1a below).

Examples
1a. (needs some work)
How active are you on Wikipedia?
  • 1-10 edits a month
  • 10-20 edits a month
  • 20-30 edits a month
  • 30-40 edits a month
  • 40-50 edits a month
  • 50-60 edits a month
...etc...
1b. (much better)
In a normal month, how many edits do you make to Wikipedia?
  • 1-10 edits
  • 11-100 edits
  • 101-1000 edits
  • more than 1000 edits
  • I don't regularly edit Wikipedia

Considerations

The second example is an improvement over the first example in several ways:

  • Ask for an appropriate level of detail. Most people won't be able to accurately tell you off the top of their heads whether they average 20 or 30 edits per month. And if you're just trying to get a sense of how active your participants are, you probably don't need that much detail. Someone who averages 500 edits per month is a lot more active than someone who makes 20. But someone who makes 400 edits and someone who makes 900 edits are both "highly active Wikipedians" by anybody's definition.
  • Use clear language. The first example asks "how active are you?" and then talks about edit counts. Edit count is a good measure, but it is only one measure of activity on Wikipedia! Make sure you are clear about what you are asking. The first example also repeats "a month" in every option; it would be more readable if it just specified this once, like the second example does.
  • Capture important exceptions. What if one (or more) of your participants doesn't edit Wikipedia at all? This might not be the case if you're doing a dispute resolution participant survey, but what if you are surveying conference participants, some of whom are not yet editors? You don't always have to cover every edge case, but you should try your best to anticipate the range of possible responses when you set up a multiple choice question.

Asking open-ended questions

Open-ended questions are used to draw out more detailed, personal responses from people in their own words. People may skip over these because they don't have anything to add or they don't feel like writing a lot. That's perfectly okay. However, you can increase both the number and the quality of your open-ended responses by phrasing them in a way that encourages storytelling.

Again, let's consider two different ways of asking the same questions:

Examples
2a. (needs work)
Has the kind of editing you do changed since you first joined Wikipedia?
2b. (much better)
How has the kind of editing you do changed since you first joined Wikipedia?


3a. (needs work)
What is the most difficult thing about editing Wikipedia?
3b. (much better)
Tell me about a recent editing experience where you felt frustrated.

Considerations

  • Avoid the yes/no trap. It's easy to imagine someone just answering "Yes" or "No" to Example 2a. If that's all you want to know, you should probably make this question multiple choice! But if you actually want to get some detail, you should ask them how their work has changed. This will prompt them to think for at least a few seconds before they answer. Their answer may still be "It hasn't changed", but they are more likely to explain why it hasn't changed—which is a useful thing to know.
  • Ask for concrete examples. People generally find it easier to remember concrete things than abstract ones. Specific examples are especially useful data, but it can be hard to think one up on command. Asking people to think of things that happened recently—things that are still fresh and clear in their memory—makes it easier to recall a specific example. If that makes them think of a better example that's less recent, even better!
  • Ask for their experience, not their opinions. In most cases, personal experiences are more valuable survey results than vague impressions, gut reactions or unsubstantiated opinions. Example 3a could be interpreted in any number of ways: does "difficult" mean a negative experience? or one that was challenging but ultimately positive? Asking someone how they were affected by an experience (i.e. how they felt) will lead to more accurate and useful feedback every time.

Examples

other examples of different ways of phrasing a question

See also

Related patterns

Other resources

other resources (links to surveys, articles about survey design, handy tools) related to this pattern