Grants:Learning patterns/Framing survey questions

A learning pattern for: surveys

Framing survey questions


Problem: If people misunderstand your survey questions, they may not answer them the way you intended.

Solution: When asking multiple-choice questions, make sure you provide choices for all the major types of answers you expect to receive, at the right level of specificity. Ask open-ended questions in a way that encourages people to elaborate, rather than give one word answers.

Endorsed by: J-Mo


Created on: 27 August 2013

What problem does this solve?

When people misunderstand survey questions, they often don't answer them the way the survey creator intended. By the time the survey creator realizes this, it's usually too late.

Most people fill out surveys quickly, and don't always read every question carefully. The survey creator usually has just one chance to ask these questions—most people are not willing to fill out the same survey twice! So it's important to ask the right questions, and to ask them in the right way.

The "right questions" and the "right way" may be different for each survey, but there are some good general principles that you can follow. Above all, try to make the survey as short as possible: that way, people are more likely to concentrate on giving reliable responses.

What's the solution?

When asking multiple-choice questions, provide choices for all of the major types of answers you expect to receive, at the right level of specificity. Ask open-ended questions in a way that encourages people to elaborate rather than give one-word answers.

Asking multiple-choice questions

Cover all of the reasonable options, but avoid asking for more detail than you need. The examples below show some of the ways you can design your questions to get the most useful responses. For example, if you're asking people how much they participate in Wikimedia projects and some of your survey participants are not yet regular contributors, you might want to include a "none of the above"-style option (like the last choice in example 1b below). Asking for unnecessary detail can be confusing and distracting, and can lead to survey fatigue.

Examples
1a. (needs some work)
How active are you on Wikipedia?
  • 1–10 edits per month
  • 10–20 edits per month
  • 20–30 edits per month
  • 30–40 edits per month
  • 40–50 edits per month
  • 50–60 edits per month
... etc ...
1b. (much better)
In a normal month, how many edits do you make to Wikipedia?
  • 1–10 edits
  • 11–100 edits
  • 101–1000 edits
  • more than 1000 edits
  • I don't regularly edit Wikipedia

Considerations

1b is an improvement over 1a in several ways:

  • Ask for an appropriate level of detail. Most people won't be able to accurately tell you off the top of their heads whether they average 20 or 30 edits per month. And if you're just trying to get a sense of how active your participants are, you probably don't need that much detail. Someone who averages 500 edits per month is a lot more active than someone who makes 20. But someone who makes 400 edits and someone who makes 900 edits are both "highly active Wikipedians" by anybody's definition.
  • Use clear language. The first example asks "how active are you?" and then talks about edit counts. This is a good measure, but it is only one measure of activity on Wikipedia! Make sure you are clear about what you are asking. The first example also repeats "per month" in every option; it would be more readable if it just specified this once, like the second example does.
  • Capture important exceptions. What if a participant doesn't edit Wikipedia at all? This might not be the case if you're doing a dispute-resolution survey, but what if you are surveying conference participants, some of whom are not yet editors? You can't always cover every conceivable case, but try to anticipate the range of possible responses.
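
If you also have raw activity data on hand (for example, edit counts pulled from contribution histories) and want to compare it with survey answers, it helps to group the raw numbers into the same ranges you offered in the survey. The sketch below shows one minimal way to do that for the ranges in example 1b; it is only an illustration, and the function name and sample counts are hypothetical.

# A minimal sketch: map a raw monthly edit count to the answer ranges
# used in example 1b. Function name and sample data are hypothetical.

def edit_count_bucket(edits_per_month):
    """Return the example 1b answer choice matching a raw monthly edit count."""
    if edits_per_month <= 0:
        return "I don't regularly edit Wikipedia"
    if edits_per_month <= 10:
        return "1-10 edits"
    if edits_per_month <= 100:
        return "11-100 edits"
    if edits_per_month <= 1000:
        return "101-1000 edits"
    return "more than 1000 edits"

# Group a few hypothetical raw counts into the survey's ranges.
for count in [0, 4, 37, 250, 1500]:
    print(count, "->", edit_count_bucket(count))

Choosing broad, meaningful ranges up front also makes comparisons like this easier later, since the raw data only needs to be sorted into a handful of buckets.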

Asking open-ended questions

Open-ended questions are used to draw out more detailed, personal responses from people, in their own words. People may skip these questions because they have nothing to add or don't feel like writing a lot. But you can increase both the number and the quality of the responses you get by phrasing your open-ended questions in a way that encourages storytelling.

Again, let's consider two different ways of asking the same questions:

Examples
2a. (needs work)
Has the kind of editing you do changed since you first joined Wikipedia?
2b. (much better)
How has the kind of editing you do changed since you first joined Wikipedia?


3a. (needs work)
What's the most difficult thing about editing Wikipedia?
3b. (much better)
Tell us about a recent editing experience where you felt frustrated.

Tips

  • Avoid the yes/no trap. It's easy to imagine someone just answering "Yes" or "No" to Example 2a. If that's all you want to know, make it a multiple-choice question. But if you want more detail, ask them how their work has changed. This prompts them to think for at least a few seconds before they answer. Their answer may still be "It hasn't changed", but they are more likely to explain why it hasn't changed, which is a useful thing to know. (A short sketch after these tips shows one way to count one-word answers once responses come in.)
  • Ask for concrete examples. People generally find it easier to remember concrete things than abstract ones. Specific examples are especially useful data, but it can be hard to come up with one on command. Asking people to think of things that happened recently, while events are still fresh and clear in their memory, makes it easier to recall a specific example. If that makes them think of a better example that's less recent, even better!
  • Ask for their experience, not their opinions. In most cases, personal experiences are more valuable survey results than vague impressions, gut reactions, or unsubstantiated opinions. Example 3a could be interpreted in several ways: does "difficult" mean a negative experience, or one that was challenging but ultimately positive? Asking someone how they were affected by an experience (how they felt) will usually lead to more accurate and useful feedback.
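
If you end up with a batch of open-ended responses, a quick way to check whether a question fell into the yes/no trap is to count how many answers are just a single word. The sketch below is only an illustration; the helper name and the sample responses are hypothetical.

# A minimal sketch: flag free-text responses that are only one word long
# (for example, a bare "Yes" or "No"). Helper name and sample data are hypothetical.

def is_one_word_answer(response):
    """Return True if a free-text response contains at most one word."""
    return len(response.strip().split()) <= 1

responses = [
    "Yes",
    "No.",
    "I mostly fix typos now instead of writing new articles.",
]
short = [r for r in responses if is_one_word_answer(r)]
print(f"{len(short)} of {len(responses)} responses were one-word answers")

A high share of one-word answers is a sign that the question should be reworded, or turned into a multiple-choice question as suggested in the first tip above.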

Examples

Other examples of different ways of phrasing a question.

See also

Related patterns

Other resources

Other resources (links to surveys, articles about survey design, handy tools) related to this pattern.