Research: Article feedback / Stage 1

Article Feedback v5 | Data & Metrics
- Stage 1: Design (December 2011 - March 2012)
- Stage 2: Placement (March 2012 - April 2012)
- Stage 3: Impact on engagement (April 2012 - May 2012)
- Conversions and newcomer quality
Helping readers improve Wikipedia: First results from Article Feedback v5
The Wikimedia Foundation, in collaboration with editors of the English Wikipedia, is developing a tool to enable readers to contribute productively to building the encyclopedia. To that end, we started development of a new version of the Article Feedback Tool (known as AFTv5) in October 2011. The original version of the tool, which allows readers to rate articles based on a star system, launched in 2010. The new version invites readers to write comments that might help editors improve Wikipedia articles. We hope that this tool will contribute to the Wikimedia movement’s strategic goals of increasing participation and improving quality.
Testing new feedback forms
On December 22, 2011, we started testing three different designs for the AFTv5 feedback forms:
- Option 1: Did you find what you were looking for? (shown above)
- Option 2: Make a suggestion, give praise, report a problem or ask a question
- Option 3: Rate this article
The purpose of this first experiment was to measure the type, usefulness and volume of feedback posted with these feedback forms. For example, does asking a reader to describe what they were looking for (option 1) provide more actionable feedback than asking them to make a suggestion (option 2)?
We enabled AFTv5 on a small, randomly selected set (0.6%) of articles on the English Wikipedia, as well as a second set of high-traffic or semi-protected articles. A feedback form, randomly selected from the above three options, was placed at the bottom of each page. The feedback form was also accessible via a link docked on the bottom right corner of the page. The resulting comments were then analyzed along a number of dimensions.
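As a rough illustration of the setup described above, random sampling and form assignment can be done with deterministic hash bucketing, so that a given article always sees the same form. This is only a sketch of the general technique; the function names, sample-rate constant, and form labels below are illustrative, not the actual AFTv5 code.

```python
import hashlib

FORMS = ["option1", "option2", "option3"]
SAMPLE_RATE = 0.006  # 0.6% of articles (as in the experiment)

def bucket(page_id: int, salt: str = "aftv5") -> float:
    """Map a page ID to a stable pseudo-random value in [0, 1)."""
    digest = hashlib.sha256(f"{salt}:{page_id}".encode()).hexdigest()
    return int(digest[:8], 16) / 0xFFFFFFFF

def assignment(page_id: int):
    """Return the feedback form shown on a page, or None if not sampled."""
    b = bucket(page_id)
    if b >= SAMPLE_RATE:
        return None  # article is not in the 0.6% random sample
    # Stretch the in-sample bucket back out to [0, 1) and pick a form.
    return FORMS[int(b / SAMPLE_RATE * len(FORMS))]
```

Because the bucket is derived from the page ID rather than drawn fresh on each request, assignments stay stable across page views without any server-side state.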
What we learned
We conducted several studies around these key research questions:
- Is this feedback useful to Wikipedia editors?
- How much feedback is posted for each option?
- What do readers think of these feedback forms?
To answer these questions, we ran an editorial evaluation of the feedback’s usefulness, an analysis of overall feedback volume, as well as a reader survey and usability study.
Is feedback useful?
We performed several rounds of blind assessment of the usefulness and type of feedback that we collected. To that end, a dozen Wikipedia editors volunteered to "hand code" hundreds of feedback posts: we are very grateful for their help, without which this analysis would not have been possible.
Overall, 45% of the feedback we examined was found useful by all coders ("everyone" in figure 2, middle column). About 65% of the feedback evaluated was marked as useful by at least one coder ("someone" in figure 2, left column). We also compared the usefulness of feedback collected via each feedback form, but we couldn’t find any significant differences between the three options.
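The two agreement measures above ("someone" and "everyone") fall out directly from the per-coder judgments. A minimal sketch, using made-up ratings rather than the real hand-coded data:

```python
# Hypothetical hand-coding results: post ID -> each coder's judgment
# (True = that coder marked the post as useful).
ratings = {
    "post-1": [True, True, True],
    "post-2": [True, False, True],
    "post-3": [False, False, False],
    "post-4": [True, True, True],
}

# "someone": at least one coder marked the post useful.
someone = sum(any(votes) for votes in ratings.values()) / len(ratings)
# "everyone": all coders independently marked the post useful.
everyone = sum(all(votes) for votes in ratings.values()) / len(ratings)

print(f"useful to someone: {someone:.0%}")
print(f"useful to everyone: {everyone:.0%}")
```

The "everyone" figure is the stricter of the two, which is why it is the lower bar in figure 2.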
Examples of feedback marked as useful include:
- This synopsis has the characters Nic and Jules reversed. (The Kids Are All Right)
- What is failure theory? Discuss its importance enumerate various failure theories and mention their field of application. (Positive psychology)
There were, of course, feedback posts that were not useful, such as:
- This guy is gorgeous, get one of the 5 million good pictures of Rick would ya? (Rick Santorum)
- I wish martin luther king jr was alive because the untied states would not be like this if martin luther king jr had died (Martin Luther King, Jr.)
Results of this initial feedback evaluation are encouraging: almost half of all comments in the randomly selected sample were judged useful for improving the encyclopedia. Usefulness does vary across individual articles, however. For example, feedback quality in the random sample was higher (45% useful) than in a smaller sample of popular and/or controversial articles (33% useful). One possible explanation is that the 115 articles we hand-picked were high-traffic and/or semi-protected pages (e.g. Barack Obama), which tend to attract a higher proportion of abusive feedback and irrelevant posts.
How much feedback is posted for each option?
Since the launch of AFTv5, we have collected nearly 30,000 feedback posts from the random sample (averaging 360 posts per day over the last month, about 70% of which included text). The vast majority of AFTv5 feedback (95%) was posted by anonymous users, which is consistent with our design goals for this feature and with what we found in AFTv4. We also observed that of the three feedback form designs, option 1 and option 2 generated significantly more posted comments than option 3.
Finally, we found that AFTv5 outperforms AFTv4 in the volume of feedback posted. This is partly because we included a feedback link docked at the lower right-hand corner of the browser window.
What do readers think of these feedback forms?
To learn what Wikipedia readers thought of the Article Feedback Tool when using it, we ran a short survey with people who had just posted feedback. A total of 1,472 people filled out that survey, with generally favorable responses: 64 percent found the feedback forms useful, on average. Option 1 and Option 3 were tied, with 66 percent approval for each, while 59 percent liked Option 2.
To understand how typical users experience the article feedback process, we also ran a small usability study with four users recruited through usertesting.com. Participants were invited to test one of the feedback forms, then suggest how the designs could be improved. Watching video screencasts of their sessions and hearing their reactions enabled our product team to refine the user interface design. Overall, users found the feedback process generally clear, but some were not sure how to edit articles.
Selecting a "winner"
After comparing the results of these various studies, we observed a slight overall preference for Option 1 (even though the other two options were also found useful). For that reason, we decided to optimize our design for Option 1 in the next phase of development. We will keep all other options in mind as we refine our design for the next version of this tool, and will consult with members of our community at each step of the way.
We are now working to make this feedback process more useful to editors and readers alike. Here are some of the features we plan to release and test in coming weeks:
New feedback links
Starting this week, we will test two new feedback links to invite more participation: a small text link below article titles; and a larger graphic button docked at the lower right corner of the browser window. The goal of this short test is to measure the impact of these more prominent links in getting users to contribute useful feedback and edits.
Browsing and filtering feedback
To help editors improve articles based on the best suggestions from our readers, we are developing a special feedback page. The goal of this page is to enable editors to view feedback from readers so they may act on the feedback, when appropriate. The Feedback Page will include tools to make it easy for editors to feature actionable feedback, as well as flag or hide offensive posts. In coming weeks, we will introduce this new set of tools to readers and editors, and refine them based on their reactions.
Measuring impact on engagement
Our next research phase will focus on the effects of feedback on editing activity. We will first analyze whether link prominence increases or decreases feedback quality and edit conversions. Second, we will test the feedback form against a direct call to edit Wikipedia articles. Lastly, we will analyze the revert and survival rate of edits in these different experimental conditions. In the process, we will measure clicks on various buttons of the AFTv5 feedback forms, as well as on edit buttons. These tests will be limited to the same small random sample of articles (0.6% of the English Wikipedia) and will only last a few weeks. Read more in our data and metrics plan.
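The revert and survival analysis could be computed along these lines; the condition labels and the toy edit log below are hypothetical, purely to show the calculation.

```python
from collections import defaultdict

# Hypothetical edit log: (experimental condition, was the edit reverted?)
edits = [
    ("option1", False), ("option1", True), ("option1", False),
    ("direct_call", True), ("direct_call", True), ("direct_call", False),
]

# Tally reverted and total edits per condition.
counts = defaultdict(lambda: [0, 0])  # condition -> [reverted, total]
for condition, reverted in edits:
    counts[condition][0] += reverted
    counts[condition][1] += 1

# Survival rate = share of edits that were not reverted.
for condition, (reverted, total) in sorted(counts.items()):
    print(f"{condition}: survival rate {1 - reverted / total:.0%}")
```

Comparing survival rates across conditions would then indicate whether edits prompted by the feedback form hold up as well as edits from a direct call to action.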
If you would like to contribute to this project, we’d love to have you on board. We are developing this new tool in collaboration with a workgroup of Wikipedia editors, with whom we meet regularly over IRC and other channels. We are looking for more volunteers to test and report on new features, and help improve this article feedback tool as a community.
Here are some of the ways you can contact us or learn more about AFTv5:
- AFTv5 overview page
- Comment on our talk page
- Sign up to evaluate feedback
- Email us at firstname.lastname@example.org
We'd like to give special thanks and recognition to some of the community members who have helped us develop this new tool. Workgroup participants include: Bensin, Dcoetzee, Dougweller, GorillaWarfare, RJHall, Sonia, Tom Morris and Utar. We are grateful to you all for your insights and commitment to this project!
We look forward to working with you all to extend the article feedback tool in coming months. Together, we hope to create a useful feedback system on Wikipedia, to help editors improve articles based on reader suggestions – and to invite readers to become editors over time.
On behalf of the Article Feedback Tool Team,
Fabrice Florin, Product Manager, Editor Engagement
Howie Fung, Director of Product Development
Aaron Halfaker, Research Analyst (contractor)
Oliver Keyes, Community Liaison, Product Development (contractor)
Dario Taraborelli, Sr Research Analyst, Strategy