Research:Content Translation Newcomer Survey, India 2020

From Meta, a Wikimedia project coordination wiki
15:56, 11 June 2020 (UTC)
Duration:  2020-February – 2020-April
This page documents a completed research project.

Overview: Currently, most feedback on Content Translation is received through informal channels and may not adequately represent the diversity of voices of those who use the tool. Moreover, there is no easy way for edit-a-thon and community organizers to help solicit feedback on the tool during events involving Content Translation. The goal of this project was to develop and iterate upon an easily customizable survey that can be used to collect feedback more easily, quickly, and reliably from a more diverse pool of Content Translation users. During the project, the survey was administered at a series of edit-a-thon events utilizing the Content Translation tool, and the results provide insight into the Content Translation newcomer experience.

Content Translation Newcomer Experience Survey

10 Key Takeaways

  1. A survey administered in coordination with edit-a-thon events was an effective way of collecting feedback from new CX users. It also provided a sample balanced in terms of gender, with representation from many language communities.
  2. Administering the survey in coordination with an edit-a-thon allowed us to collect feedback on general usability and perceptions, but not on discovery of the tool (since the introduction was guided). Administering it via the live interface would provide a better opportunity to collect feedback on entry points.
  3. Speed matters. When examining what users enjoyed most about the tool, features that benefit speed of translation and article creation came in at the top.
  4. Vocabulary support options are valued as highly as the overall availability of machine translation itself.
  5. Top dislikes about the tool were related to worry about whether users' articles would be accepted (not deleted), and to technical issues with different content types (infoboxes, etc.).
  6. To build new users' confidence, we can ease the publication process and help reassure them that their articles will survive.
  7. Nearly 40% of respondents did not realize that they would need to improve the machine translations produced in CX. There are opportunities to better surface the nature of human-MT interactions.
  8. Receptiveness to Section Translation was around 82% overall: 52% responded favorably and 30% were neutral, suggesting they may not have considered the option before being asked about the possibility.
  9. Minimal gender differences were noted, both in what users enjoyed and in what they disliked most. A few gender-related tendencies were observed, but they were only marginally statistically significant.
  10. Potential differences by language community were explored, but a greater sample size is needed to test the statistical significance of such differences and ensure a representative sample.

NOTE: An updated, modified version of the survey has been added to Best Practices for Content Translation Events as an additional resource for collecting feedback.