Research:Testing Snuggle’s capacity to enhance the experience of new and experienced editors

Created: 10:44, 1 May 2018 (UTC)
Contact: no affiliation
Duration:  2018-May — 2019-May

This page documents a planned research project.
Information may be incomplete and change before the project starts.


Wikipedia’s mission to provide a free encyclopedia depends not only on a motivated corps of experienced editors, but also on the ongoing recruitment and retention of new editors. Just as critical, Wikipedia requires that its pages be protected against vandals intent on diminishing their quality. Pursuing these two goals - retaining new editors and protecting against vandalism - may sometimes be at odds. In particular, Halfaker et al. (2013) suggest that the AI tool Huggle, introduced to help editors identify vandals, may inadvertently discourage new editors by rejecting contributions that, while falling short of Wikipedia standards, have been made in good faith. At the same time, seeing newcomers primarily as vandals can be demoralizing to experienced editors. To counterbalance the potential negative effects of Huggle, Halfaker introduced Snuggle, a tool designed to give experienced Wikipedians the ability to identify and mentor good-faith new editors.

In this project, one of two studies CivilServant plans in collaboration with Aaron Halfaker, we aim to partner with four Wikipedia language communities to test whether Snuggle can both retain new editors and enhance the experience of experienced editors. The study will use randomized trials, but the exact design will be developed in collaboration with the partnering Wikipedia communities, drawing on their needs and insights. At the completion of the study we will report to each partnering community on the effectiveness of Snuggle in its Wikipedia, and to the broader Wikipedia community on the potential of Snuggle to improve outcomes across Wikipedia language communities.

CivilServant and "Citizen Behavioral Science"

CivilServant is a nonprofit, the product of Nathan Matias’ PhD project at MIT’s Media Lab, that collaborates with online communities to improve the quality and scope of engagement on their platforms. CivilServant is committed to the principles of "Citizen Behavioral Science": using experimental tools to identify designs that serve an online community well, while working with that community to ensure that the experimental process is open, transparent, and driven by its insights and needs. In the past, CivilServant has worked with multiple reddit communities. CivilServant is incubated by the citizen media organization Global Voices, which has a history of supporting people around the world in adding indigenous-language material to the web, including Wikipedia. CivilServant is funded through donations from individuals and foundations; it does not take funds from for-profit corporations.

Methods

In this study we will test the effectiveness of Snuggle, an AI-supported mentoring tool for Wikipedia editors, to see whether its use increases the retention of new editors, enhances the experience of the experienced editors who use it, and continues to minimize the harm done by vandals. The basic design of the study is described below, but the final design - including the exact conditions, the recruitment of volunteers, and the outcome measures - will be determined in collaboration with the four partnering Wikipedia communities.

The experiment will include randomization at two levels. At the first level, newcomers will be randomly assigned to one of at least two conditions: inclusion in Snuggle, giving them the opportunity to be identified and mentored by an experienced editor, or a control condition in which they are not included in Snuggle. At the second level, mentors who volunteer for the randomized trial will likewise be assigned to one of at least two conditions: using Snuggle to identify and mentor newcomers, or a control condition in which they do not mentor newcomers. A minimal sketch of this two-level assignment appears below.
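To make the design concrete, here is a minimal Python sketch of how such a two-level random assignment might work. The condition names, participant lists, and seeds are hypothetical placeholders, not part of the study protocol; the actual conditions will be set with the partnering communities.

```python
import random

def assign_conditions(user_ids, conditions, seed):
    """Randomly assign each user to one of the given conditions."""
    rng = random.Random(seed)  # fixed seed so the assignment is reproducible
    return {uid: rng.choice(conditions) for uid in user_ids}

# Hypothetical participant lists; in practice these would come from
# new-account data and mentor sign-ups.
newcomer_ids = ["Newcomer1", "Newcomer2", "Newcomer3", "Newcomer4"]
mentor_ids = ["MentorA", "MentorB"]

# Level 1: newcomers are either surfaced in Snuggle or held out as controls.
newcomer_arms = assign_conditions(newcomer_ids, ["snuggle", "control"], seed=1)

# Level 2: volunteer mentors either mentor through Snuggle or do not mentor.
mentor_arms = assign_conditions(mentor_ids, ["snuggle_mentor", "control"], seed=2)

print(newcomer_arms)
print(mentor_arms)
```

Note that independent per-user draws like these do not guarantee equally sized arms; a real deployment might use blocked or stratified randomization instead.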

The primary outcome of interest is newcomer retention (i.e., whether newcomers continue to contribute to Wikipedia); one hypothetical way to operationalize it is sketched below. Other possible measures include newcomers’ and experienced editors’ attitudes towards other Wikipedians (measured by survey), the number of removed comments, and the number of overall edits.
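As an illustration only, the following sketch treats a newcomer as retained if they make at least one edit within a follow-up window after registration. The window boundaries (days 30 to 60) are invented placeholders; the study's actual retention measure will be defined with the partnering communities.

```python
from datetime import datetime, timedelta

def is_retained(signup, edit_times, start_days=30, end_days=60):
    """Count an editor as retained if they edit at least once in the
    window [signup + start_days, signup + end_days]."""
    window_start = signup + timedelta(days=start_days)
    window_end = signup + timedelta(days=end_days)
    return any(window_start <= t <= window_end for t in edit_times)

# Example: a newcomer who registered on 1 May 2018 and edited again on 5 June.
signup = datetime(2018, 5, 1)
edits = [datetime(2018, 5, 1), datetime(2018, 6, 5)]
print(is_retained(signup, edits))  # True: the second edit falls in the window
```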

Partnering Communities

The project is currently reaching out to a few Wikipedia communities that have already expressed interest in Snuggle, looking for a pilot partner to help us internationalize Snuggle and set up a baseline experimental design. We will soon announce a broad call to Wikipedia language communities to find three additional partners, who will design and run a second round of experimental trials. To maximize the effectiveness of those trials, we will seek to collaborate with communities that are geographically diverse, are strongly committed to testing the potential impact of Snuggle, and have enough experienced editors willing to act as mentors.

If you are active in a Wikipedia language community and want to hear from us when our formal call for partners goes out, please add your username here.

Timeline

Summer 2018: Work with pilot community and identify three additional partnering communities
Late summer / early fall 2018: Work with communities to tailor experiment design
Fall 2018: Run study
Winter 2019: Report to partnering communities and the Wikipedia community as a whole

Policy, Ethics and Human Subjects Research

CivilServant will work with the Princeton University Institutional Review Board to ensure that the study’s design and consent procedures are in line with the ethical standards applied to scholarly research. In addition to those requirements, CivilServant will work with participating Wikipedia communities to ensure that the design respects the dignity of all participants. When those plans are complete and approved, we will post them here.

Results

Results of the study will be posted here. We will report not only on the effectiveness of Snuggle in each of the four communities that tested it, but also on its potential to improve the experience of newcomers and experienced editors in other Wikipedia communities.

References

Halfaker, A., Geiger, R. S., Morgan, J. T., & Riedl, J. (2013). The rise and decline of an open collaboration system: How Wikipedia’s reaction to popularity is causing its decline. American Behavioral Scientist, 57(5), 664-688.