CivilServant's Wikimedia studies

From Meta, a Wikimedia project coordination wiki

CivilServant is a nonprofit that collaborates with online communities to test ideas for helping those communities flourish. We are currently planning studies with Wikipedia communities to test design interventions aimed at retaining new editors and at enhancing the experience and motivation of experienced editors. This project is also a first step towards what we hope will be further projects that help Wikipedia communities answer the questions that matter to them.


CivilServant is committed to the principles of "Citizen Behavioral Science": using experimental tools that can effectively identify designs that serve an online community well, while working with that community to ensure that the experimental process is open, transparent, and driven by the community's own insights and needs.[1]

CivilServant is an outgrowth of Nathan Matias’ PhD research with Ethan Zuckerman at the MIT Media Lab. CivilServant is now being incubated as a non-profit by citizen media organization Global Voices, which has a history of supporting people around the world to add indigenous language material to the web, including Wikipedia. CivilServant is funded through donations from individuals and foundations. It does not take funds from for-profit corporations.

In the past two years, CivilServant has worked with communities on reddit to test ideas for preventing harassment, managing misinformation, and managing conflict in political discussions.

Current Wikipedia projects

CivilServant currently plans to complete studies in collaboration with non-English language Wikipedia communities, who will be the primary partners for our research. These projects are financially and administratively independent from the Wikimedia Foundation. We expect to collaborate with individual WMF employees or WMF teams as we discover areas where our community-shaped work needs advice or support from the foundation. We expect to run our research systems on the Wikimedia Labs infrastructure.

Current advisors to CivilServant's research with Wikipedians include Aaron Halfaker (Wikimedia Research), María Sefidari (Elected WMF Board Member), and Dariusz Jemielniak (Elected WMF Board Member).

Gratitude prompts

In this study, we aim to test whether prompts to thank other Wikipedia contributors can enhance editors' experience and motivation. The basic design of the study is described below, but the exact design - including the treatment conditions and outcome measures - will be determined in collaboration with four partnering Wikipedia communities.

In this research, we plan to test two kinds of appreciation messages. The first system, "Thanks", allows readers to privately thank a contributor for a specific contribution on a Wikimedia project, such as a new article or a spelling correction. A notification of appreciation is then sent to the contributor. A second system, "Love", allows any authenticated reader (someone with a username and password on the site) to send a "love": a personalized message posted to a public page that lists all of the appreciation that person has received. In both cases, we will randomly assign participants to receive a prompt to express appreciation for others' contributions and observe the outcomes for senders and receivers.

The primary outcome of interest is editor productivity (i.e. whether editors make more contributions if a page they edited had a gratitude prompt), and other possible measures include readers’ and editors’ attitudes towards other Wikipedians (determined by survey), and cascade effects (i.e. if receivers of gratitude send gratitude messages themselves).
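The randomized design described above can be sketched as follows. This is a hypothetical illustration of splitting participants into a prompted (treatment) and unprompted (control) group; the actual assignment procedure, group sizes, and participant pool will be determined with the partner communities.

```python
import random

def assign_conditions(participants, seed=42):
    """Randomly split participants into a treatment group (shown a
    gratitude prompt) and a control group (no prompt). A minimal
    sketch, not the study's actual assignment procedure."""
    rng = random.Random(seed)  # fixed seed so the assignment is reproducible
    shuffled = participants[:]
    rng.shuffle(shuffled)
    half = len(shuffled) // 2
    return {
        "treatment": shuffled[:half],  # receive the prompt to thank others
        "control": shuffled[half:],    # baseline for comparing outcomes
    }

# Example with hypothetical editor accounts
groups = assign_conditions(["editor_a", "editor_b", "editor_c", "editor_d"])
```

Because assignment is random, any systematic difference in later contribution rates between the two groups can be attributed to the prompt rather than to pre-existing differences between editors.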

Retaining newcomers

Wikipedia’s mission to provide a free encyclopedia depends not only on a motivated corps of experienced editors, but also on the ongoing recruitment and retention of new editors. WMF has made the retention of newcomers one of its initiatives in its Growth Team. In independent but complementary work, we plan to collaborate with non-English language Wikipedias to test their ideas for retaining newcomers. One such study would test the effectiveness of French Wikipedia's welcome message.

In planning our research projects we initially anticipated testing the effectiveness of Snuggle to retain new editors while at the same time enhancing the experience of experienced editors. One of Snuggle's advantages is its ability to help identify "goodfaith" newcomers using machine learning. While the communities we have spoken to are strongly interested in mentoring tools and find Snuggle promising, we are now exploring using a modified version of Snuggle or other applications of machine learning to help communities identify newcomers that may most benefit from mentorship or other kinds of support.

WikiLovesAfrica 2019 recruitment

WikiLovesAfrica[2] is an annual photography contest where anyone across Africa can contribute media to Wikimedia Commons. From January through May 2019, CivilServant and the Princeton University class SOC412 plan to work with WikiLovesAfrica to test messages that recruit people to participate, and also to test messages that guide people to add accurate metadata to the images they uploaded.

Partnering with Wikipedia communities

The project is currently reaching out to a number of Wikipedia communities we have identified as both having ORES integration (necessary for using AI technology to identify goodfaith newcomers) and enough newcomers each month to give the study adequate statistical power. Those Wikipedias are: Arabic, French, Persian, Polish, Portuguese, Russian and Spanish. The analysis we used to select those Wikipedias is described here: CivilServant Initial Data Analysis For Community Outreach.

In each collaborating Wikipedia we intend to partner with a liaison in that community to act as our guide and research partner. If you are interested in being one of our liaisons, we invite you to read more about the role and, if still interested, to contact CivilServant's research manager Julia Kamin.

Transparency, Privacy, and Open Knowledge: How We Use Data

We approach every study with the expectation that it could contribute to scientific knowledge. For that reason, in addition to community partnership and review, we also ask a university ethics board (IRB) to review our research procedures.

In every study, we work to integrate strong privacy protections alongside our commitment to open, transparent, accountable research. While we sometimes introduce even greater privacy protections in especially sensitive cases, here are some of our common practices for most studies (as of Jan 21, 2019):

  • Planning:
    • We always consult with Wikipedia communities on the risks and benefits of our research and data collection before starting a study
  • Data storage:
    • When conducting interviews or surveys, we store contact information separately from personal responses
    • We store and analyze survey data on an access-controlled filesystem that is encrypted at rest
    • We comply with GDPR requests for removal of personal information
    • A year after a study concludes, our standard practice is to delete contact information associated with the study
  • Data publication
    • To protect the anonymity of our participants, we will never publish names or contact information and will never share this information with third parties
    • We never publish open data of survey answers that can be linked back to individual users
    • Before deciding if and how to publish open data, we consult with our community liaisons and carry out a threat model to ensure that our plan adequately manages the risks to participants
    • When publishing data for research transparency, we often use the following practices to reduce the risk of de-anonymization:
      • Generating study-specific unique identifiers for accounts
      • Omitting all information that is not essential to the analysis
      • Publishing aggregate counts rather than individual observations (for example, the number of edits rather than publishing each edit)
      • Reducing the precision of measures (for example, reporting registration year rather than a registration timestamp)
      • Reporting coarse categories rather than exact values (for example, reporting someone as "experienced" rather than their precise count of previous edits)

Project Updates


This project's team currently includes:


Funding For CivilServant's Work with Wikipedia

This project was made possible through the support of a grant from Templeton World Charity Foundation, Inc., after an application process and review by academic reviewers. The foundation is supporting this grant because of its interest in our research questions and to help CivilServant grow its core operations. Another goal of the grant is to develop software and processes that could help Wikipedians conduct future research on questions that matter to their communities. As an independent research project, CivilServant's views are our own and do not necessarily reflect the views of Templeton World Charity Foundation, Inc. or the Wikimedia Foundation.


  1. Matias, J. N., & Mou, M. (2018, April). CivilServant: Community-Led Experiments in Platform Governance. In Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems (p. 9). ACM.
  2. "Celebrating Africa on Wikipedia". Wiki Loves Africa. Retrieved 2018-11-24.