Grants:Programs/Wikimedia Research Fund/Disinformation, Wikimedia and Alternative Content Moderation Models: Possibilities and Challenges

Status: not funded
Start date: 2022-02-01
End date: 2022-12-15
Budget (USD): 40,000–50,000
Applicant(s): Ramiro Álvarez Ugarte
Applicant's Wikimedia username. If one is not provided, then the applicant's name will be provided for community review.

Ramiro Álvarez Ugarte

Project title

Disinformation, Wikimedia and Alternative Content Moderation Models: Possibilities and Challenges

Entity Receiving Funds

Provide the name of the individual or organization that would receive the funds.

Centro de Estudios en Libertad de Expresión y Acceso a la Información (CELE)

Research proposal


Description of the proposed project, including aims and approach. Be sure to clearly state the problem, why it is important, why previous approaches (if any) have been insufficient, and your methods to address it.

This research proposal seeks to answer a narrow question: how do Wikipedians identify trustworthy sources when discussing controversial Wikipedia articles in different cultural and social settings? The issue is relevant to addressing the “epistemic crisis” currently threatening democracies around the globe (Benkler, Faris, and Roberts 2018, chap. 1). The community-led moderation model championed by Wikipedia competes with the algorithmic and automated models promoted by social media platforms such as Facebook and Twitter. The two models ultimately rest on different conceptions of what the Internet is and should be. While the former builds on the idealized but nevertheless real idea of communities developed around the early horizontal and decentralized Internet (Lessig 2006, chap. 6), the latter is based on corporate definitions of rules, codes of conduct, and terms of service that speak of community but seldom practice it. Hence, understanding how a real community works in practice would be a fundamental contribution not only to the content-moderation anxiety currently spreading among regulators and civil society throughout the West, but also to the imperative task of imagining what the future of the Internet may look like under a different model.

Our approach builds on previous research on how Wikipedia's community-led moderation works (de Laat 2012; Grimmelmann 2015; Geiger and Halfaker 2016; Caplan 2021). Our method is to examine archives and logged discussions among Wikipedia editors and contributors (and possibly to interview some of them, if the findings call for it). To narrow down the research, we propose to identify three controversial issues in three different countries. While our approach is still tentative, we think that India, the United States, and Brazil are three likely fruitful settings in which to study how Wikipedia's community deals with disinformation and the issue of “trusted sources” in contexts of disagreement and controversy. The stakes involved in better understanding how community-led moderation works in practice are high, even considering our narrow question. A community-led moderation system that successfully sieves sources for trustworthiness may not only provide important insights for other moderation models but may also show how bottom-up approaches are more respectful of the fact of pluralism.


Approximate amount requested in USD.

40,000–50,000 USD
Budget Description

Briefly describe what you expect to spend money on (specific budgets and details are not necessary at this time).

  • Research - 24,000
  • Research assistant - 18,000
  • Transcripts, coding - 3,000

The budget seeks to cover the salaries of at least two researchers on a full-time basis. It reserves enough time for all the relevant stages of research, including the development of the “case studies” that will be used for comparison. We will also aim to work with other organizations to conduct research, especially in India.


Address the impact and relevance to the Wikimedia projects, including the degree to which the research will address the 2030 Wikimedia Strategic Direction and/or support the work of Wikimedia user groups, affiliates, and developer communities. If your work relates to knowledge gaps, please directly relate it to the knowledge gaps taxonomy.

We believe our research may yield important insights for Wikimedia's future, insofar as it would be third-party, independent research into internal practices. In particular, our proposal—seeking to understand how different Wikipedia communities deal with the challenge of identifying trusted sources on controversial issues—will serve to strengthen Wikipedia as a movement by highlighting differences and shared values, one of which is the trust users place in Wikipedia as a source of information.


Plans for dissemination.

CELE plans to disseminate its research widely in its primary region of influence (Latin America), within the regional Wikimedia movement, and more broadly within the local digital rights movement. Furthermore, CELE participates in fora such as the World Movement for Democracy, the Carnegie Digital Democracy Network, and the GNI. We regularly engage freedom of expression rapporteurs at the UN, the OAS, and the OSCE.

Past Contributions

Prior contributions to related academic and/or research projects and/or the Wikimedia and free culture communities. If you do not have prior experience, please explain your planned contributions.

While CELE’s main research focuses on Internet regulation from a legal standpoint, we are currently researching Latin America’s digital rights movement through a social movement approach. Our initial findings suggest that many of the activists and organizations currently forming the movement in the region were drawn to these issues by the free culture communities that developed in Latin America at the turn of the century. Our proposed research would underscore this social movement dimension and the importance of recovering the value of actual communities, which embody pluralism, in addressing content-moderation challenges.

I agree to license the information I entered in this form, excluding the pronouns, countries of residence, and email addresses, under the terms of Creative Commons Attribution-ShareAlike 4.0. I understand that the decision to fund this Research Fund application, and the application itself along with all the information entered by me in this form, excluding the pronouns, countries of residence, and email addresses of the personnel, will be published on Wikimedia Foundation Funds pages on Meta-Wiki and will be made available to the public in perpetuity. To make the results of your research actionable and reusable by the Wikimedia volunteer communities, affiliates, and Foundation, I agree that any output of my research will comply with the WMF Open Access Policy. I also confirm that I have read the privacy statement and agree to abide by the WMF Friendly Space Policy and Universal Code of Conduct.