
From Meta, a Wikimedia project coordination wiki
Disinformation, Wikimedia and Alternative Content Moderation Models: Possibilities and Challenges (2022-23)
Start and end dates: July 2023 – July 2024
Budget (USD): 47,147
Fiscal year: 2022-23
Applicant(s): Ramiro Álvarez Ugarte


Research proposal


Description of the proposed project, including aims and approach. Be sure to clearly state the problem, why it is important, why previous approaches (if any) have been insufficient, and your methods to address it.

This research proposal seeks to answer a narrow question: how do Wikipedians identify trustworthy sources when discussing controversial Wikipedia articles? The issue is relevant to addressing the “epistemic crisis” currently threatening democracies around the globe (Benkler 2018, chap. 1). The community-led moderation model championed by Wikipedia competes with the algorithmic and automated models promoted by social media platforms. The two models are ultimately based on different conceptions of what the Internet is and should be (Lessig 2006, chap. 6).

Understanding how Wikipedia deals with disinformation around polarized issues may yield relevant insights into the strengths and challenges of a moderation model grounded in a robust conception of community, as opposed to models driven by advertising and profit.

Our approach builds on previous research on Wikipedia’s governance (de Laat 2012; Grimmelmann 2015; R. Stuart Geiger and Halfaker 2016; Caplan 2021). While that research has so far studied Wikipedia’s policies and procedures (McDowell and Vetter 2020), its communal ethos (Konieczny 2009), and its long history (Rijshouwer 2019), we propose to contribute to this line of work by producing three new case studies focused on important events in Latin America that were subjected to the all-too-familiar logics of polarization. In particular, we want to study the following Wikipedia entries:

- The 2019 political crisis in Bolivia.

- The death of prosecutor Alberto Nisman in Argentina (embedded within his biographical entry).

- The Portuguese entry on Operação Lava Jato.

We propose to study how Wikipedians validated sources and resolved disputes around these three articles. Our goal is to better understand how reputable sources were identified, what sort of controversies emerged, and how they were resolved. We propose a trace ethnography methodology (Rijshouwer 2019, 40) that will examine how these articles evolved and the kinds of discussions they generated among the Wikipedians who proposed edits. Furthermore, we want to understand how editors understand their role, how they fight the polarization that affects their communities, and what strategies they develop to deal with the tensions that naturally ensue. Our hypothesis is that commitment to well-established community norms (such as the need to use trustworthy sources and to enforce the neutral point of view) may be an effective antidote to disinformation under this moderation model.


  • Agustina del Campo, Director of CELE
  • Josefina de las Carreras, Chief of Staff and Communications Director at CELE


Approximate amount requested in USD.

47,147 USD

Budget Description

Briefly describe what you expect to spend money on (specific budgets and details are not necessary at this time).

The budget seeks to cover the salary of one full-time researcher, working under the direction of a senior researcher. It also includes funds reserved for the dissemination activities described below.


Address the impact and relevance to the Wikimedia projects, including the degree to which the research will address the 2030 Wikimedia Strategic Direction and/or support the work of Wikimedia user groups, affiliates, and developer communities. If your work relates to knowledge gaps, please directly relate it to the knowledge gaps taxonomy.

We believe our research contributes to the Wikimedia 2030 Movement Strategy Recommendations, in particular to the goals of (a) improving user experience, (b) identifying topics for impact, and (c) ensuring equity in decision-making. We also believe our research fills knowledge gaps, in particular regarding how Wikipedians in Latin America deal with verifiability and neutrality in a context of polarization. The research would document best practices useful both for Latin American editors and for other players within the Internet governance ecosystem. Beyond Wikipedia, the research could also have a broader impact on disinformation regulatory initiatives in the region and in the three countries studied.


Plans for dissemination.

1. Within the Wikipedia community. We propose to present and discuss our findings with Latin American user groups.

2. LatAm digital rights movement. We will share findings with Latin American networks working on digital rights, such as the AlSur consortium. We will also share findings through CELE's engagement activities with LatAm legislators and through CELE's yearly regional workshop.

3. Globally. We plan to share findings in global fora, including the IGF, RightsCon, GNI, the IACHR, and the Paris Peace Forum.

Past Contributions

Prior contributions to related academic and/or research projects and/or the Wikimedia and free culture communities. If you do not have prior experience, please explain your planned contributions.

CELE's main research focuses on Internet regulation. We have worked on disinformation because it has been a pressing issue in Latin America, especially in recent years. We have produced a report on platforms' reactions to the disinformation scare, a conceptual paper discussing missing links in the global conversation about the issue, and a paper discussing policies dealing with disinformation in the context of the COVID-19 pandemic.

I agree to license the information I entered in this form, excluding the pronouns, countries of residence, and email addresses, under the terms of Creative Commons Attribution-ShareAlike 4.0. I understand that the decision to fund this Research Fund application, and the application itself along with all the information entered by me in this form, excluding the pronouns, countries of residence, and email addresses of the personnel, will be published on Wikimedia Foundation Funds pages on Meta-Wiki and will be made available to the public in perpetuity. To make the results of my research actionable and reusable by the Wikimedia volunteer communities, affiliates, and Foundation, I agree that any output of my research will comply with the WMF Open Access Policy. I also confirm that I have read the privacy statement and agree to abide by the WMF Friendly Space Policy and Universal Code of Conduct.