The main goal is to generate and assess actionable metrics for source controversiality in Wikipedia. To guarantee universality (i.e. applicability to all Wikipedia language editions) and knowledge equity, and to avoid dependence on the specifics of any given language, we will rely solely on language-agnostic approaches, using mainly data from editing activity. The project will build upon existing work of the Contropedia project, which already demonstrated the potential of language-agnostic approaches to measure the controversiality of wikilinks in a given article, and develop similar techniques to approximate the controversiality of a given source across multiple articles and Wikipedia language editions.
Wikimedia's Strategy 2030 report identified misinformation and disinformation as threats to the Wikimedia movement's goal of making free knowledge available to all. Specifically, "Wikimedia projects are vulnerable to government, political, cultural, or profit-driven censorship and misinformation campaigns, as well as outright falsified content". Furthermore, there is evidence that identifying non-reliable sources is an effective tool to combat disinformation and increase the knowledge integrity of Wikipedia. In this sense, this project will combine the knowledge and community effort stored in the few language editions that already have a perennial sources list with the interactions between community members, through edits and reverts around references, to generate measurements of source controversiality. This will provide information about source credibility to editors as well as readers, and support the generation of perennial source lists in many other language editions.
In contrast to Contropedia, this project will calibrate and assess the quality of the developed metrics by comparing them with the already existing Reliable/Perennial sources lists in the nine language editions where such a list exists (at the time this project proposal is written).
To be able to deliver an actionable first proof-of-concept prototype, we will start by focusing only on articles related to the topic of Climate Change. This choice is motivated by recent reports by BBC News (Dec 2021) of knowledge integrity issues on non-English Wikipedias, ranging from neglect (science that is out of date) to a lack of balance to active disinformation. This issue has caught the attention of several movement volunteers and WMF staff, providing abundant data and contextual qualitative knowledge. Once the approach has proven useful for articles on this topic, we will seek support to scale it up to the entire set of articles across the more than 300 language editions of Wikipedia.
The project consists of 3 phases:
- Phase 1: Data preparation
We will build a comprehensive set of articles related to the chosen topic in the existing language editions of Wikipedia.
To identify the subset of Wikipedia articles relevant to climate change, we will follow a methodology similar to the one used in Markusson et al. (2016) for articles related to geoengineering. The idea is to start at the Category:Climate change page in the English Wikipedia, and then expand, and manually prune, subcategories up to a certain depth. Additionally, we will contrast and extend the obtained list of articles with this community-curated list.
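The category expansion step can be sketched as a bounded breadth-first traversal of the category graph. In the sketch below, the `subcats` mapping stands in for responses from the MediaWiki API (`list=categorymembers` with `cmtype=subcat`); the function name, the toy graph, and the depth limit are illustrative assumptions, not part of the proposal:

```python
from collections import deque

def expand_categories(root, subcats, max_depth=3, pruned=()):
    """Breadth-first expansion of the category graph up to max_depth,
    skipping manually pruned categories.

    subcats maps a category title to its list of subcategory titles
    (in practice retrieved from the MediaWiki API categorymembers endpoint).
    """
    seen, queue, order = {root}, deque([(root, 0)]), []
    while queue:
        cat, depth = queue.popleft()
        order.append(cat)
        if depth >= max_depth:
            continue
        for sub in subcats.get(cat, []):
            if sub not in seen and sub not in pruned:
                seen.add(sub)
                queue.append((sub, depth + 1))
    return order

# Toy category graph standing in for API responses.
graph = {
    "Category:Climate change": ["Category:Climate change mitigation",
                                "Category:Climate change denial"],
    "Category:Climate change denial": ["Category:Climate change conspiracy theories"],
}
cats = expand_categories("Category:Climate change", graph, max_depth=1)
```

The manual-pruning step from the methodology corresponds to populating the `pruned` set after inspecting each new layer of subcategories.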
Once the set of articles relevant to the topic is identified, we will find their corresponding versions in other language editions through inter-wiki links and retrieve their complete edit histories. Finally, all the references appearing in the revisions of these Wikipedia articles will be collected and standardised. In particular, we will identify the source domain of every reference link.
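The standardisation step can be approximated by reducing every reference link to its host name. A minimal sketch using only the Python standard library (the `www.` stripping is a simplifying assumption; a production pipeline would more likely use a public-suffix-aware library):

```python
from urllib.parse import urlparse

def source_domain(url: str) -> str:
    """Normalise a reference link to its source domain
    (lower-case host, no credentials, no port, no leading 'www.')."""
    netloc = urlparse(url.strip()).netloc.lower()
    host = netloc.rsplit("@", 1)[-1].split(":", 1)[0]
    return host[4:] if host.startswith("www.") else host

print(source_domain("https://www.bbc.co.uk/news/some-article"))
# prints bbc.co.uk
```

Grouping references by this normalised domain is what allows per-reference measurements to be aggregated into per-source measurements across articles and language editions.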
- Phase 2: Metric development & Evaluation
Using the data collected in Phase 1, and based upon the definition of controversiality used in the Contropedia project (see Borra et al., 2015), we will develop metrics of controversiality, first related to a specific reference and then aggregated to the corresponding source domains. Alternatively, the possibility of directly developing controversiality metrics for the source domains will also be explored.
The approach of Borra et al. (2015) is based on counting substantial disagreeing edits (i.e., edits which involve some deletion of text but are not marked as vandalism) of sentences which contain the element under study (a reference, in our case). This approach is only a starting point that will serve as a baseline and will be adapted to reach better results in the evaluation phase.
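Assuming each revision has already been annotated with the sentences it deletes and a vandalism flag (both hypothetical pre-computed fields, not an existing API), the baseline count can be sketched as:

```python
def reference_controversiality(revisions, ref_url):
    """Baseline controversiality of one reference: the number of substantial
    disagreeing edits, i.e. non-vandalism revisions that delete at least one
    sentence containing the reference."""
    return sum(
        1
        for rev in revisions
        if not rev["is_vandalism"]
        and any(ref_url in sentence for sentence in rev["deleted_sentences"])
    )

# Toy edit history: two edits delete the referenced sentence, one is vandalism.
history = [
    {"is_vandalism": False, "deleted_sentences": ["Claim X.<ref>example.org/a</ref>"]},
    {"is_vandalism": True,  "deleted_sentences": ["Claim X.<ref>example.org/a</ref>"]},
    {"is_vandalism": False, "deleted_sentences": []},
]
score = reference_controversiality(history, "example.org/a")
```

Aggregation to a source domain would then, for example, sum or average these per-reference counts across all articles and language editions in which the domain appears.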
To be able to evaluate the metrics against the categories used in the lists of Reliable/Perennial source domains, we will map them into at least three categories (generally reliable, generally unreliable, undecided), with an optional separation into more categories. For this mapping we will explore both simple, straightforward discretizations of the controversiality metrics and explainable machine learning algorithms that predict the categories using different variants of the controversiality metrics and other language-agnostic descriptors as input features.
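The simplest discretization is a two-threshold cut on a normalised controversiality score. The threshold values below are placeholders to be calibrated against the existing Reliable/Perennial source lists, not proposed defaults:

```python
def categorise(score, lo=0.2, hi=0.6):
    """Map a controversiality score in [0, 1] to a perennial-sources-style
    label. The thresholds lo and hi are hypothetical and would be fitted
    to the nine existing Reliable/Perennial source lists."""
    if score < lo:
        return "generally reliable"
    if score > hi:
        return "generally unreliable"
    return "undecided"
```

The explainable machine-learning alternative mentioned above would replace these fixed cuts with, for instance, a shallow decision tree over several metric variants, whose learned split points remain directly inspectable by editors.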
We will perform several iterations of metric design and evaluation, focusing each time on how to improve upon the misclassified source categories, and also comparing with the scores from the https://iffy.news/ project, when available, and from Media Bias Monitor and Media Bias/Fact Check, as used in a recent study (Yang and Colavizza, 2022) on potential biases of news sources on Wikipedia.
Finally, to avoid overfitting, the project will also verify its results and the underlying processing pipeline on articles related to another set of topics not directly connected to climate change. We plan to focus on articles related to vaccine hesitancy, starting from the Vaccine hesitancy category page: a timely topic given the recent controversies around the COVID-19 vaccines.
- Phase 3: Building of a prototype
Finally, we will develop a proof-of-concept prototype capable of extracting the developed metrics for the sources of a given set of input articles.
- Markusson, N., Venturini, T., Laniado, D., & Kaltenbrunner, A. (2016). Contrasting medium and genre on Wikipedia to open up the dominating definition and classification of geoengineering. Big Data & Society, 3(2), 2053951716666102.
- Borra, E., Weltevrede, E., Ciuccarelli, P., Kaltenbrunner, A., Laniado, D., Magni, G., Mauri, M., Rogers, R., & Venturini, T. (2015). Societal controversies in Wikipedia articles. In Proceedings of the 33rd annual ACM conference on human factors in computing systems - CHI 2015, (pp. 193-196).
- Yang, P., & Colavizza, G. (2022). Polarization and reliability of news sources in Wikipedia. arXiv preprint arXiv:2210.16065.