Wikimedia Foundation/Legal/Community Resilience and Sustainability/Trust and Safety/2024 Election Anti Disinformation Report
Executive Summary
This report is a summary of the observations and recommendations from the Disinformation Response Teams (DRTs) set up by Trust & Safety to support three key elections in calendar year 2024.
Context
2024 was a milestone year for elections across the globe. Nearly 64 countries were scheduled to go to the polls in 2024, with more than 4 billion people eligible to vote. At the same time, mis- and disinformation targeting these elections was identified as one of the biggest challenges, especially for platforms operating in the information ecosystem.
Taking into account the disinformation risks, the impact of information manipulation on the safety of the Wikimedia movement, and the number of people going to the polls, the Trust & Safety team decided to support anti-disinformation work around three major elections in 2024. These were:
- the Indian parliamentary election in April-May,
- the European Parliament election in June, and
- the US election in November.
Three separate Disinformation Response Teams (DRTs) were set up to support the communities and anti-disinformation work before and during these elections. Each DRT was composed of interested volunteers from the affected communities, as well as members of the T&S Disinformation and Human Rights teams. The DRTs met regularly to discuss both preventative measures and any necessary responses to disinformation on the projects, drawing on expertise from across the Foundation to support efforts where needed.
Below is the summary of the observations and recommendations from the DRTs.
Observations
Minimal on-wiki disinformation, lots of low-level vandalism
Across all three elections, the DRTs saw minimal disinformation activity in 2024. Higher awareness of election disinformation, pre-election investigations by T&S, better preparedness, and robust community processes meant that on-wiki disinformation attempts were not only limited but also quickly identified and remedied.
The DRTs did experience an uptick in hit-and-run vandalism across the relevant projects, where articles were vandalised by IP editors or throwaway accounts, then screenshotted and shared on other platforms, including communications channels like WhatsApp and Telegram. These edits were quickly reverted[1] and the accounts blocked by administrators. However, this trend indicates a growing interest among bad actors in exploiting Wikipedia's reputation for reliability to spread disinformation.
Disinformation deterrents: community maturity and diversity
The DRT work around the 2024 elections centered on large, established Wikipedia communities, enabling us to see how mature community processes and community composition served as deterrents to disinformation. These communities all had highly developed rules and systems, with efficient and effective enforcement mechanisms. This means that, for a disinformation spreader to be successful on these Wikipedias, they must engage with sophisticated community processes, which in turn makes the cost of doing business high compared to, say, spreading disinformation via social media platforms.
Communities served by the DRTs were also quite diverse: the editors are not only based in multiple countries and speak different languages, but also come from different socio-cultural backgrounds. We sought to represent that same diversity within the DRTs. We found that varying on-the-ground perspectives allowed group members to better foresee who might be likely to spread disinformation, on what topics, and with what agenda, and to more swiftly identify it when it arose.
Timely and bi-directional sharing of information
As we observed with previous iterations of pre-election work (in FY 2021-22 and 22-23), there is value in sharing information and insights early and often in the lead-up to an election. This sharing often includes common and recent disinformation trends, on-wiki activity related to relevant articles, and off-wiki insights on social and political issues. As such, a key function of the DRTs was enabling the constant flow of information between the communities and the Foundation. DRTs shared insights, questions, concerns, and approaches early and often, starting at least 4-6 weeks prior to the election date. This helped ensure that community members involved in election-focused anti-disinformation work were aware of and prepared for the most relevant risks, and that the Foundation was appropriately prepared to support these communities. All three DRTs continued their sharing sessions throughout the election cycle, allowing us to reassess priorities based on the actual threats emerging in real time.
Disinformation about Wikipedia on other platforms is the big challenge
Throughout the year, and more pronounced around the elections, we observed attacks on Wikipedia as an institution. Across traditional and social media, we saw an increase in disinformation activity targeting Wikipedia, its editors, and the Wikimedia Foundation. This is not entirely surprising, given that disinformation actors are unlikely to give up when their on-wiki efforts are thwarted. Campaigns are highly likely to change their tactics, focusing on affective messaging[2] to smear Wikipedia as biased against them or in favour of their political opponents.
Recommendations
Drawing from the observations above, T&S makes the following recommendations:
Invest in underserved communities
T&S Disinformation has focused on the most obvious targets of disinformation activity on our projects. These are mostly among the larger projects, which also have robust processes and a diverse pool of editors. There is an established history of ongoing collaboration (on disinformation and other issues) with some of these larger communities. As such, they have outsized access to insights, tools, and resources helpful in anti-disinformation work.
The same cannot be said of some medium-sized projects facing an equal or higher threat of disinformation. It would therefore be prudent and timely to start investing a portion of our resources in working with these communities. This could begin with supporting elections that receive less global attention, and expand to more regular work with the goal of building an ongoing partnership.
The anti-Wikipedia disinformation fight
As the focus of disinformation actors has shifted from prioritizing on-wiki manipulation to targeting Wikipedia in disinformation and smear campaigns on other platforms, we will need to adapt our response. It is highly likely that state and non-state actors will attempt to paint Wikipedia and the broader Wikimedia movement as deliberately biased, and by extension, unreliable. These campaigns will serve two broad goals: lowering public trust by discrediting Wikipedia, and asserting control over the neutral and independent information ecosystem that exists on Wikipedia.
T&S Disinformation should continue to organize a collaborative effort, uniting affiliates, online editors, and relevant Foundation teams (such as Legal, External Communications, Global Advocacy, and the Office of the CEO), to develop new responses to this changing dynamic.
Adjusting the scope of the Foundation's anti-disinformation work beyond elections
Elections remain a focal point of disinformation activity, and our response matters, both from a regulatory perspective and in making sure that Wikimedia projects are not used to enable information manipulation around elections. However, disinformation campaigns do not focus on elections alone: Wikimedia volunteers are actively combating disinformation on the projects every day, and the Foundation should seek to expand its capacities in other subject areas as well.
Some of the more persistent and regular issues of our time, such as geopolitical conflict, climate change, and local domestic issues, will continue to demand an ever-increasing share of our resources. As such, it might be prudent to increase the Foundation's focus on these topics and plan for scenarios where a large share of anti-disinformation work is focused on non-election issues.
References and Notes
- ↑ Research:Wikipedia during 2024 Elections
- ↑ Martel, C., Pennycook, G., & Rand, D. G. (2020). Reliance on emotion promotes belief in fake news. Cognitive Research 5, 47.