Grants:Project/Ocaasi/Misinformation And Its Discontents: Narrative Recommendations on Wikipedia's Vulnerabilities and Resilience

Misinformation And Its Discontents: Narrative Recommendations on Wikipedia's Vulnerabilities and Resilience
Status: withdrawn
Summary: To prevent disinformation on Wikipedia, we need to explore the current understanding, community practices, external expertise, and potential interventions that can address it.
Target: Large and small Wikipedias from the Global North and Global South
Amount: $30,000 USD
Grantee: Ocaasi
Contact: jorlowitz(_AT_)gmail.com
Created on: 21:09, 17 February 2020 (UTC)


Project idea

What is the problem you're trying to solve?

Disinformation is an urgent topic of study and concern, sadly because it is eroding our civil society and trust in media through viral activity on massive social networks. Misinformation [1] is merely wrong information, and may or may not be intentional. Disinformation [2], however, is intentionally wrong--and also harmful.

On Wikipedia, disinformation can come from good-faith editors who (mis)use sources that are themselves (dis)information, single-purpose accounts that seek to skew and bias articles, paid editors with a financial interest in promoting client interests [3], coordinated groups that look to "brigade" topics where they have an agenda, and state actors, where governments seek to undermine political dissent or insert propaganda.

When it comes to this massive existential problem, we don't know its scope and we haven't identified the range of solutions. We haven't spoken with enough experts, or synthesized their wisdom and guidance. We have yet to distill community knowledge and connect people working on the same or similar issues from different perspectives. We are at the frontier of a disturbing trend in our digital information age, and we are working blind. We don't know the nature of the problem, we don't have a roadmap for the fixes, and we don't have a story that ties them together.

Wikipedia is widely considered a generally reliable source [4]. It maintains this reputation despite a long history [5] of edit wars at controversial articles [6]. In many areas the encyclopedia has demonstrated remarkable resilience against attempts to introduce falsehoods and bias. In other cases, however, even simple hoaxes [7] expose the gaps in our defenses and their broader negative impact ([8][9][10]).

Considering the wholesale havoc which misinformation and disinformation has wrought on sites such as Youtube ( [11] [12] [13] [14] [15]), Facebook ([16] [17] [18] [19] [20] [21]), Twitter ([22] [23] [24] [25] [26]), and WhatsApp ( [27] [28] [29] [30] [31]), Wikipedia may even be in a relatively enviable position. At the same time, an encyclopedia holds itself to a higher standard of reliability than other social web properties--and perception of trustworthiness is arguably more important for a project that prides itself on good information.

As information spreads with increasing speed through social networks, as news outlets struggle to combat lies and propaganda, and as political and government attempts to bias the information landscape become more widespread and complex, how do we as a neutral encyclopedia address the growing threat of disinformation on our own projects?

There are troubling and recent examples: a made-up Nazi death camp [32], inserting pro-gun bias [33], whitewashing celebrity scandals [34], pro-Iranian networks [35], coronavirus disinformation [36], disappearing concentration camps [37], fake chemical plant explosions [38], shifting blame for missile attacks on planes [39], state propaganda [40], hoaxes about famous accessories [41], false rap album release dates [42], cryptocurrency lies [43], government censorship [44], and historical revisionism [45].

Whatever defenses we have amassed, we are still under constant attack.

Larger Wikipedias are likely doing better than most in robustness and resilience against misinformation merely due to the size of their active contributor base. Yet even our larger wikis are not immune from nefarious attempts to bias them, and the vulnerability of smaller wikis with many fewer editors, patrollers, and tools is presumably far greater.

And for highly sophisticated disinformation campaigns [46] by paid hackers or intelligence agencies funded by foreign governments, we are not yet in a position to say whether there have been serious attempts or, if so, at what scale. Our vulnerabilities may be overstated [47], or we could be dangerously blind to what is happening already and what is coming. The nature of Wikipedia's anonymous or pseudonymous contributors could allow a well-funded, targeted effort to develop trusted users who could be influential, or to compromise existing trusted users (e.g. admins). We need to do deeper study, because we don't know what we don't know, and covert, sophisticated campaigns would be, well, covert and sophisticated.

What is your solution to this problem?

This will be an investigative exploration, a narrative project with the goal of practical recommendations for implementation and expansion. The focus is on these core questions:

  • What policies, practices, and tools make some Wikipedias more robust against disinformation?
  • How are smaller or less active Wikipedias and Wiki communities more vulnerable to disinformation?
  • What trainings and tools would improve our resilience to disinformation attempts?

First, we must understand what disinformation means and how it operates. Then we must learn from the community members whose regular practice involves fighting disinformation (a preliminary list of interviewees can be provided to the grants committee to protect confidentiality). We must look to Foundation staff and outward to external experts who are actively planning ways to combat disinformation. Finally, we must compile, analyze, and recommend actionable interventions.

Methodology

I will have rich conversations and see where they lead, connecting threads across different people and subjects while working toward a useful end product of high-impact recommendations. The conversations are geared toward that goal: rather than meandering, they aim to zero in on the interventions that can make the most difference.

The selection of interviewees depends on a selective network crawl. In this method, an initial list of experts with demonstrated public experience studying the subject is asked to recommend other editors to speak with who have similar projects or interests. Those editors can in turn point to peers they know of who are working on the same challenges. In effect, this creates a functional network, or web, of community leaders who may not know each other personally but are connected by shared challenges and varying approaches to a common problem.

One potential drawback of a community network crawl is that you can hit "eddies" where the same people all know of each other; you can get stuck in a familiar pool. To remedy this, I have seeded the potential interviewee list with a variety of starting points that vary by gender and region. The preliminary list of interviewees already represents multiple continents, projects, genders, and languages. Those are the main data points I will collect to approach diversity: region, wiki, gender identification, and language.
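
As a rough illustration of this method (not a formal part of the proposal), the crawl can be treated as a breadth-first expansion from a deliberately diverse seed list. The sketch below is a minimal Python version under that assumption; the editor names and the referral function are hypothetical placeholders, not actual interviewees.

```python
from collections import deque

def snowball_interview_list(seed_experts, ask_for_referrals, max_interviews=50):
    """Breadth-first 'network crawl' over interviewee referrals.

    seed_experts      -- starting interviewees, deliberately varied by region,
                         wiki, gender, and language to avoid 'eddies'
    ask_for_referrals -- callable returning the peers a person recommends
                         (a stand-in here for an actual interview)
    max_interviews    -- stop once the target interview count is reached
    """
    queue = deque(seed_experts)
    selected = []
    seen = set(seed_experts)

    while queue and len(selected) < max_interviews:
        person = queue.popleft()
        selected.append(person)
        for peer in ask_for_referrals(person):
            if peer not in seen:  # skip people already in the pool
                seen.add(peer)
                queue.append(peer)
    return selected

# Illustrative usage with placeholder names and referrals.
referrals = {
    "editor_a": ["editor_c", "editor_d"],
    "editor_b": ["editor_d", "editor_e"],
}
print(snowball_interview_list(
    ["editor_a", "editor_b"],
    lambda person: referrals.get(person, []),
    max_interviews=5,
))
```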

Specifically, I will target active editors who do the most patrolling, editors who work on controversial subjects, editors from the Global South, and editors who have experience combating disinformation. Of particular interest are communities that are more, or less, successful at addressing disinformation relative to the size of their active editor base. These are the people most affected by disinformation and best positioned to expose and address it.

I will create a Meta project page about disinformation where I will collect learnings and resources. I will host and publish the landscape review, intervention menu, and actionable recommendations on that page. There will be a place for people to sign up for updates and to contribute to the broader initiative of finding solutions for disinformation. I will leverage social media to disseminate findings widely through blog posts, Facebook groups, and mailing list posts, and, safety permitting, through attendance at community conferences (in person or virtually).

Targeted solutions range from appropriate policies to citation tagging to community noticeboards to project trainings. One of the fundamental outcomes of this project is to identify different approaches to fighting disinformation, speak with those pioneering the approaches, contextualize where each approach would have the most value, and work to create a vision for how the approaches can work together to address disinformation.

Which of these solutions are most effective? How are they currently employed, on which projects, and by whom? A major goal of this project is to explore each area of intervention to determine its usefulness and its relation to other approaches. The best way to do this is direct conversation with the people leading these efforts.

SOLUTIONS TO INVESTIGATE FURTHER

Wikimedia Legal Terms of Use: Our projects need clear prohibitions on the intentional misuse of information. This begins with the Wikimedia Foundation's Terms of Use, which state that "Engaging in False Statements, Impersonation, or Fraud" is a violation [48]. Second, the Foundation implemented a conflict-of-interest policy that requires disclosure of "your employer, client, and affiliation with respect to any contribution for which you receive, or expect to receive, compensation" [49]. To complicate matters, however, national legislative attempts to make disinformation illegal could actually endanger full and accurate coverage of controversial subjects.

Wikimedia Foundation Research: Disinformation is already an active area of research at the Wikimedia Foundation. The Wikipedia Library program, Wikimedia Foundation Research, the Legal and Public Policy team, and the Wikicite initiative are each involved in addressing components of disinformation. An initial dive has been done through a literature review [50] of disinformation. There is a broader plan to address and improve Knowledge Integrity [51]. We now know more about how social media and Wikipedia articles interact [52], how readers use (or don't use) citations [53], which citations are being added and removed [54], and how to detect sockpuppets [55].

Outside Expertise: Outside the Wikimedia movement, numerous organizations have formed or refocused to understand and limit the harm of disinformation. Groups like the Credibility Coalition [56], MediaWell [57], MisinfoCon [58], Meedan [59], Demos [60], the Hewlett Foundation [61], and the Data & Society Research Institute [62] are among dozens looking to tackle this sprawling and complex problem. They are producing their own useful research on various dimensions of the subject ([63][64][65][66][67][68][69][70][71]). There has also been a proliferation of fact-checking efforts ([72][73][74][75][76][77][78][79][80][81]). In a tertiary way, Wikipedia is one of them.

Active Editors, Diversity, and Health: Having a sufficient number of informed editors, from a variety of backgrounds, in a civil editing environment may be the most effective bulwark against disinformation. Larger Wikipedias have more capacity; more diverse Wikipedias have more willpower and resistance against bias; and lower-harassment communities invite more people to contribute positively. These fundamental community components are essential to preventing problems such as a national narrative becoming the narrative of an entire language version of Wikipedia. Creating an informed, large, diverse, and healthy community may ultimately be more important than any specific or sophisticated disinformation intervention.

Editing Policy: Inside our projects, communities benefit from clear principles that outline the correct ways to handle information and verification. This includes strong consensus on Neutral Point of View [82], Verifiability [83], Reliable Sources [84], Advocacy [85], and Conflict of Interest [86]. Strong communities also have forums to help apply these principles such as English Wikipedia's Reliable Sources Noticeboard [87], Conflict of Interest Noticeboard [88], and WikiProject Reliability [89].

Capacity and Training: No community arises fully formed, and there is a large role to play in basic capacity development and training. The Community Capacity team [90] at the Foundation has developed basic and useful materials [91] that give editors the understanding they need to apply core principles and policies of neutrality and verifiability. The Community Development team is building "The Learning Platform" to help spread these lessons in an engaging format. Top priority among the modules is basic information literacy. Many editors have simply never encountered concepts relating to finding, evaluating, and paraphrasing sources. Of note, there is a tension between communities more accustomed to "orality" and the heavily text-biased nature of Wikipedia--this has systemic bias implications as well.

Citation Categorization: One of the most promising areas for fighting disinformation is better classifying, flagging, and labeling which sources are likely reliable or likely to contain disinformation. While many community members have advanced knowledge of reliable sources, globally there is no way for any one person or project to evaluate all possible sources. We could develop a global news index with ratings by category, perhaps using Wikicite and Wikidata. We could invite librarians or other information professionals to help us in this ranking process. We could rely on outside organizations' indexes [92]. There are already some impressive yet incomplete attempts at distilling citation reliability knowledge ([93][94][95][96][97][98][99][100]). These are not shared across all projects, however, and none of them are globally representative. An exciting prototype tool [101] could expose these rankings or labels to all readers as an aid for citation literacy.
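
To make this concrete, here is a minimal sketch of how a shared reliability index might be consumed by a tool. It assumes a simple domain-to-label mapping; the domains, labels, and the rate_citation helper are hypothetical illustrations, not an existing dataset or API.

```python
from urllib.parse import urlparse

# Hypothetical, hand-maintained slice of a shared reliability index.
# A real index might live in Wikidata/Wikicite and carry per-topic ratings.
RELIABILITY_INDEX = {
    "example-journal.org": "generally reliable",
    "example-tabloid.com": "generally unreliable",
    "example-stateoutlet.example": "deprecated",
}

def rate_citation(url, index=RELIABILITY_INDEX):
    """Return the consensus label for a cited URL, or 'unrated' if unknown."""
    domain = urlparse(url).netloc.lower()
    if domain.startswith("www."):
        domain = domain[4:]
    return index.get(domain, "unrated")

print(rate_citation("https://www.example-tabloid.com/story/12345"))
# -> 'generally unreliable'
```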

Citation Access: Being able to read citations is a precursor to using good sources, as many of them are unfortunately locked behind paywalls. Projects like The Wikipedia Library [102] give editors more tools to find and use reliable scholarly and academic sources. Partnerships with organizations like the Internet Archive [103], meanwhile, give readers a better chance of being able to look up and verify information in citations. We know from basic study that more open citations are clicked on more often; access is a precondition for dispelling misinformation with sound research.
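
As a small illustration of how archived copies of citations can be surfaced programmatically, the sketch below queries the Internet Archive's public Wayback Machine availability endpoint for a cited URL. The example URL is a placeholder, and error handling is omitted for brevity.

```python
import json
from urllib.parse import urlencode
from urllib.request import urlopen

def find_archived_copy(cite_url):
    """Look up the closest Wayback Machine snapshot for a cited URL.

    Uses the public availability endpoint (archive.org/wayback/available);
    returns the snapshot URL, or None if no archived copy is reported.
    """
    query = urlencode({"url": cite_url})
    with urlopen(f"https://archive.org/wayback/available?{query}") as resp:
        data = json.load(resp)
    closest = data.get("archived_snapshots", {}).get("closest", {})
    return closest.get("url") if closest.get("available") else None

# Placeholder citation URL for illustration only.
print(find_archived_copy("https://example.com/retracted-news-story"))
```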

Algorithmic Assistance: The initial conclusion of the Foundation's literature review on disinformation was that our best opportunity is to help article patrollers [104] through machine-assisted edit scoring--to more efficiently and effectively identify and review potential disinformation. Algorithms have also been developed that help identify which statements likely need a citation [105], and these may integrate nicely with tools that help editors fix those statements [106].
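
For illustration, the sketch below asks the ORES v3 scoring endpoint (the service current at the time of this proposal) for the "damaging" and "goodfaith" probabilities of a single revision, which is the kind of machine-assisted scoring patrollers could use to rank incoming edits. The revision ID is a placeholder, and the response layout assumed here is ORES's usual JSON shape.

```python
import json
from urllib.parse import urlencode
from urllib.request import urlopen

def score_revision(rev_id, wiki="enwiki"):
    """Fetch ORES 'damaging' and 'goodfaith' probabilities for one revision.

    A sketch assuming the ORES v3 scoring endpoint and its usual response
    layout; patrollers could sort incoming edits by the 'damaging' score.
    """
    query = urlencode({"models": "damaging|goodfaith", "revids": rev_id})
    with urlopen(f"https://ores.wikimedia.org/v3/scores/{wiki}/?{query}") as resp:
        data = json.load(resp)
    scores = data[wiki]["scores"][str(rev_id)]
    return {
        model: scores[model]["score"]["probability"]["true"]
        for model in ("damaging", "goodfaith")
        if "score" in scores.get(model, {})
    }

# 123456789 is a placeholder revision ID for illustration only.
print(score_revision(123456789))
```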

Project goals

I will do deep reading to provide a fuller picture of the problem of disinformation on Wikimedia projects: where it is most severe, and what factors currently make it less so. I will tie together multiple threads of media, studies, and community knowledge to create an enlightened narrative to guide discussions and future interventions. As a result, our community will be better informed and prepared to address a growing threat to our reputation and integrity.

I will investigate and distill broader expertise through targeted interviews with community members, staff, scholars, and organizations about disinformation. By deeply studying what the most experienced people on the subject know, I will capture the wisdom, innovations, and leadership residing inside and outside of our projects. As a result, we will highlight our community knowledge and transform that wisdom into actionable insights.

I will provide clear recommendations for most effectively addressing disinformation, with proposed pathways towards implementation. I will compare and connect different approaches and analyze which are appropriate in different circumstances and how they can work together. As a result, community and staff should be able to better envision, plan, and execute next steps for their interventions.

I will share knowledge gained from the project through blog posts and conference presentations. I will use social media to spread learnings and further invite commentary. As a result, more people will know about the findings and know they have contacts who can connect them to resources and other experts.

Project impact

How will you know if you have met your goals?

Goals, impacts (outcomes), and results (measurements):
  • Goal: Provide a fuller picture of the landscape
    Impact: Our community will be better informed and prepared to address a growing threat to our reputation and integrity
    Measurement: Report is written and published on Meta
  • Goal: Distill varied knowledge from community and experts
    Impact: Highlight our community's expertise and transform that knowledge into actionable insights
    Measurement: Number of interviews conducted: 25 by voice, 25 additional by text
  • Goal: Provide clear recommendations and a roadmap for interventions
    Impact: Community and staff should be able to better envision, plan, and execute next steps for their interventions
    Measurement: Report is written and published on Meta
  • Goal: Share knowledge gained
    Impact: More people will know about the findings and have key contacts who can connect them to resources and other experts
    Measurement: Blog post is written and 2 presentations are given at community or media conferences (virtually if necessary)

Do you have any goals around participation or content?

As this is an investigation and recommendations project, the only goals around participation would come at a future grant or project phase. The creation of a Meta portal, however, would provide for continued community engagement and improvement of content and resources, as well as a place for participants to sign up for updates and future collaboration or initiatives.

Project plan

Activities

Activities, products, and follow-up:
  • Activity: Intensive reading into academic and media papers, studies, and sources
    Product: Written report hosted on a Meta Wikimedia project page
    Follow-up: Share the landscape report on Meta and through community messaging channels, asking for commentary and suggestions
  • Activity: Deep interviews with leaders from larger and smaller Wikipedia communities, languages, and regions (English, Spanish, Filipino, Arabic, Chinese, Eastern European, African...), and with outside experts, through both voice and text questions and conversations
    Product: Written qualitative synthesis of interviews with high-level findings and key quotes, published on Meta
    Follow-up: Share findings with interviewees, offer to make connections to and between interviewees, and invite scholars to interact with our community
  • Activity: Synthesize recommendations for the best approaches to fighting disinformation
    Product: Written document hosted on a Meta Wikimedia project page
    Follow-up: Engage with Meta page project participants and alert the community about developments on messaging channels and social media groups
  • Activity: Share findings widely inside and outside of the community
    Product: Blog posts, social media engagement, and conference presentations
    Follow-up: Suggest a second round of conversations with community members and external experts

Budget

Project elements, cost (USD), hours, and timeline:
  • Disinformation landscape review: $3,000, 60 hours, September 1, 2020 to September 30, 2020
  • Interviews with Wikipedians (Global North, Global South, Stewards...): $9,000, 180 hours, October 1, 2020 to January 31, 2021
  • Interviews with Wikimedia Foundation staff: $3,000, 60 hours, October 1, 2020 to January 31, 2021
  • Interviews with external media experts: $3,000, 60 hours, October 1, 2020 to January 31, 2021
  • Qualitative synthesis of interviews: $3,000, 60 hours, February 1, 2021 to February 28, 2021
  • Review of potential interventions: $3,000, 60 hours, March 1, 2021 to March 31, 2021
  • Recommendations for tools and trainings: $3,500, 70 hours, April 1, 2021 to April 30, 2021
  • Project admin: $2,500, 100 hours, September 1, 2020 to March 31, 2021
  • TOTAL: $30,000, 650 hours, September 1, 2020 to April 30, 2021

note: This budget works out to approximately 2 dedicated, full days of work on this project per week over the course of 8 months.

Community engagement

I will create a Meta project page about disinformation where I collect learnings and resources. I will host and publish the landscape review, intervention menu, and actionable recommendations on that page. There will be a place for people to sign up for updates and to contribute to the broader initiative of finding solutions for disinformation.

I will target active editors who do the most patrolling, editors who work on controversial subjects, editors from the Global South, and editors who have experience combating disinformation. Of particular interest are communities that are more, or less, successful at addressing disinformation relative to the size of their active editor base. These are the people most affected by disinformation and best positioned to expose and address it.

I will leverage social media to disseminate findings widely through blog posts, Facebook groups, mailing list posts, and attendance at community and online information conferences.

Get involved

Participants

Jake Orlowitz (User:Ocaasi) founded The Wikipedia Library and ran it from 2011 to 2019. By the time he left the program at the Wikimedia Foundation, TWL had a half-million-dollar budget and a 6-person team on 4 continents. Through The Wikipedia Library, Jake developed partnerships with 70 leading scholarly publishers to provide free access to 100,000 scholarly journals and reference texts. 25,000 editors now have access to those sources through the Wikipedia Library Card Platform. Jake created the viral #1Lib1Ref and #1Bib1Ref citation campaigns, which now add 10,000-20,000 new references to Wikipedia each year from librarians around the world. He started the Wikipedia Visiting Scholars program, the Books & Bytes newsletter, the Wikipedia + Libraries Facebook group, the Wikimedia and Libraries User Group, and the @WikiLibrary Twitter account.

Jake negotiated the collaboration with Turnitin to fix copyright violations on Wikipedia, started collaboration with Internet Archive to rescue 10 million dead citation links, integrated OCLC ISBN citation data into Wikipedia's reference autogeneration interface, and began a project to add Citoid to Wikidata. He developed the OAbot web app, and is a founding member of the Open Scholarship Initiative. He co-released a dataset of Wikipedia's most cited sources and the proportion of free-to-read sources on Wikipedia. Jake created The Wikipedia Adventure interactive guided tutorial and facilitated the first-ever for-credit Wikipedia editing course at Stanford Medical School. He is an English Wikipedia Administrator, 2-time Wikimedia Foundation grantee, former Individual Engagement Grants Committee member, founding board member of Wiki Project Med Foundation, former Organizing Committee member for Wikicite, Linked Data 4 Libraries Program Committee member, and founder of the Wikimedia Foundation's Knowledge Integrity Program.

Jake has presented about Wikipedia, citations, and reliability at five Wikimanias, Stanford University, Internet Librarian, the American Library Association, OCLC, and IFLA. He is a primary author of "The Plain and Simple Conflict of Interest Guide", "Conflict of Interest editing on Wikipedia", "Librarypedia: The future of Libraries, and Wikipedia", the New Media Consortium Horizon Report for Libraries, "The Wikipedia Adventure: Field Evaluation", "Writing an open access encyclopedia in a closed access world", "The Wikipedia Library: The world's largest encyclopedia needs a digital library, and we are building it", "You're a researcher without a library: what do you do?", the Wikipedia "Research Help" portal, "Why Medical Schools Should Embrace Wikipedia", and the forthcoming Wikipedia @20 chapter "How Wikipedia Drove Professors Crazy, Made Me Sane, and Almost Saved the Internet." He has been interviewed by Publishers Weekly in "Discovery Happens Here" and by the Tow Journalism School for "Public Record Under Threat", and was featured in the documentary "Paywall: The Business of Scholarship".


A Project Admin would help coordinate 50-75 interviews and assist in transferring interview notes into a unified document.

Advisors

The advisors are aligned with this work and plan to regularly provide insight, guidance, and a mutual sounding board for ongoing efforts.

  • 1) Doc James will provide the perspective of a Board Member as well as a leader in WikiProject medicine, which has faced challenges from pharmaceutical and medical device industry promotion
  • 2) Patrick Earley will assist in networking with other news and media experts and organizations with whom he has collaborated on Trust & Safety issues in his Foundation role
  • 3) Asaf Bartov will ensure that emerging communities are represented and that any recommendations account for the limited amount of editors and resources in these smaller projects
  • 4) Sam Walton will consult, as the current lead of The Wikipedia Library Card platform, to investigate research approaches to combating misinformation through access to reliable sources
  • 5) Jonathan Morgan is an advisor on the translation of research into practical tools, as evidenced by the way his work on patrollers informed the implementation of the new page patrol interface and ORES watchlist scores
  • 6) Felix Nartey is present to ensure that Africa and other areas in the Global South are not excluded from this research, and to connect with scores of community leaders from differing regions thanks to his tremendous network among Wikimedia Affiliates
  • 7) Phoebe Ayers is present as an information literacy expert who can assist me in creating guidance that is consistent with library best practices and of clear educational value
  • 8) Diego Saez-Trumper is the foremost authority at the Foundation on the landscape of disinformation and the potential for machine learning and algorithmic assistance
  • 9) Lauren Maggio is a professor and expert in Technology & Distributed learning--her specialities include responsible conduct of research, qualitative methods, and knowledge synthesis.

Community notification

  • Wikimedia-l
  • Wikipedialibrary-l
  • Openaccess-l
  • Wikicite-discuss-l
  • Meta: Wikimedia and Libraries User Group
  • Meta: The Wikipedia Library
  • ENWP: WP:The Wikipedia Library
  • ENWP: WP:WikiProject Academic Journals
  • ENWP: WP:V
  • ENWP: WP:NPOV
  • ENWP: WP:RS
  • ENWP: WP:RS/N
  • ENWP: WP:COI/N
  • ENWP: WP:MED
  • Facebook: Wikipedia Weekly
  • Facebook: Wikimedia + Libraries

Endorsements

Do you think this project should be selected for a Project Grant? Please add your name and rationale for endorsing this project below! (Other constructive feedback is welcome on the discussion page).

  • Support Gamaliel (talk) 19:24, 19 February 2020 (UTC)
  • Support. Fascinating! I'd love to be involved, and in any case I think is very needed. Pundit (talk) 19:33, 19 February 2020 (UTC)
  • Support. Excellent idea and much needed. SarahSV talk 20:18, 19 February 2020 (UTC)
  • Support Misinformation and algorithms are the "hot topics" at the upcoming IFLA (international librarians) conference. These research outcomes will most certainly be used in the future by librarians. Bridges2Information (talk) 20:42, 19 February 2020 (UTC)
  • Support This is an intersting proposal. I'd be interested to be a part of it. —M@sssly 20:50, 19 February 2020 (UTC)
  • Support This will shed light on some questions that a lot of people have been asking. Even if the research finds that we should just keep doing what we're doing, it would be good to know what we're doing right. Clayoquot (talk) 21:08, 19 February 2020 (UTC)
  • Support Ozzie10aaaa (talk) 21:42, 19 February 2020 (UTC)
  • Support Interesting question and I hope it yields actionable results; let me know if I can help. -Reagle (talk) 21:56, 19 February 2020 (UTC)
  • Support Absolutely vital topic for everything we do, and our future as *reliable* free knowledge projects. NB: I agreed to be an advisor on this project, but support it regardless. -- phoebe | talk 23:05, 19 February 2020 (UTC)
  • Support I think this is a super important topic. As you see above, the grant applicant is well-informed. We've have been successful against disinformation but could do better, e.g. by diagnosing or measuring the bad activity more precisely or in more ways, or by having clearer early-warning systems, or by teaming up with other platforms. A focus on smaller projects makes sense; more eyes on the project would find more problems. Interviewing editors with experience and summarizing their comments will be really helpful and interesting. I look forward to reading that. -- econterms (talk) 01:16, 20 February 2020 (UTC)
  • Support Rosiestep (talk) 01:17, 20 February 2020 (UTC)
  • Support This will provide useful data and insights into the practices around misinformation, a hot topic in media and communications research. Its project page will make the process and the findings accessible to others doing similar research. Doctor 17 (talk) 06:05, 20 February 2020 (UTC)
  • Support Interesting proposal with important impact. Muhraz (talk) 10:36, 20 February 2020 (UTC)
  • Support I support this project --CristianCantoro (talk) 11:12, 20 February 2020 (UTC)
  • Support I support this as well, it's much needed and my organisation would be happy to collaborate where possible. Mihajlo961x (talk) 12:16, 20 February 2020 (UTC)
  • Support Great project. Benjamenta (talk) 13:30, 20 February 2020 (UTC)
  • Support This is a critical direction for our projects. --EpochFail (talk) 15:38, 20 February 2020 (UTC)
  • Support Looks well aligned to our mission, and very beneficial. Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 15:59, 20 February 2020 (UTC)
  • Support. We must do this. I would ask why are we not doing this already? Our credibility is important to our mission, and this kind of study should help us defend our credibility by finding and hopefully protecting vulnerabilities. · · · Peter (Southwood) (talk): 06:53, 21 February 2020 (UTC)
  • Support Absolutely! Lately, YouTube has been using Wikipedia links to enable its viewers to fact-check information, and eliminate potential biases and wrong information. However, our projects are not completely free of misinformation, and in these times where internet is exploding all kinds of mis- and dis- information and as we are envisioning to provide knowledge as a service, a project like this will definitely add value. KCVelaga (talk) 08:20, 21 February 2020 (UTC)
  • Support Bondegezou (talk) 14:16, 21 February 2020 (UTC)
  • Support I am the author of w:WP:DISINFORMATION we need more info and investigations. -- GreenC (talk) 20:50, 21 February 2020 (UTC)
  • Support Nikkimaria (talk) 21:26, 22 February 2020 (UTC)
  • Support Misinformation is one of the biggest threats facing Wikipedia projects. As such, a project like this is needed to find the best methods to keep Wikipedia content factually correct. Daylen (talk) 08:47, 23 February 2020 (UTC)
  • Support. Identifying and rejecting misinformation and disinformation is a persistent challenge for Wikipedia contributors, and we need stronger solutions for protecting the integrity of our articles. Please let me know if I can help in any way. — Newslinger talk 14:13, 24 February 2020 (UTC)
  • Support Smallison (talk) 15:17, 24 February 2020 (UTC)
  • Support Psubhashish (talk)
  • Support Mcbrarian (talk) 20:52, 25 February 2020 (UTC)
  • Support It will help us to combat denialisms. Thanks for initiative! Ixocactus (talk) 21:32, 25 February 2020 (UTC)
  • Support Very much needed Uncommon fritillary (talk) 16:07, 26 February 2020 (UTC)
  • Support Much needed and Jake, who is engaged in and committed to the community, is an excellent person to do it. Mssemantics (talk) 20:41, 26 February 2020 (UTC)
  • Support Very valuable project. Strongest possible support. WWB (talk) 16:07, 28 February 2020 (UTC)
  • Support Looks like a great project. Very timely and good approach/method. Jake is the perfect person to conduct it! Hfordsa (talk) 00:13, 2 March 2020 (UTC)
  • Support need I add more?—CYBERPOWER (Chat) 14:08, 10 March 2020 (UTC)
  • Support as noted in the proposal, this is about a "massive existential problem" for Wikipedia, i.e. if this problem is not addressed, Wikipedia, through the loss of its credibility, could cease to exist as we now know it. If there is *anything* I can do to help the investigation, please just let me know. Smallbones (talk) 13:56, 15 March 2020 (UTC)
  • I support research that will improve Wikipedia's reputation as a reliable source of information. BobcatPanther (talk) 19:56, 27 March 2020 (UTC)
  • Support Jake's been always passionate about the reliability and verifiability of Wikipedia, which ended up in him founding The Wikipedia Library – a fantastic initiative to dispel misinformation, which has been a massive success. And his quest into studying disinformation on Wikipedia will have a similar, if not more, impact on thwarting systematic injection of disinformation by governments, organisations and ill-intentioned groups/people. If anyone could do this, it's him no doubt --—UY Scuti Talk 11:50, 5 May 2020 (UTC)