ARTT

From Meta, a Wikimedia project coordination wiki



Analysis and Response Toolkit for Trust (ARTT) aims to provide motivated citizens with tools and resources for discussing vaccine efficacy online. The ARTT project connects users to expert guidance for analyzing information online and responding to others in ways that build trust.

To understand ARTT, think back to a distressing social media post or exchange in recent months regarding something factual; this group will be no stranger to such examples. The topic could be related to climate change, or elections, or, let’s say, vaccine efficacy. Consider the emotions you felt during the exchange, or the way you wanted to dissect the information using sources such as the World Health Organization or a scientific journal. Many of us, with little subject matter expertise, feel uncertain about how to recommend one report over another.

But now consider too: what if this online exchange is not with an unknown person, but with someone in your community whom people you know rely upon, such as a local journalist or your neighborhood lead on Nextdoor? Perhaps this person is your friend, or a family member. What do you say about vaccines, and how do you say it?

This challenge of communicating complex information in human relationships is the core inspiration behind the ARTT project.

One way to think about ARTT resources is as a pair of X-ray glasses. Community members or moderators can slip on these virtual X-ray glasses to identify the right cues for assessing information quality, and then engage directly with others in appropriate, relevant ways.

The design of this tool will build upon insights from fields such as journalism, media literacy, psychology, and conflict resolution. Our deliverables will include taxonomies and datasets that can support information quality assessment (for example, what signals can we take from cited, credible sources?). Another deliverable will be a catalog of response and intervention research related to trusted conversations (for example, what prior research tells us about how best to communicate a correction, or to establish empathy).

The lead planning organizations for this project are Hacks/Hackers and the University of Washington Allen School of Computer Science and Engineering. Other partnering and advising organizations include WHO Vaccine Safety Net, Wikiproject: Medicine, CUNY, Social Science Research Council, Carter Center, Wikimedia DC, and the MuckRock Foundation.

The ARTT team is conducting a nine-month planning phase that will conclude with the creation of a tool prototype in August 2022.

Phase one funding for ARTT was provided by the National Science Foundation.


So what does this have to do with Wikipedia?

Background: During workshops in February and April of 2022, we asked experienced Wikipedians to fill out a questionnaire to share how they assess the quality of Wikipedia articles related to vaccines. Attendees were also asked to provide feedback on the questionnaire itself. The responses will help to inform how ARTT assesses articles and the sources used to create them.

So, what about this tool? Let's say you were reading a Wikipedia article about vaccines. Think about your process. Do you take the article at face value? Do you assess its quality by reviewing its sources? Now imagine a tool that highlights questionable claims and spotlights an article's low- and high-quality sources. ARTT could provide this service and potentially help lead to more productive conversations among editors. Furthermore, ARTT could recommend the highest-quality Wikipedia articles to its users.

During this next phase, we are asking you, the Wikimedia community, to share your input and ideas about how the ARTT tool could be used within Wikipedia. We will also seek ideas on how Wikipedia can be used within the tool. Please see the prompts below and share your thoughts.

Areas and questions:

  • Reliable sources: Can ARTT strengthen a reliable source guide (perennial source guide)? (Discussion)
  • Article quality: Is it possible to raise the quality of vaccine-related Wikipedia articles to the point where ARTT may recognize them as reliable sources? What would it take for an article, itself, to become a reliable source? (Discussion)
  • Talk pages: Can the tool encourage collegial exchanges among editors? For example, if a tool recommended tactics for effective communication with other editors, would you be inclined to use it? (Discussion)
  • Training practices: Would a tool that recommends quality sources be of use to trainers? (Discussion)

Findings in mid-2022

At the Wikimedia Hackathon on May 21-22, 2022, seven attendees completed a survey regarding what they consider when assessing article quality. Their answers showed that they are more likely to consider an article low-quality if it has:

  • few sources
  • dubious sources
  • warning banners
  • only one editor, or too few editors

Less important markers were:

  • low ORES scores
  • recent creation
  • many reverted edits
  • little talk page engagement
  • lack of WikiProject tags
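
One of the markers above, the ORES score, comes from Wikimedia's machine-learning scoring service. As a rough illustration of how a tool like ARTT might consume such a score, the sketch below queries ORES's public v3 API for an "articlequality" prediction. The endpoint and response shape follow the documented v3 API, but the revision ID in the sample is invented and field names should be verified against the live service:

```python
# Sketch (assumption-laden): fetch and parse an ORES "articlequality"
# prediction for a revision of an English Wikipedia article.
import json
import urllib.request


def parse_articlequality(response: dict, wiki: str, rev_id: str) -> str:
    """Extract the predicted quality class (e.g. "Stub", "B", "FA")
    from an ORES v3 response dictionary."""
    return response[wiki]["scores"][rev_id]["articlequality"]["score"]["prediction"]


def fetch_articlequality(rev_id: int, wiki: str = "enwiki") -> str:
    """Query the ORES v3 API for a single revision's articlequality score."""
    url = f"https://ores.wikimedia.org/v3/scores/{wiki}/{rev_id}/articlequality"
    with urllib.request.urlopen(url) as resp:
        data = json.load(resp)
    return parse_articlequality(data, wiki, str(rev_id))


# Sample response shaped like ORES v3 output (revision ID is hypothetical),
# so the parsing step can be exercised without a network call:
SAMPLE = {
    "enwiki": {
        "scores": {
            "1234567": {
                "articlequality": {
                    "score": {
                        "prediction": "B",
                        "probability": {"Stub": 0.01, "B": 0.80},
                    }
                }
            }
        }
    }
}

if __name__ == "__main__":
    print(parse_articlequality(SAMPLE, "enwiki", "1234567"))
```

A tool could surface this predicted class next to an article as one quality signal among the others listed above, rather than as a verdict on its own.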