Private Incident Reporting System

Private Incident Reporting System - An Easier and Safer Reporting System for All

About

The Wikimedia Foundation wants to improve how people who experience harassment and other forms of abuse report such harmful incidents, in order to provide safer and healthier environments for communities.

The newly formed Trust and Safety Tools team has been tasked with building the Private Incident Reporting System (PIRS). We aim to make it easy for users to report harmful incidents safely and privately.

Background of the project

How incidents, misconduct, and policy violations are dealt with across Wikimedia spaces and projects has developed organically and is highly decentralised.

Each Wikimedia project or community has its own way of managing things. Reporting and processing of incidents happen in a variety of ways:

  • via wiki talk pages
  • via noticeboards
  • via email
  • via private discussions on off-wiki communication channels (Discord, IRC)

For many users, it is unclear what to do if an incident happens: where to go, who to talk to, how to report, what information to include in the report, how the report is processed, what happens afterwards etc.

Users must already know how to report an issue and where to do so. There is also very little information about what will happen once a report is made and what expectations the reporter should have.

Some users do not feel safe reporting incidents when they occur because of the complexity of the reporting process and because of privacy concerns.

There is currently no standardised way for users to file reports privately.

Focus of the project

The high-level goal of this project is therefore to make it easier to address harassment and harmful incidents: to protect the privacy and safety of those who report, to ensure each report reaches the correct entity that needs to process it, and to avoid putting extra pressure on those who do the processing.

The Trust and Safety Tools team is also looking at this incident reporting system as part of a larger incident management ecosystem that includes, for example, preventive work such as managing disagreements before they escalate, incident processing, and connecting and tracking cases.

We are not building this entire ecosystem at once. However, we will study how a whole ecosystem would work and how the individual systems within it will connect before we start building.

What's next?

Figuring out how to manage incident reporting in the Wikimedia space is not an easy task. There are a lot of risks and a lot of unknowns.

Our goal right now is to discuss an overall direction for moving forward and determine some specific pain points we can address that we can also learn from. We would like to refine these ideas with your help.

What has been done so far

As a first step, the Trust and Safety Tools team is studying previous research and community consultations to inform our work. We revisited the Community health initiative User reporting system proposal and the User reporting system consultation of 2019. We have also been trying to map out some of the conflict resolution flows across wikis to understand how communities currently manage conflicts. Below is a map of the conflict resolution flow on Italian Wikipedia, with notes on opportunities for automation.

Conflict resolution flow on Italian Wikipedia
On Italian Wikipedia, there's a 3-step policy in place for conflict resolution. This map visualizes this process and tries to identify opportunities for automation for both editors and admins.

Project phases

Phase 1

With the information we have gathered, over the next couple of months we want to start putting out some rough ideas about a possible product direction and consult with the community on mockups and ideas to gather feedback from users.

We want to ensure we have heard everyone and validate that we have the right ideas and are on the right track.

We will also identify wikis where we can pilot PIRS, establish a baseline for what we're going to do in Phase 2, and identify potentially helpful metrics to track when we start building. To get started, here is a set of questions we have:

  • How do you currently deal with inappropriate behaviour in your community?
  • What are some of the most common UCoC violations in your community?
  • How are these currently being reported and processed? Is there a standard approach?
  • What would you like to change or do differently from the current approach?
  • The need for a private space where users can file a report without feeling exposed or unsafe came up in our research. Many of the current processes require all conversations to be public. What do you think about exploring ways for users to file private reports? What are some possible solutions? Is there anything particularly concerning we should be aware of when thinking about this?
  • What worries you about this project?
  • Are we missing something? Is there a question we didn't ask?

Phase 2

In Phase 2 we want to start building software based on Phase 1 findings and feedback.

Giving feedback and getting updates

We also have a Timeline and Updates page that you can put on your watchlist. We look forward to hearing your thoughts on the talk page.

Research

The following document is a completed review of research the Wikimedia Foundation conducted from 2015 to 2022 on online harassment on Wikimedia projects. In this review we have identified major themes, insights, and areas of concern, and provided direct links to the literature.

Literature review Q1 FY23 (1).pdf