Community health initiative/User reporting system


This page documents a feature the Wikimedia Foundation's Anti-Harassment Tools team has prioritized for software development.

🗣   We invite you to join the discussion!
🛠   Track in Phabricator at T166812.

The Wikimedia Foundation's Anti-Harassment Tools team is conducting research in preparation for designing and building a user reporting system that will make it easier for people experiencing harassment and other forms of abuse to provide accurate, actionable information to the appropriate moderation channel.

Our goal is to provide a reporting system for Wikimedia communities that puts less stress on users who are submitting the reports while simultaneously generating higher-quality reports.

Community input and feedback will be essential for the success of our work. This wiki page will be updated with our findings and discussion topics. Watchlist this page to stay involved!

About[edit]


Reporting harassment[edit]

When harassment or abuse happens between two or more Wikimedia users, most wikis request that the users in conflict self-resolve the issue by discussing it on each other's talk pages. When this fails to resolve the situation, users can report the misconduct to wiki ‘moderators’ (other experienced users, administrators, and in extreme cases stewards or Wikimedia Foundation staff) in a variety of ways: on wiki talk pages, via email, or via IRC or another off-wiki communication channel. Most wikis have established noticeboards for reporting cases that require attention, or suggest email lists for sensitive reports that require privacy. The identified ways to report harassment include:[1]

  • Email
  • IRC
  • Social media or messaging website/app (Facebook, Slack, Telegram, etc.)
  • In-person at an edit-a-thon, conference, or other event
  • Telephone call

Triaging reports[edit]

When a user receives or discovers a report of harassment, the first step is usually to triage the report: "Can I, as the recipient, facilitate the moderation, or is someone else better suited?"

We have found that for most simple cases of harassment, the user who receives the report will act as a mediator, working with both parties to define the problem and agree on a path forward. For cases where the receiving user does not want to become involved, they will refer the reporter to another admin, the WMF, a noticeboard, a policy page, ArbCom, a local affiliate, or law enforcement, depending on the severity of the incident.[1][2]

Triaging these reports can be time-consuming and frustrating, especially in areas where reports are commonly misfiled or mislabeled.

Routing reports to the appropriate moderation channel[edit]

Because there are many places to report harassment, and because user misconduct can be closely related to content problems (neutrality, paid editing, edit warring, etc.), each of which has its own reporting workflow, it can be difficult to determine exactly where, or to whom, to report user misconduct. The Anti-Harassment Tools team believes the software should do the heavy lifting: users should not need special knowledge to find the most appropriate place to request assistance. This will help both reporters and moderators.

Our current direction for the user reporting system is a system that routes the reporter to the appropriate channel based on their responses to a series of questions. For example, if someone reports that they have been unjustly reverted, the system might recommend they discuss it on the article's talk page; but if they report that they have been physically threatened, the system would recommend they email Trust & Safety for immediate resolution. This system would be customizable for every wiki and would be able to evolve over time; a rough sketch of how such routing could work is shown below.
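To make the idea concrete, here is a minimal, illustrative sketch of how such a per-wiki routing configuration might be modelled as a decision tree. Everything in it (the question wording, the answer options, the channel names, and the Python structure itself) is an assumption made for this example, not the actual design or configuration format.

```python
# Hypothetical per-wiki routing configuration: each node asks a question and
# maps every answer either to a follow-up question or to a recommended
# reporting channel. The content below is invented for illustration only.
ROUTING_CONFIG = {
    "start": {
        "question": "What kind of problem are you reporting?",
        "answers": {
            "content dispute (e.g. an unjust revert)": {"channel": "article talk page"},
            "harassment by another user": {"next": "severity"},
        },
    },
    "severity": {
        "question": "Does the behaviour include threats of physical harm?",
        "answers": {
            "yes": {"channel": "email Trust & Safety"},
            "no": {"channel": "local noticeboard or a trusted admin"},
        },
    },
}


def route(config, answers):
    """Walk the decision tree using the reporter's answers; return a channel."""
    node = "start"
    for answer in answers:
        outcome = config[node]["answers"][answer]
        if "channel" in outcome:
            return outcome["channel"]
        node = outcome["next"]
    raise ValueError("More answers are needed to reach a channel.")


# Example: a report involving physical threats is routed to Trust & Safety.
print(route(ROUTING_CONFIG, ["harassment by another user", "yes"]))
```

Because the routing rules are just data, each wiki could maintain its own questions and destinations and adjust them over time as local policies and moderation channels change.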

Updates[edit]

August 15[edit]

I've expanded § Requirements with five more sources: results from a 2018 survey about English Wikipedia's AN/I noticeboard, results from quantitative analysis on AN/I, the Harvard Negotiation and Mediation Clinical Program's analysis of AN/I, a summary of IdeaLab submissions on the topic of reporting harassment, and results from the 2017 Community Engagement Insights survey. I plan to add the 2018 IdeaLab submissions and the 2018 Insights survey results as soon as they are ready.

The "Things to Keep" section has not grown, mostly because the sources I'm adding do not solicit feedback about what works well but rather about what does not work well and what should change. The list of "Things to Change" is growing and will soon need to be sub-categorized in order to better understand common areas of frustration or opportunity. A trend that is already starting to crystalize is "who is responsible for handling harassment, and how can the software better help them make faster, more accurate decisions that stand up to scrutiny?" This is not an easy question — Wikipedia has no deadlines and while administrators have the tools to block users and slightly more social capital than non-administrators, there is no specific group of people responsible for making sure reports of misconduct are handled. (Short of ArbComs, who only handle severe cases.) This is also made difficult by the fact that moderation is often a thankless, unrewarding job that can attract unwanted attention. How can we find, train, and support moderators while still holding them accountable?

August 9[edit]

I've added the section § Triaging reports above based on our experience reading public cases of harassment, as well as the Worksheet #1 notes from our Wikimania roundtable. Participating on a Wikimedia wiki requires dedication, so unsurprisingly we found that the most common way to triage a report of harassment was for the user to handle it themselves (11), followed by referring it to another admin or trusted user (9), the WMF (6), or a public noticeboard (5). One person each mentioned referring to a policy page, ArbCom, a local affiliate, or law enforcement.

Also of interest from the first worksheet was the frequency of email in reporting harassment. 19 roundtable participants mentioned they receive reports via email, 11 mentioned talk pages, 8 IRC, 6 social media, 5 in person, 4 a listserv, noticeboard, or Telegram, and 2 a telephone call or OTRS. It's becoming crystal clear that this reporting system will need a space for reporting harassment in private. The big question will be how to allow for private reports that treat all parties with dignity inside an environment of transparency and accountability. No small feat!

August 3, 2018[edit]

Slides from our Wikimania roundtable

This page has been updated with all the information to date about this project, most notably results and takeaways from a roundtable our team conducted at Wikimania 2018 in Cape Town, South Africa. The anonymized raw notes can be found at Community health initiative/User reporting system/Wikimania 2018 notes, and my synthesis can be found lower on this page at § Requirements. At the roundtable we asked the 25 participants (from 8 different language Wikipedias) to take notes and discuss how harassment is reported on their wikis (we are still processing some of these notes and will add them here in the coming weeks).

Most significantly, at the roundtable we asked participants what currently works well about reporting harassment and what does not. We are hoping to build out a list of things to keep and things to change about reporting harassment on Wikimedia before we begin designing the new reporting system. It's important to us that we come to an agreement with most participants about these lists before proceeding, so that the solutions we design solve real problems and are realistic to implement on all projects over the coming years.

It is also important to us that we include the voices of a wide variety of people in this process: active Wikimedia users, newcomers, people who no longer contribute due to harassment, and external experts. As not everyone will feel comfortable participating here on Meta Wiki, my team will anonymize, transcribe, and summarize all findings throughout this process.

Thank you for reading, thank you for helping us define an ideal solution, and please let us know your thoughts on the talk page or via email to tbolliger@wikimedia.org ✌️

Previous[edit]

Our team (the WMF's Anti-Harassment Tools team) began looking into reporting channels in late 2017, knowing we would be working on building an improved system in 2018. This early work focused on English Wikipedia's administrators' noticeboard for incidents (aka AN/I), in the form of both qualitative analysis and a survey of participants. The results are linked below in § Research. Given the amount of work required for blocking tools, this reporting system project was prioritized for the second half of 2018.

In April 2018 the Anti-Harassment Tools team met in Ft. Lauderdale, Florida to discuss our 2018 commitments and agreed on an initial direction: a modular 'routing system' that is customizable per wiki and guides users who want to report an incident to the appropriate channel. We agreed that introducing any new channels without consolidating or sunsetting old channels would be irresponsible. As we progress through this project we hope to move tactically and make small improvements in small steps, rather than introduce large, disruptive changes.

Requirements[edit]

There are elements of the current reporting systems that work well and that we want to keep, and there are parts that are problematic and should change. Before we design the new system we would like to reach agreements on what to keep and what to change so we can design the best possible reporting system.

Things to keep[edit]

  1. The ability to report directly to WMF Trust & Safety if desired or needed.[1]
  2. A system which assumes good faith, for the moderators, victims, and accused.[1]
  3. The authority of local projects.[1]
  4. The ability to report and discuss in one’s native language.[1][3]
  5. Accountability for moderators.[1]
  6. Training of moderators and administrators (on wikis that have training).[1]
  7. A system that allows for some public reports, such as noticeboards for quick, straightforward, public cases.[1][2][3]
  8. A system that allows for some reports to be private.[1]
  9. A public system of documenting private cases.[1]
  10. A scalable group of moderators, so many people can act when needed.[1][3]
  11. A variety of different responses, appropriate to different situations.[1]
  12. The ability to reach out to a specific admin or moderator if so desired.[1][4]
  13. If you'd like to add to this list, please discuss on the talk page or send a private submission via email to tbolliger@wikimedia.org

Things to change[edit]

  1. The entry point to “report harassment” is not currently visible or findable.[1][3][2][5]
    1. We should make it easier for people who have been harassed to come forward.[3]
    2. The current system is not proactively described to users. People could be educated about how to handle incivility before they encounter a problem.
    3. There should be a way to mark certain conversations as harassment or problematic.[3]
  2. Not all reports should be 100% public.[1][2][3][5]
    1. Alternatives to public on-wiki reporting channels are not emphasized enough.
    2. Trolls and passersby can derail public discussions.[2]
    3. Publicly discussing the incident with the person who harassed you causes further harm.[2][5]
    4. Private reports are not always appropriately documented on-wiki. The system should allow for private reports to be appropriately transparent. Consider a system where more than one moderator knows about a private report.
  3. Most current reporting systems do not use a consistent form, making it difficult for reporters to know what type of information will be useful for moderators to properly address their issue.[1][2][5]
  4. The current documentation about harassment is not sufficient, including:[1]
    1. Definitions and examples of harassment
    2. Who is responsible for moderating harassment (including users, administrators, and ArbCom)
    3. How to report harassment and/or other forms of user misconduct[2][5]
  5. The system should not be designed in a vacuum, rather should involve the community, external experts, and digital rights leaders.[1]
  6. The volunteer moderators need more training on how to make appropriate decisions and on self-care. These volunteers should be trained in a way that enables them to train each other.[1][3]
  7. The current users in positions of authority on wikis do not fully understand the dangers of online harassment.[1][3]
  8. The current system requires one moderator to act (resolve a case, set a block, etc.), which often makes them a target of further harassment. Consider a system that uses a committee account or requires multiple users to respond to the harasser.[1][3]
  9. The current systems do not apply precedent from previous cases consistently; not every case is treated the same.[1][2][5]
    1. It is currently difficult to search and find similar cases.[2]
    2. There are gaps between setting precedent and updating necessary policies.[5]
  10. The concept of Boomerang distracts from many cases and makes moderators skeptical of all reporters. The system should assume good faith of all parties.[1][2]
  11. The current workflows for cross-wiki or global dispute resolution are cumbersome.[1][5]
  12. Initial responses can take too long and should be quicker.[1][3][5]
  13. Case closure can take too long and should be quicker.[2][5]
  14. The excuse of productivity (e.g. a user edits frequently) is often a justification for bad behavior.[1][2]
  15. Language barriers put non-native speakers of the wiki’s language at a disadvantage when reporting or defending themselves against a report of harassment.[1]
  16. The current system does not encourage reporting harassment that you have observed but are not involved in.[1]
  17. It can be difficult to find other users to mediate a situation.[1][3]
  18. Chatting synchronously is difficult for many users as IRC is limiting. Consider building a chat system connected to Wikimedia accounts.[1]
  19. It is time-consuming to gather the facts of a case, such as important diffs. Consider building a system that allows for collaborative marking.[1][2][5]
  20. It should be easier to flag an article or user to moderators privately.[1][3]
  21. Not all cases need a moderator; in many situations users could mediate their own problems if they had useful advice.[3][2]
  22. Reports to functionaries, the WMF, affiliates, or other volunteers are often not resolved.[4][3][6]
  23. Responses to reports made to functionaries, the WMF, affiliates, or other volunteers are often not useful or satisfactory.[4][2]
  24. Clever users can "game the system" to their advantage.[2]
  25. Complex cases (multiple parties, long time frames, etc.) are time-consuming and unrewarding to moderate.[2]
  26. There is no user group responsible for triaging reports; it is an adjunct responsibility for administrators. Consider clerking.[2][5]
  27. Sometimes moderators involved in an incident will be part of the decision-making process.[2]
  28. There are unwritten social rules for participating in dispute resolution and on noticeboards.[2][5]
  29. There is no clear or consistent way to challenge an inadequately resolved report.[5]
  30. Prioritization of cases is manual; there is no special marking for cases that require urgent attention (not including emailing emergency@).[5]
  31. Some issues would benefit from the opinions of multiple moderators, but there is no special marking for these cases.[5]
  32. Public cases that have lengthy comment threads are difficult to navigate and follow.[5]
  33. Users must manually classify the problems they are facing.[5]
  34. If you'd like to add to this list, please discuss on the talk page or send a private submission via email to tbolliger@wikimedia.org

Research[edit]

Reaching the Zone of Online Collaboration: Recommendations on the Development of Anti-Harassment Tools and Behavioural Dispute Resolution Systems for Wikimedia (PDF)

The Wikimedia Foundation's Anti-Harassment Tools team wants to better understand these existing systems to identify any pain points or shortcomings that we can address with improved software. Our research will focus heavily on English Wikipedia, as the largest wiki community, but we aim to build a tool that can be used by any wiki of any size. Research completed to date includes:

See also[edit]

References[edit]