Community health initiative/Improved tools and workflows to report harassment

Feature status: Prioritized, awaiting development

This wiki page documents information, plans, and ideas for a feature the Anti-Harassment Tools team is currently building or plans to build in the upcoming months.

🗣   We invite you to join the discussion on the talk page.

In 2018 the Wikimedia Foundation's Anti-Harassment Tools team will begin a research project into building improved tools and workflows for users to report harassment.

Our goal is to provide a reporting system for Wikimedia communities that puts less stress on users who are submitting the reports while simultaneously generating higher-quality reports.

About

When harassment or abuse happens between Wikimedians, users can report the misconduct to wiki ‘leadership’ (other experienced users, administrators, and in extreme cases stewards or Wikimedia Foundation staff) in a variety of ways: on wiki talk pages, via email, or via IRC or another off-wiki communication channel. Most wikis have established noticeboards to report cases that require attention or suggested email lists for sensitive reports that require privacy.

The Wikimedia Foundation's Anti-Harassment Tools team wants to better understand these existing systems to identify any pain points or shortcomings that we can address with improved software. Our work will include several months of research into the existing workflows and tools, followed by proposals for how our team might address the deficiencies we identify. Our research will focus heavily on English Wikipedia, as it is the largest wiki community and offers a wealth of information relevant to our needs.

Community input and feedback will be essential for the success of our work. This wiki page will be updated with our findings and discussion topics. We appreciate your help on our journey!


Research

Hypotheses

Information forthcoming. We are planning our research around a handful of hypotheses about the current and potential reporting workflows.

Current workflows for reporting harassment

The Wikimedia Foundation's Anti-Harassment Tools team will audit the workflows identified below and share our notes on this wiki page in the coming weeks.

  1. Noticeboards, including Administrators' Noticeboard/Incidents
  2. Another user’s talk page (including an admin’s talk page, Jimbo's talk page)
  3. Talk page of article
  4. OTRS
    1. General
    2. Oversight mailing list
  5. Email to other user (Wiki friend, admin, Checkuser, or Oversighter)
  6. ArbCom mailing list
  7. Functionaries mailing list
  8. IRC
  9. Off-wiki
  10. Wikimedia Foundation Support and Safety team
  11. Ombudsmen Commission
  12. Code of Conduct Committee for Tech Spaces

Updates

February 20, 2018

Today we had a meeting to discuss research related to the harassment reporting tool; we specifically covered auditing current harassment and conflict workflows. Below are notes we took during the meeting discussing different ways to approach this research. We are looking for feedback!

  • How will we create a taxonomy for comparing these?
    • 'Official' vs. 'unofficial' reporting methods
      • 'Official' is something like AN/I, which has a specific process and is considered part of policy or institutionalized reporting on English Wikipedia.
      • 'Unofficial' is something like a post on a talk page; it's a great place to start conflict mitigation, but it's not considered reporting.
    • On-wiki vs. off-wiki
      • Noticeboards vs emailing
    • Harassment vs Conflict
  • Level of detail for each workflow
    • Small handful (max 5) of detailed write-ups
    • Bullet list of other places where user misconduct is reported (maybe with some explanation)
  • What artifacts do we want to generate? (drawings, text, screenshots)
    • 5 User journeys — step-by-step narrative with illustrations/screenshots
    • On-wiki
    • All other workflows (less important, less official, or less used) will be documented as simple bullet lists
  • What do we need?
    • Product manager: Documentation of existing workflows, to have a standard (yet living) artifact that a conversation can revolve around
    • Caroline Sinders, Design researcher: What is the scope of conflict mitigation? General discovery to highlight characteristics of these products. Gain an understanding of what does and doesn't work. Where do things get lost?
    • Community advocate: Overview of the current processes that shows deep knowledge of the community's current methods. Wiki processes and policy.
  • Conflict vs. harassment: how many non-harassment (but still conflict-related) workflows do we want to audit?
    • Commonly used terms instead of harassment (misconduct, conflict, etc.)
    • We'll need to be open to a larger discussion about user conflicts, specifically ways to differentiate between 'conflict', 'harassment', and 'abuse'. We take all of those seriously, but they are different kinds of actions and some will warrant different kinds of responses.
  • Small-scale vs. large-scale harassment: what are we trying to solve?
    • Trevor Bolliger: Small scale. “When a user wants to inform another user that they have been harassed, how do they do this?”

February 8, 2018

We are collating past research and expanding the list above under "Current workflows for reporting harassment" into more robust and specific workflows. This serves as an exploration and an initial 'reporting' audit, similar to a product audit. It entails listing out all of the steps in reporting harassment; for example, posting on AN/I would involve something like "there's a problem between editors, one editor writes to another editor's talk page, the problem persists for some time, that same editor posts to the other editor's talk page about bringing this problem to AN/I, …" and so on. Some of this audit may be text-heavy, or it may feature diagrams or workflow charts. It is designed to illustrate how many steps exist in each method of reporting and to highlight which aspects of those reporting workflows rely on policy, rules, human infrastructure (talking to each other), and technology (such as an email listserv or a talk page). All of these findings will be shared with the community and will be open to feedback and thoughts.

Project schedule

  • January-March 2018 — Background research, begin on-wiki consultation
  • April-June 2018 — Design and Prototype potential systems, discuss on-wiki
  • July-December 2018 — Software development

See also

  • /Links, a collection of Meta and English Wikipedia links about improving harassment reporting systems.