
Incident Reporting System/Updates/cs

From Meta, a Wikimedia project coordination wiki

    Testing the Incident Reporting System (minimum testable product, beta, 10 November 2023)

    Editors are invited to test the minimum testable product for the Incident Reporting System. The Trust and Safety Product team has built a basic version of the product that lets a user file a report from the talk page where an incident occurred. Note: this version of the product is meant to gather insights about sending reports to a private email address (e.g. emergency(_AT_)wikimedia.org or an admin group). It does not cover all possible scenarios, such as reporting to a public noticeboard. We need your feedback to find out whether this approach is effective.

    To test:

    1. Visit any page on Beta Wiki that contains a discussion, for example User talk:Testing or Talk:African Wild Dog.

    2. Then click the more-options button (three dots) next to the Reply button, and click the Report button (see image 1). You can also find the button in the Tools sidebar menu (image 2).

    3. Continue by filling in the form, then submit it. The report will be sent to the Trust and Safety Product team, and only they will be able to see it. Please note that this is only a test, so do not use the system to report real incidents.

    4. While testing, consider these questions:

    • What do you think of this reporting process? What do you like/dislike about it?
    • If you know what extensions are, how would you feel about having this system as an extension on your wiki?
    • Which problems have we overlooked at this early stage of development?

    5. After testing, please leave feedback on the talk page.

    If you do not see the Report button or you cannot submit a report, please check that:

    • you are logged in,
    • you have a confirmed email address on Beta Wiki,
    • your account is at least 3 hours old and has made at least one edit,
    • you have DiscussionTools enabled, because the system is part of them.

    If you do not use DiscussionTools, you can file a report via the Tools menu. If you cannot submit a second report, note that there is a limit of 1 report per day for unconfirmed users and 5 reports per day for autoconfirmed users. These limits help minimize the possibility of malicious users abusing the system during testing.
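    The eligibility checks and rate limits described above can be sketched in Python. This is a hypothetical illustration only: the actual ReportIncident extension is MediaWiki PHP, and every class, field, and function name below is an assumption, not the extension's real API.

```python
from dataclasses import dataclass

# Hypothetical user model; field names are illustrative assumptions,
# not the actual MediaWiki/ReportIncident schema.
@dataclass
class User:
    logged_in: bool
    email_confirmed: bool
    account_age_hours: float
    edit_count: int
    autoconfirmed: bool
    reports_filed_today: int

# Thresholds taken from the testing rules stated in the update above.
MIN_ACCOUNT_AGE_HOURS = 3
MIN_EDITS = 1
DAILY_LIMIT_UNCONFIRMED = 1
DAILY_LIMIT_AUTOCONFIRMED = 5

def can_file_report(user: User) -> tuple[bool, str]:
    """Return (allowed, reason) according to the testing rules above."""
    if not user.logged_in:
        return False, "must be logged in"
    if not user.email_confirmed:
        return False, "email address must be confirmed on Beta Wiki"
    if user.account_age_hours < MIN_ACCOUNT_AGE_HOURS or user.edit_count < MIN_EDITS:
        return False, "account must be at least 3 hours old with at least one edit"
    limit = DAILY_LIMIT_AUTOCONFIRMED if user.autoconfirmed else DAILY_LIMIT_UNCONFIRMED
    if user.reports_filed_today >= limit:
        return False, f"daily limit of {limit} report(s) reached"
    return True, "ok"
```

    For example, an unconfirmed user who has already filed one report today would be turned away with the daily-limit reason, while an autoconfirmed user in the same situation could still file up to four more.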

    Sharing incident reporting research findings – September 2023

    Research Findings Report on Incident Reporting 2023

    We have completed research about harassment on selected pilot wikis. The research, which started in early 2023, studied the Indonesian and Korean Wikipedias to understand harassment, how harassment is reported and how responders to reports go about their work.

    We have published the findings of these studies.

    Four Updates on the Incident Reporting System – July 2023

    Hello everyone! For the past couple of months the Trust and Safety Product team has been working on finalising Phase 1 of the Incident Reporting System project.

    The purpose of this phase was to define possible product direction and scope of the project with your feedback. We now have a better understanding of what to do next.

    1. We are renaming the project as Incident Reporting System

    The project is now known as the Incident Reporting System, with the word "Private" removed.

    In the context of harassment and the UCoC the word “Private” refers to respecting community members’ privacy and ensuring their safety. It does not mean that all phases of reporting will be confidential.

    We have received feedback that this term is confusing and can be difficult to translate into other languages. Therefore, we are removing it.

    2. We have some feedback from researching some pilot communities

    We are conducting research on harassment in the Indonesian and Korean Wikipedia communities. With their feedback, we have been able to document how users in these communities report harassment and to create maps from that information. These maps represent, to the best of our knowledge, how community members on both wikis currently report incidents of harassment and abuse.

    If you have any feedback on these maps, you can give it on the talk page.

    3. We have updated the project’s overview

    What we want to build moving forward

    • The Trust & Safety Tools team will be developing an extension for reporting incidents/UCoC violations.
    • The extension is intended to be configurable; communities should be able to adapt it to their local processes
    • The extension name is ReportIncident
    • The purpose of the extension is to:
      • Facilitate the filing of reports about various types of UCoC violations by Wikimedians
      • Route those reports to the appropriate entities that will need to process them
      • Facilitate the filing of reliable reports and filter out/redirect the unactionable ones.
      • Facilitate the filing of both private (e.g. to an email address) as well as public (e.g. on-wiki to an Admin noticeboard) reports according to local processes.
    • The extension is intended to be incident-agnostic (able to support the reporting of different types of incidents)
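    The configurable routing described above (private email delivery versus a public noticeboard, chosen per local process) could look roughly like the sketch below. This is purely illustrative: the real extension's configuration format is not specified here, so every key, incident type, and destination (except emergency(_AT_)wikimedia.org, mentioned earlier on this page) is an assumption.

```python
# Hypothetical per-wiki routing table mapping incident types to a
# delivery channel. All entries are illustrative assumptions, not the
# ReportIncident extension's actual configuration.
ROUTING = {
    "threat_of_harm": {"channel": "email", "target": "emergency@wikimedia.org"},
    "harassment":     {"channel": "email", "target": "admin-group@example.org"},
    "other":          {"channel": "noticeboard", "target": "Project:Admin noticeboard"},
}

def route_report(incident_type: str) -> dict:
    """Pick a destination for a report, falling back to the public noticeboard."""
    return ROUTING.get(incident_type, ROUTING["other"])
```

    A design like this keeps the extension itself incident-agnostic: adding a new report type on a wiki means adding a routing entry, not changing code.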

    What we won’t be doing

    • The system is intended for reporting and routing only; we will not be dealing with processing reports
    • The system is intended for incidents with regards to UCoC violations. We will not use this for other types of requests (such as technical support requests, account access, etc.)
    • The system is NOT meant to replace existing processes on wikis. Our purpose is to make it easier to follow existing processes.

    4. We have the first iteration of the reporting extension ReportIncident

    In November last year we talked about how we should start small with a very limited scope, so for our first iteration we thought about creating a very basic experience.

    What’s included in this initial iteration?

    • Ability to report from User Talk page
      • Report a topic header
      • Report a comment
    • Ability to complete a basic form and submit
    • The report will be sent to an email address (a dummy email for testing purposes).
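    The initial iteration above boils down to a small data flow: pick a topic header or comment on a user talk page, fill in a basic form, and email the result to a dummy address. A minimal sketch of what such a report payload might look like is below; the field names and the dummy address are assumptions for illustration, not the extension's actual schema.

```python
import json

# Hypothetical shape of a report submitted in the first iteration;
# every field name here is an assumption, not ReportIncident's real schema.
report = {
    "wiki": "beta",
    "page": "User talk:Testing",
    "reported_item": {"kind": "comment", "id": "c-Example-20231110"},
    "details": "Free-text description entered in the basic form.",
    # A dummy address is used during testing, as the update above notes.
    "deliver_to": "dummy-reports@example.org",
}

# Serialized form that would be handed to the email-delivery step.
payload = json.dumps(report)
```

    Keeping the payload this small matches the "start small" approach: only what is needed to identify the item being reported and to deliver the report.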

    Designs

    The first version of the MTP (minimum testable product) will let a Wikimedian report an abusive topic header or comment on a talk page. Here are the designs.

    Implementing Designs – What’s next

    The Trust and Safety Product team is now working on developing these initial designs as an MTP, a proof of concept that will be deployed to Beta-cluster and tested internally. The purpose of this is to assess technical viability. If everything goes well the next step is to deploy to test.wikimedia.org for usability testing and feedback.

    Looking forward to your feedback about this first iteration on the talk page!

    November 2022

    Our main goal for the past couple of months was to understand the problem space and understand what people are struggling with, what they need, and their expectations around this project. We did this by:

    • Reviewing and synthesizing harassment research, surveys and other relevant documentation (going back to 2013)
    • Having user interviews with volunteers who have experienced or witnessed harassment on Wikipedia
    • Having discussions with staff members, the UCoC drafting committee, and wiki functionaries.

    Our purpose was to identify priorities, scope and a possible product direction.

    Findings and next steps

    Focus on Safety

    The recommendation from the Movement Strategy discussions is to provide for safety and inclusion within the communities. As our ultimate goal is for people to feel safe when participating in Wikimedia projects, we will use this as the guiding principle for what to focus on in the minimum viable product (MVP).

    Project Approach: start small

    There are a lot of things to take into consideration when thinking about this project.

    • Many types of Users: reporter, responder, observer, accused, monitor
    • Many Use cases: doxing, abuse of power, content violations, security breaches, legal issues etc.
    • A lot of Complexities: admins as harassers, off wiki harassment, government interference etc.

    This project will grow and become more complex over time. So we need to start really small, with a very limited scope before we dive into anything more complex.

    Focus on two types of users

    We have identified a few different types of users:

    • Reporters: Users who have experienced harassment, and are filing a report.
    • Responders: Users who receive the report, and want to help.
    • Accused: The users who are named in the report.
    • Monitors: Users who are interested in tracking the progress of reports, to understand the problem better or to ensure that people are treated properly.

    Since we want to start small, we will focus on reporters and responders first.

    MVP Approach (Short-term)

    The way we would like to approach this is to build something small that will help us figure out whether the basic experience actually works.

    Principles of the MVP:

    • We will design for, test, and release on a few pilot wikis
    • Since our goal is to address safety we are going to focus only on 3.1 (Harassment) in the UCoC.
    • We will explore a basic experience for two user groups only:
      • Reporters will understand how to file a report, and feel comfortable enough to complete the report process.
      • Responders will receive clear reports, giving them the information that they need in order to understand the problem.
    • The MVP will connect to current systems as they are (we are not changing any existing processes)

    This experiment should also help us explore and answer some important questions and learn things as we go:

    • Entry points (where reporting starts) – what are they, should we have one or more?
    • Users – do people easily discover the entry point? What do they think will happen when they engage it?
    • Scale – can we do this at scale? Will we overwhelm the responders? etc.
    • Data – can we build something that will help us collect the data we need in order to make decisions? What can we measure to know we’re moving in the right direction?

    What we are not doing (yet)

    The idea is to start with a really small scope, try a few things and learn as we go. Therefore we need to be very clear about what we are not going to do yet:

    • We are not solving for bad admins and/or other complex use cases
    • We are not fixing existing flawed processes
    • Not everything in the UCoC is about safety, but we are focusing only on safety
    • Agnostic reporting – we cannot do this without validating a basic reporting experience works with a specific type of incident

    What happens after the MVP (long-term)

    We have some ideas about v2 and v3 but we want to experiment with an MVP first and see how people feel about it. What we learn now will be useful to make decisions about future versions.

    Some v2 and v3 ideas include:

    • Private reporting (creating a private space for reporters and responders to interact)
    • Escalation (having the ability to route cases to a different entity for further support)

    In order to explore these two ideas we need to ensure the basic/core experience actually works. If it does we will build on top of it.

    Discussion points

    • What do you think about this approach?
    • What scares/concerns you about this project?

    Looking forward to your feedback on the talk page!

    September 2022

    We have been collecting feedback, reading through existing documentation, and conducting interviews in order to better understand the problem space and identify critical questions we need to answer. We are currently synthesising the information we have collected in an effort to start defining a clearer scope for the project. It is a lot of information to go through, so this might take a while; there are so many things we need to learn!