
Incident Reporting System

From Meta, a Wikimedia project coordination wiki


The Foundation wants to improve how people who experience harassment or other abusive conduct report it, in order to make our communities healthier and safer places.

The new Trust and Safety Tools team is tasked with building a Private Incident Reporting System (PIRS). The goal is for people to be able to report harassment and abusive incidents privately and safely.


Reporting and processing of harmful incidents has been a topic of interest for Wikimedia communities for many years. With the new Universal Code of Conduct being set up, it is crucial to also have a discussion about user reporting systems.

The way incidents, misconduct, and policy violations are handled across Wikimedia spaces and projects has developed organically and is highly decentralized.

Each Wikimedia project or community has its own procedures. Reporting and managing reports of such incidents can happen in several ways:

  • on talk pages
  • on noticeboards
  • by email
  • through private off-wiki communications (such as IRC channels, Discord, etc.)

For many people, it is unclear what can be done in such a case: where to go, whom to talk to, how to report, what information to include in a report, how the report is processed, what happens afterwards, etc.

People need to know how and where to report a problem. There is also very little information about what will happen once a report is made and what expectations to have.

Some people do not feel safe reporting harassment or abuse when it occurs, because of the complexity of the reporting process and because of privacy concerns.

Currently there is no standardized way to file reports privately.

Project focus

The main goal of this project, therefore, is to make it easier to address harassment and harmful incidents. We want to ensure the privacy and safety of those reporting. We also want to ensure that reports have the right information and are routed to the appropriate entity that needs to process them, while not putting extra pressure on the ones who do the processing.

The Trust and Safety Tools team is also looking at this incident reporting system as part of a larger incident management ecosystem covering, for example, preventive work such as managing disagreements before they escalate, incident processing, connecting and following up on cases, etc.

Product and Technical Updates

Test the Incident Reporting System Minimum Testable Product in Beta – November 10, 2023

Editors are invited to test an initial Minimum Testable Product (MTP) for the Incident Reporting System.

The Trust & Safety Product team has created a basic product version enabling a user to file a report from the talk page where an incident occurs.

Note: This product version is for learning about filing reports to a private email address (e.g., emergency(_AT_)wikimedia.org or an Admin group). This doesn't cover all scenarios, like reporting to a public noticeboard.

Your feedback is needed to determine if this starting approach is effective.

To test:

1. Visit any talk namespace page on Wikipedia in Beta that contains discussions, and log in. Sample talk pages you can use are available at User talk:Testing and Talk:African Wild Dog.

2. Next, click on the overflow button (vertical ellipsis) near the Reply link of any comment to open the overflow menu and click Report (see slide 1). You can also use the Report link in the Tools menu (see slide 2).

3. Proceed to file a report: fill in the form and submit it. An email will be sent to the Trust and Safety Product team, who will be the only ones to see your report. Please note this is a test, so do not use it to report real incidents.

4. As you test, consider these questions:

  • What do you think about this reporting process? Especially what you like/don’t like about it?
  • If you are familiar with extensions, how would you feel about having this on your wiki as an extension?
  • Which issues have we missed at this initial reporting stage?

5. Following your test, please leave your feedback on the talk page.
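The filing flow in steps 1–3 (a form opened from a talk-page comment whose contents are emailed to a private address) can be sketched roughly as follows. This is a minimal illustration only, not the extension's actual code; the class, field, and function names are assumptions.

```python
from dataclasses import dataclass
from email.message import EmailMessage

# Hypothetical data model for a filed report; the real extension's
# form fields are not specified on this page.
@dataclass
class IncidentReport:
    reporting_user: str    # who files the report
    reported_comment: str  # the talk-page comment being reported
    page_title: str        # talk page where the incident occurred
    details: str           # free-text description from the form

def build_report_email(report: IncidentReport, recipient: str) -> EmailMessage:
    """Package a filed report as an email to a private address
    (e.g. an emergency inbox or an admin group), mirroring the
    MTP behaviour described above."""
    msg = EmailMessage()
    msg["To"] = recipient
    msg["Subject"] = f"Incident report: {report.page_title}"
    msg.set_content(
        f"Reporter: {report.reporting_user}\n"
        f"Reported comment: {report.reported_comment}\n"
        f"Details: {report.details}\n"
    )
    return msg
```

The key design point the MTP tests is exactly this shape: the report never appears on-wiki; it travels directly from the form to a private recipient.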

Update: Sharing incident reporting research findings – September 20, 2023

The Incident Reporting System project has completed research about harassment on selected pilot wikis.

The research, which started in early 2023, studied the Indonesian and Korean Wikipedias to understand harassment, how harassment is reported and how responders to reports go about their work.

The findings of the studies have been published.

Please see the research section of the page for more.

Four Updates on the Incident Reporting Project – July 27, 2023

Hello everyone! For the past couple of months the Trust and Safety Product team has been working on finalising Phase 1 of the Incident Reporting System project.

The purpose of this phase was to define possible product direction and scope of the project with your feedback. We now have a better understanding of what to do next. Read more.

Project Scope & MVP – November 8, 2022

Our main goal for the past couple of months was to understand the problem space and user expectations around this project. The way we would like to approach this is to build something small, a minimum viable product (MVP), that will help us figure out whether the basic experience we are looking at actually works. Read more.


Project phases
Figuring out how to manage incident reporting in the Wikimedia space is not an easy task. There are many risks and many unknowns.

As this is a complex project it needs to be split into multiple iterations and project phases. For each of these phases we will hold one or several cycles of discussions in order to ensure that we are on the right track and that we incorporate community feedback early, before jumping into large chunks of work.

Phase 1

Preliminary research: collect feedback, read through existing documentation.

Conduct interviews in order to better understand the problem space and identify critical questions we need to answer.

Define and discuss possible product direction and scope of project. Identify possible pilot wikis.

At the end of this phase we should have a solid understanding of what we are trying to do.

Phase 2

Create prototypes to illustrate the ideas that came up in Phase 1.

Create a list of possible options for more in-depth consultation and review.

Phase 3

Identify and prioritize the best possible ideas.

Transition to software development and break down the work into Phabricator tickets.

Continue the cycle for the next iterations


15 July 2024 Update: Sharing Incident Reporting System Minimum Viable Product (MVP) User Testing Summary

In March 2024, the Trust & Safety Product team conducted user testing of the Minimum Viable Product (MVP) of the Incident Reporting System to learn if users know where to go to report an emergency incident, and if the user flow makes sense and feels intuitive.

We learned the following:

  • During user testing, all participants found the entry point for reporting an incident, and the current user flow was well understood.
  • There was some confusion over two of the reporting options: “someone might cause self-harm” and “public harm threatening message”.

Two participants also made assumptions about the system being automated. One participant was concerned about automation and wanted a human response, whereas the other participant felt assured by the idea it would check if the abuser had any past history of threats and offences, and delete the offensive comment accordingly. All participants expected a timely response (an average of 2-3 days) after submitting a report. Read more.

21 September 2023 Update: Sharing incident reporting research findings

Research Findings Report on Incident Reporting 2023


In summary, we received valuable insights on the improvements needed for both onwiki and offwiki incident reporting. We also learned more about the communities' needs, which can be used as valuable input for the Incident Reporting tool.

We are keen to share these findings with you; the report has more comprehensive information.

Please leave any feedback and questions on the talk page.

Pre-project research

The following document is a completed review of research on online harassment on Wikimedia projects that the Wikimedia Foundation conducted from 2015 to 2022. In this review we’ve identified major themes, insights, and areas of concern and provided direct links to the literature.

Previous work

The Trust and Safety Tools team is studying previous research and community consultations to inform our work. This is our first step. We revisited the Community Health Initiative's User reporting system proposal and the 2019 User reporting system consultation. We have also been trying to map some of the conflict-resolution flows across wikis to understand how communities currently manage conflicts. Below is a map of the Italian Wikipedia's conflict-resolution flow. It includes notes on opportunities for automation.

On the Italian Wikipedia, there is a three-step policy for conflict resolution. This map visualizes the process and tries to identify automation opportunities for both editors and administrators.

Frequently Asked Questions

Questions and answers from Phase 1 of the project

Q: Is there data available about how many incidents are reported per year?

A: Right now there is not a lot of clear data we can use. There are a couple of reasons for this. First, issues are reported in various ways and those differ from community to community. Capturing that data completely and cleanly is highly complicated and would be very time consuming. Second, the interpretation of issues also differs. Some things that are interpreted as harassment are just wiki business (e.g. deleting a promotional article). Review of harassment may also need cultural or community context. We cannot automate and visualize data or count it objectively. The incident reporting system is an opportunity to solve some of these data needs.

Q: How is harassment being defined?

A: Please see the definitions in the Universal Code of Conduct.

Q: How many staff and volunteers will be needed to support the IRS?

A: Currently the magnitude of the problem is not known, so the number of people needed to support this is not known either. Experimenting with the minimum viable product will provide some insight into the number of people needed to support the IRS.

Q: What is the purpose of the MVP (minimum viable product)?

A: The MVP is an experiment and opportunity to learn. This first experimental work will answer the questions that we have right now. Then results will guide the future plans.

Q: What questions are you trying to answer with the minimum viable product?

A: Here are the questions we need to answer:

  • What kind of reports will people file?
  • How many people will file reports?
  • How many people would we need in order to process them?
  • How big is this problem?
  • Can we get a clearer picture of the magnitude of harassment issues? Can we get some data around the number of reports? Is harassment underreported or overreported?
  • Are people currently not reporting harassment because it doesn’t happen or because they don’t know how?
  • Will this be a lot to handle with our current setup, or not?
  • How many are valid complaints compared to people who don't understand the wiki process? Can we distinguish/filter valid complaints, and filter invalid reports to save volunteer or staff time?
  • Will we receive lots of reports filed by people who are upset that their edits were reverted or their page was deleted? What will we do with them?

Q: How does the Wikimedia movement compare to how other big platforms like Facebook/Reddit handle harassment?

A: While we do not have any identical online affinity groups, the Wikimedia movement is most often connected with Facebook and Reddit in regard to how we handle harassment. What is important to consider is nobody has resolved harassment. Other platforms struggle with content moderation, and often they have paid staff who try to deal with it. Two huge differences between us and Reddit and Facebook are the globally collaborative nature of our projects and how communities work to resolve harassment at the community-level.

Q: Is WMF trying to change existing community processes?

A: Our plan for the IRS is not to change any community process. The goal is to connect to existing processes. The ultimate goals are to:

  • Make it easier for people who experience harassment to get help.
  • Eliminate situations in which people do not report because they don’t know how to report harassment.
  • Ensure harassment reports reach the right entities that handle them per local community processes.
  • Ensure responders receive good reports and redirect unfounded complaints and issues to be handled elsewhere.
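As a rough illustration of the last two goals (reports reaching the right entities, and unfounded complaints being redirected elsewhere), a triage step might look like the sketch below. The category names and destinations are purely hypothetical assumptions for illustration; the actual IRS routing design is not specified on this page.

```python
# Hypothetical routing table: each report category maps to the entity
# that handles it under local community processes. These categories and
# destinations are illustrative, not the real IRS design.
ROUTES = {
    "emergency": "emergency@wikimedia.org",      # threats of harm go to the Foundation
    "harassment": "local-admin-group",           # handled per local community process
    "content-dispute": "community-noticeboard",  # not harassment: redirect elsewhere
}

def route_report(category: str) -> str:
    """Return the destination for a report, falling back to the local
    community noticeboard for categories the system does not recognize."""
    return ROUTES.get(category, "community-noticeboard")
```

A fallback destination reflects the stated goal of redirecting complaints that do not belong in the private reporting channel, so responders receive only reports they can act on.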