Incident Reporting System/Updates/tr

From Meta, a Wikimedia project coordination wiki


August 2025: Conclusions from the conversations and next steps

As we previously posted, we planned conversations with a few groups (Stewards, Arbitration Committees, and U4C members) to hear their comments on the development of the Incident Reporting System. Based on what we heard, we are designing a flow for reporting incidents that are not emergencies.

The new flow needs to be acceptable both legally and to communities, and designed to work across all wikis. It will reflect and complement, rather than duplicate or disrupt, the existing practices of different communities. Consequently, we have decided to make it configurable via Community Configuration.

We are also exploring an improved list of violation types, and customizable "Get Support" pages based on the type of incident. In parallel, we are researching different wiki communities to better understand existing practices, so that the default configuration reflects typical needs and addresses compliance requirements.

Lastly, we wanted to note that the former Trust and Safety Product team has been merged with the former Product Security team to form Product Safety and Integrity. The people previously involved in the IRS project have not changed their roles, including the owner (Madalina). However, thanks to this merger, Madalina now has support from Eric (the owner of the broader Safety and Security area) and Szymon (the Movement Communications person for the Safety and Security area). We encourage you to contact any of us if you'd like to talk about the IRS. 🖖

June 2025: Stakeholder conversations

To ensure that future development is aligned with community needs and what is feasible, we have paused deployments, and will be conducting conversations with key functionary groups (Stewards, ArbComs, and U4C members) to address concerns and gather input on the IRS's further development.

We will have structured calls with the groups in May-June 2025, with a focus on identifying major concerns and achieving a shared vision for the IRS.

December 2024: Incident Notification System is now available on Portuguese Wikipedia

Incident Reporting System reporting menu on Portuguese Wikipedia.

We implemented the Incident Reporting System on Portuguese Wikipedia on Wednesday, December 11, 2024. Users there can quickly report emergency incidents directly to the Wikimedia Foundation Trust and Safety team, or be directed to submit community-managed reports to the appropriate community pages, all from a single, centralized platform.

To report an incident, users can access the reporting functionality in the Discussion Tools menu by clicking the overflow menu (the ellipsis, "...") next to each comment and then clicking Report.

If you have any comments, please share them on the project talk page.

December 2024: Upcoming pilot

We have released the first version of the Incident Reporting System which is now available on Beta. Visit a talk page, choose any comment and click the menu to access the reporting link.

In response to your comments, we have made some updates to the designs. You can check the latest designs here:

Emergency workflow

Non-emergency workflow

We will be enabling the Incident Reporting System on testwiki on December 9 for another round of testing and bug fixing. If everything goes well, we will enable the system on Portuguese Wikipedia, likely in the week of December 10.

We will share another update once we are live on testwiki and then on Portuguese Wikipedia.

November 2024: Improvements to the features

Our next step will be to develop a version that we can test on a few pilot wikis. We have made improvements to the design:

  • Emergency incidents - incidents relating to immediate threats of physical harm. These need to be handled by the Wikimedia Foundation Trust and Safety team, which responds to emergency cases.
  • Non-emergency incidents - incidents that are not immediate threats of harm, for example bullying, sexual harassment, and other unacceptable user behavior. These are handled through local community processes.

The Emergency flow will direct users to file a report that will be sent to the Trust and Safety team.

The Non-emergency flow will direct users to reach out to their local community for support, as outlined in community policies. This will be done through a “Get support” page that will contain links and information specific to each community. The intention is to have configuration options on this page so that each local community can add the relevant links as necessary.
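As a rough sketch of how such per-community configuration could behave, the "Get support" page might merge a community's own links with a default set. The setting name, defaults, and helper below are invented for illustration and are not the actual Community Configuration schema:

```python
# Hypothetical sketch: resolving the links shown on a "Get support" page.
# The setting name "getSupportLinks" and the default entries are assumptions.

DEFAULT_SUPPORT_LINKS = [
    {"label": "Administrators' noticeboard", "target": "Project:Administrators' noticeboard"},
]

def resolve_support_links(community_config):
    """Return the community's configured links, or the defaults if none are set."""
    links = community_config.get("getSupportLinks")
    return links if links else DEFAULT_SUPPORT_LINKS

# Example: a wiki that configured its own link vs. one that did not.
ptwiki = {"getSupportLinks": [{"label": "Pedidos a administradores", "target": "Wikipédia:Pedidos"}]}
assert resolve_support_links(ptwiki)[0]["label"] == "Pedidos a administradores"
assert resolve_support_links({}) == DEFAULT_SUPPORT_LINKS
```

The design choice sketched here is a simple fallback: a community that configures nothing still gets a working default page.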

We have some screenshots to demonstrate how this might work. For the next deployment, the main focus is to test the emergency flow in production.

Emergency flow:

Non-emergency user flow:

We’d love to hear your thoughts on the current designs! Please comment on the discussion page.

July 2024: User testing summary

In March 2024, the Trust & Safety Product team conducted user testing of the Minimum Viable Product (MVP) of the Incident Reporting System to learn if users know where to go to report an emergency incident, and if the flow makes sense and feels intuitive.

We learned the following:

  • During user testing, all participants found the entry point for reporting an incident, and the current flow was well understood.
  • There was some confusion over two of the reporting options: “someone might cause self-harm” and “public harm threatening message”.
  • Two participants also assumed the system was automated. One was concerned about automation and wanted a human response, whereas the other felt reassured by the idea that it would check whether the abuser had any past history of threats and offences, and delete the offensive comment accordingly.
  • All participants expected a timely response (on average, 2-3 days) after submitting a report.

Test the Incident Reporting System Minimum Testable Product on Beta – 10 November 2023

Editors are invited to carry out the first test of the IRS update on Beta. The Trust & Safety Product team has built a basic version of the product that lets users report an incident from the talk page where it occurred. Note: this version covers receiving reports sent to a private email address (e.g., emergency(_AT_)wikimedia.org or a stewards' group). It does not cover every scenario, such as reporting on a public noticeboard. Your feedback is needed to determine whether this approach to initiating a report is effective.

To test:

1. Go to any Wikipedia article on Beta that has a discussion. Sample talk pages you can log in and use are available at User talk:Testing and Talk:African Wild Dog.

2. Next, click the Overflow button (the vertical ellipsis) next to the Reply link of any comment to open the overflow menu, then click Report (see slide 1). You can also use the Report link in the Tools menu (see slide 2).

3. Fill in and submit the form to initiate a report. An email will be sent to the Trust and Safety Product team, the only people who will see your report. Remember that this is a test; do not use it to report real incidents.

4. While testing, consider the following:

  • What do you think about this reporting process? What in particular do you like or dislike?
  • If you are familiar with extensions, what would you think about having this as an extension on your wiki?
  • What is missing at this first reporting stage?

5. Please leave your feedback on the talk page after testing.

If you cannot find the Overflow menu or the Report links, or cannot submit the form, please check that:

  • You are logged in.
  • Your Beta account's email address is confirmed.
  • More than 3 hours have passed since you created your account and made your first edit.
  • You have enabled the DiscussionTools setting, since the IRS works with DiscussionTools.

If DiscussionTools does not load, a report can still be made from the Tools menu. If you cannot make a second report, note that there is a limit of 1 report per day for unconfirmed users and 5 reports per day for confirmed users. Such limits during testing help reduce the possibility of malicious users abusing the system.
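The per-day limits (1 report per day for unconfirmed users, 5 for confirmed users) can be sketched as a simple counter keyed by user and day. The class and storage here are invented for illustration and are not the extension's actual implementation:

```python
from collections import defaultdict
from datetime import date

# Limits stated above: 1 report/day for unconfirmed users, 5 for confirmed.
DAILY_LIMITS = {"unconfirmed": 1, "confirmed": 5}

class ReportRateLimiter:
    """Hypothetical per-user, per-day report counter."""

    def __init__(self):
        self._counts = defaultdict(int)  # (username, day) -> reports filed

    def try_report(self, username, confirmed, day=None):
        """Record a report and return True, or return False if over the limit."""
        key = (username, day or date.today())
        limit = DAILY_LIMITS["confirmed" if confirmed else "unconfirmed"]
        if self._counts[key] >= limit:
            return False
        self._counts[key] += 1
        return True
```

With this sketch, an unconfirmed user's second report on the same day is rejected, while a confirmed user can file up to five.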

Sharing incident reporting research findings – September 2023

Research Findings Report on Incident Reporting 2023

We have completed research about harassment on selected pilot wikis. The research, which started in early 2023, studied the Indonesian and Korean Wikipedias to understand harassment, how harassment is reported and how responders to reports go about their work. In summary, we received valuable insights on the improvements needed for both onwiki and offwiki incident reporting. We also learned more about the communities' needs, which can be used as valuable input for the Incident Reporting tool.

Four updates on the Incident Reporting System project – August 2023

Greetings, everyone! Over the past few months, the Trust and Safety Tools team has worked on finalizing phase 1 of the Incident Reporting System.

The goal of this phase was to determine a solid product direction and the scope of the project. We now have a firmer understanding of what we will do next.

1. We are renaming the project to Incident Reporting System

From now on, the project will be known as the Incident Reporting System; the word "Private" has been dropped.

In the context of harassment and the Universal Code of Conduct, the word "Private" was meant to signify respecting the privacy of community members and keeping them safe. It did not mean that every stage of reporting would be private.

We received feedback that this term is ambiguous and can be hard to translate into other languages. We are therefore removing it.

2. We have findings from research conducted in some pilot communities

We are conducting research on harassment in the Indonesian and Korean Wikipedia communities. With their feedback, we were able to document how users in these communities report harassment and to create information flow diagrams. To the best of our knowledge, these diagrams represent how community members report incidents of harassment and abuse on both wikis.

If you have any feedback about these diagrams, you are welcome to comment on the discussion page.

3. We have updated the project summary

What we want to build next

  • The Trust and Safety Tools team will develop an extension for reporting incidents/violations of the Universal Code of Conduct.
  • The extension is intended to be configurable. Communities must be able to adapt it to local processes.
  • The extension's name is ReportIncident.
  • The purpose of the extension is to:
    • Facilitate the filing of reports about various types of UCoC violations by Wikimedians
    • Route those reports to the appropriate entities that will need to process them
    • Facilitate the filing of reliable reports and filter out/redirect the unactionable ones.
    • Facilitate the filing of both private (e.g. to an email address) as well as public (e.g. on-wiki to an Admin noticeboard) reports according to local processes.
  • The extension is intended to be incident agnostic (able to support the reporting of different types of incidents)
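The routing goal above, private email reports versus public on-wiki reports, could be sketched as follows. The report fields, the venue default, and the email address are placeholders for illustration, not the extension's real behavior:

```python
# Hypothetical routing sketch: emergency reports go to a private email
# queue, everything else to a public community venue per local process.

EMERGENCY_EMAIL = "emergency@example.org"  # placeholder, not the real address

def route_report(report):
    """Return the delivery channel and destination for a report."""
    if report.get("emergency"):
        return {"channel": "email", "to": EMERGENCY_EMAIL}
    # Non-emergency reports follow local community processes.
    venue = report.get("local_venue", "Project:Administrators' noticeboard")
    return {"channel": "noticeboard", "to": venue}
```

For example, `route_report({"emergency": True})` would select the private email channel, while a report from a wiki that configured its own venue would be pointed at that venue.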

What we won’t be doing

  • The system is intended for reporting and routing only; we will not be processing reports.
  • The system is intended for incidents related to UCoC violations. We will not use it for other types of requests (such as technical support requests, account access, etc.)
  • The system is NOT meant to replace existing processes on wikis. Our purpose is to make it easier to follow existing processes.

4. We have the first iteration of the reporting extension, ReportIncident

In November last year, we talked about how we should start small with a very limited scope, so for our first iteration we decided to create a very basic experience.

What’s included in this initial iteration?

  • Ability to report from User Talk page
    • Report a topic header
    • Report a comment
  • Ability to complete a basic form and submit
  • The report will be sent to an email address (a dummy email for testing purposes).

Designs

The first version of the MTP (minimum testable product) will let a Wikimedian report an abusive topic header or comment on a talk page. Here are the designs.

Implementing Designs – What’s next

The Trust and Safety Product team is now working on developing these initial designs as an MTP, a proof of concept that will be deployed to Beta-cluster and tested internally. The purpose of this is to assess technical viability. If everything goes well the next step is to deploy to test.wikimedia.org for usability testing and feedback.

Looking forward to your feedback about this first iteration on the talk page!

November 2022

Our main goal for the past couple of months was to understand the problem space and understand what people are struggling with, what they need, and their expectations around this project. We did this by:

  • Reviewing and synthesizing harassment research, surveys and other relevant documentation (going back to 2013)
  • Having user interviews with volunteers who have experienced or witnessed harassment on Wikipedia
  • Having discussions with Staff members, UCoC drafting committee and wiki functionaries.

Our purpose was to identify priorities, scope and a possible product direction.

Findings and next steps

Focus on Safety

The recommendation from the Movement Strategy discussions is to provide for safety and inclusion within the communities. As our ultimate goal is for people to feel safe when participating in Wikimedia projects, we will use this as the guiding principle for what to focus on in the minimum viable product (MVP).

Project Approach: start small

There are a lot of things to take into consideration when thinking about this project.

  • Many types of users: reporter, responder, observer, accused, monitor
  • Many use cases: doxing, abuse of power, content violations, security breaches, legal issues, etc.
  • Many complexities: admins as harassers, off-wiki harassment, government interference, etc.

This project will grow and become more complex over time. So we need to start really small, with a very limited scope before we dive into anything more complex.

Focus on two types of users

We have identified a few different types of users:

  • Reporters: Users who have experienced harassment, and are filing a report.
  • Responders: Users who receive the report, and want to help.
  • Accused: The users who are named in the report.
  • Monitor: People who are interested in tracking the progress of reports, to understand the problem better or to ensure that people are treated properly.

Since we want to start small, we will focus on reporters and responders first.

MVP Approach (Short-term)

The way we would like to approach this is to build something small that will help us figure out whether the basic experience actually works.

Principles of the MVP:

  • We will design for, test, and release on a few pilot wikis
  • Since our goal is to address safety we are going to focus only on 3.1 (Harassment) in the UCoC.
  • We will explore a basic experience for two user groups only:
    • Reporters will understand how to file a report, and feel comfortable enough to complete the report process.
    • Responders will receive clear reports, giving them the information that they need in order to understand the problem.
  • The MVP will connect to current systems as they are (we are not changing any existing processes)

This experiment should also help us explore and answer some important questions and learn things as we go:

  • Entry points (where reporting starts) – what are they, should we have one or more?
  • Users – do people easily discover the entry point? What do they think will happen when they engage it?
  • Scale – can we do this at scale? Will we overwhelm the responders? etc.
  • Data – can we build something that will help us collect the data we need in order to make decisions? What can we measure to know we’re moving in the right direction?

What we are not doing (yet)

The idea is to start with a really small scope, try a few things and learn as we go. Therefore we need to be very clear about what we are not going to do yet:

  • We are not solving for bad admins and/or other complex use cases
  • We are not fixing existing flawed processes
  • Not everything in the UCoC is about safety but we are focusing only on safety
  • Agnostic reporting – we cannot do this without validating that a basic reporting experience works with a specific type of incident

What happens after the MVP (long-term)

We have some ideas about v2 and v3 but we want to experiment with an MVP first and see how people feel about it. What we learn now will be useful to make decisions about future versions.

Some v2 and v3 ideas include:

  • Private reporting (creating a private space for reporters and responders to interact)
  • Escalation (having the ability to route cases to a different entity for further support)

In order to explore these two ideas we need to ensure the basic/core experience actually works. If it does we will build on top of it.

Discussion points

  • What do you think about this approach?
  • What scares/concerns you about this project?

Looking forward to your feedback on the talk page!

September 2022

We have been collecting feedback, reading through existing documentation, and conducting interviews in order to better understand the problem space and identify the critical questions we need to answer. We are currently synthesising the information we have collected in an effort to define a clearer scope for the project. It is a lot of information to go through, so this might take a while; there are so many things we need to learn!