Trust and Safety

From Meta, a Wikimedia project coordination wiki

Evaluation and reporting

Phabricator project: Trust and Safety

Trust and Safety (T&S), formerly known as Community Advocacy, CA, or SuSa,[1] identifies, builds, and, as needed, sustains the processes that keep our users safe; designs, develops, and executes a strategy that unites legal, product, engineering, research, learning, and evaluation solutions to proactively mitigate risk; and manages the overall safety of our online and offline communities when incidents occur. Trust and Safety is part of the Community Resilience and Sustainability unit of the Legal department. We aspire to provide sensitive, reliable, and comprehensive trust and safety services to the Foundation and the volunteer communities and affiliates it supports, though much of our time is also spent simply "putting out fires". To learn more about the team, see our Overview.

What we do

The Trust and Safety team supports staff, the public, and volunteers in our community through approximately 11 workflows across three broad areas. The team consists of two sub-teams: Policy and Operations. More information can be found in the Programs and Processes section.

Trust and Safety

The Wikimedia Foundation aims to defer to local and global community processes to govern on-wiki interactions. However, at times, we must step in to protect the safety and integrity of our users, our contributors, and the public. We support a healthy environment on our projects through several work areas. Among other measures, we receive and handle reports of major safety issues on Wikimedia projects, including suicide threats, threats of violence, and child pornography. We also own the policies regarding Wikimedia Foundation bans of users from the projects and from Foundation-funded or supported events, and we work with other Foundation teams to address concerns about user privacy and freedom that do not necessarily rise to the level of bans.

As a part of the Foundation’s commitment to respect community autonomy, the Trust & Safety team does not handle general community or community-member disputes that may be addressed through community processes, nor does it serve as an appeal venue for community-made policies and decisions. While we are happy to assist community members in need of help, often that help consists of directing the person to the right community venue to solve their problem.

Flowchart detailing the investigation workflow when evaluating a request for an Office action

Regular workflows include:

  • Evaluation and reporting:
    • General Trust & Safety inquiries and reports of abuse can be submitted through: ca(_AT_)wikimedia.org, in line with the Office actions policy.
    • Threats of imminent physical harm can be submitted to: emergency(_AT_)wikimedia.org, according to the Threats of harm protocol.
  • Assessment of child protection concerns, including reports of child pornography, filed through: legal-reports(_AT_)wikimedia.org, as per Child protection.
  • Global & Event bans: Requests, inquiries, investigations & maintenance. (See also log of global bans by T&S.)
  • AffCom support: Assist the Affiliations Committee as appropriate in relation to Trust and Safety matters
  • Identification and access rights for community: see access to nonpublic personal data noticeboard
  • T&S database management: mandatory and best-practice record-keeping
  • Trust and Safety Disinformation: The Trust and Safety Disinformation team focuses on supporting communities in identifying and countering disinformation campaigns on Wikimedia Foundation platforms. The team has supported the evaluation of long-standing community concerns about disinformation on Croatian Wikipedia and collaborated closely with English Wikipedia functionaries ahead of the 2020 US presidential election following the VP incident study.

Direct community support

In close collaboration with Product and other parts of the Legal department, the team leads the Anti-Harassment Program.

Other regular workflows include:

Internal support

We provide guidance, advice and support to Foundation staff, the Board, and committees. We assist staff routinely with community and content related concerns, including processing DMCA takedown and notification requirements and, where necessary, responding to search warrants and legally valid subpoenas. We manage requests for advanced user rights required for staff members to do their work by assessing needs and liaising with the stewards.

Regular workflows include:

  • Supporting Executives: the Executive Director, the Executive Office, Jimmy Wales.
  • Liaison work: to the election committee, the cross-wiki Ombuds Commission, the Code of Conduct Committee, and other WMF teams.
  • Supporting the staff: Advanced privileges and user rights for staff
  • Legal support: DMCA takedown and notification requirements, search warrant and subpoena compliance

Office actions workflow

Flowchart detailing how to report a Wikimedia-related incident
More information: Office actions

Part of the Trust and Safety team's responsibility within the Wikimedia Foundation is to undertake office actions. These actions are rare, and performed pursuant to the Terms of Use, typically as a last resort. Office actions are used to handle privacy violations, child protection, copyright infringement, systematic harassment, and other violations of the Terms of Use that cannot be handled through community-governed processes.

The process leading up to an office action varies considerably based on the action and the circumstances surrounding it. The strongest actions in common use are those taken against users of the websites, typically in the form of global or event bans. These actions are the result of user conduct investigations undertaken by T&S Specialists, which go through a rigorous review cycle as documented in the flowchart to the right.

Other office actions can include deletions of illegal material. This typically consists of sensitive images of minors which violate the laws of the United States. T&S also performs deletions to satisfy the Foundation's DMCA Policy, an archive of which is maintained on Foundation wiki.


In September 2020, a case review committee was created to allow directly involved community members to request review, by a community committee, of Trust & Safety behavioral investigation outcomes. This committee is equipped to review certain office actions on appeal from individuals directly involved in the case (as the requesting or sanctioned party). For more information, see Office actions#Appeals. This will remain in place until a permanent process is created through the Universal Code of Conduct conversations in 2021.