Trust and Safety


Trust and Safety (T&S), formerly known as Community Advocacy (CA or SuSa),[1] identifies, builds, and, as appropriate, staffs processes that keep our users safe. The team designs, develops, and executes a strategy that integrates legal, product, engineering, research, and learning & evaluation to proactively mitigate risk and to manage the overall safety of our online and offline communities when incidents happen. Trust and Safety is part of the Community Resilience & Sustainability wing of the Legal Department. We aim to provide compassionate, credible, and comprehensive Trust and Safety services to the Foundation and the volunteer communities and affiliates it supports, though much of our time is also spent "fire-fighting" urgent incidents. For more information on the team, see our Overview.

What we do

The Trust and Safety team supports staff, the public, and volunteers in our community through approximately 11 workflows in three broad areas. The team is composed of two sub-teams: Policy and Operations. You can find more details under Programs and Processes.

Trust and Safety

The Wikimedia Foundation aims to defer to local and global community processes to govern on-wiki interactions. However, at times, we must step in to protect the safety and integrity of our users, our contributors, and the public. We support a healthy environment on our projects through several work areas. Among other measures, we receive and handle reports of major safety issues on Wikimedia projects, including suicide threats, threats of violence, and child pornography. We also own the policies regarding Wikimedia Foundation bans of users from the projects and from Foundation-funded or supported events, and we work with other Foundation teams to address concerns about user privacy and freedom that do not necessarily rise to the level of bans.

As part of the Foundation’s commitment to respect community autonomy, the Trust & Safety team does not handle general community or community-member disputes that may be addressed through community processes, nor does it serve as an appeal venue for community-made policies and decisions. While we are happy to assist community members in need of help, often that help consists of directing the person to the right community venue for solving their problem.

Flowchart detailing the investigation workflow when evaluating a request for an Office action

Regular workflows include:

  • Evaluation and reporting:
    • General Trust & Safety inquiries and reports of abuse can be submitted through: ca(_AT_)wikimedia.org, in line with the Office actions policy.
    • Threats of imminent physical harm can be submitted to: emergency(_AT_)wikimedia.org, according to the Threats of harm protocol.
    • Assessment of child protection concerns, including reports of child pornography, filed through: legal-reports(_AT_)wikimedia.org, as per Child protection.
  • Global & Event bans: Requests, inquiries, investigations & maintenance. (See also log of global bans by T&S.)
  • AffCom support: Assist the Affiliations Committee as appropriate in relation to Trust and Safety matters
  • Identification and access rights for community: see access to nonpublic personal data noticeboard
  • T&S database management: mandatory and best-practice record-keeping
  • Trust and Safety Disinformation: The Trust and Safety Disinformation team focuses on supporting communities in identifying and countering disinformation campaigns on Wikimedia Foundation platforms. The team has supported the evaluation of long-standing community concerns about the disinformation issue on Croatian Wikipedia and collaborated closely with English Wikipedia functionaries ahead of the 2020 US presidential election following the VP incident study.


Direct community support

In close collaboration with Product and other parts of the Legal department, the team leads the Anti-Harassment Program.

Other regular workflows include:

Internal support

We provide guidance, advice, and support to Foundation staff, the Board, and committees. We routinely assist staff with community- and content-related concerns, including processing DMCA takedown and notification requirements and, where necessary, responding to search warrants and legally valid subpoenas. We manage requests for advanced user rights required for staff members to do their work by assessing needs and liaising with the stewards.

Regular workflows include:

  • Supporting Executives: the Executive Director, the Executive Office, Jimmy Wales.
  • Liaison work: with the election committee, the cross-wiki Ombudsman Commission, the Code of Conduct Committee, and other WMF teams.
  • Supporting the staff: Advanced privileges and user rights for staff
  • Legal support: DMCA takedown and notification requirements, search warrant and subpoena compliance

Office actions workflow

Flowchart detailing how to report a Wikimedia-related incident
More information: Office actions

Part of the Trust and Safety team's responsibility within the Wikimedia Foundation is to undertake office actions. These actions are rare and performed pursuant to the Terms of Use, typically as a last resort. Office actions are used to handle privacy violations, child protection issues, copyright infringement, systematic harassment, and other violations of the Terms of Use that cannot be handled through community-governed processes.

The process leading up to an office action varies considerably based on the action and the circumstances surrounding it. The strongest actions in common use are those taken against users of the websites, typically in the form of global or event bans. These actions are the result of user conduct investigations undertaken by T&S Specialists, which go through a rigorous review cycle as documented in the investigation workflow flowchart above.

Other office actions can include deletions of illegal material, typically sensitive images of minors that violate the laws of the United States. T&S also performs deletions to satisfy the Foundation's DMCA Policy, an archive of which is maintained on the Foundation wiki.

Appeals

In September 2020, a Case Review Committee was created to allow directly involved community members to request review, by a community committee, of Trust & Safety behavioral investigation outcomes. The committee is equipped to review certain office actions on appeal from individuals directly involved in the case (as the requesting or sanctioned party). For more information, see Office actions#Appeals. This process will remain in place until a permanent one is created through the Universal Code of Conduct conversations in 2021.

References