Trust and Safety/Overview
An overview of Trust and Safety ("T&S") at the Foundation and Wikimedia's approach relative to other platforms.
The first job of T&S is to keep the online volunteer communities on our platform safe. As a nonprofit organization with constrained resources, the Foundation traditionally relies on peer-elected volunteers ("functionaries") to do most of the trust and safety work for their own project communities, work for which other platforms hire hundreds of staff and contractors. This work includes fighting spam and vandalism, resolving local community conflicts, and content moderation such as editorial issues and the deletion of copyright violations. Wikimedia relies on volunteer functionaries and staff with T&S-relevant roles and user rights: more than 6,500 volunteer admins; 12 Arbitration Committees ("ArbComs") with three to ten members each; specialist roles building on adminship (189 local checkusers, 105 local oversighters, and 39 stewards); and the staff team (ten FTE). However, 337 of our 889 projects (38%) have no local admins and depend entirely on steward, global admin, and Foundation services. Roughly 85% of Wikipedia wikis (representing less than 4% of Wikipedia editors) have little to no functional local conduct policies, which are essential to self-governance.
Wikimedia Foundation T&S Operations: T&S routinely handles workflows tied to the Foundation's role as an internet service provider (ISP), including close collaboration with Legal on statutory child protection and DMCA obligations, as well as personality rights violations involving people with Wikipedia articles. It also leads the organization's law enforcement collaboration, covering issues like threats to life, threats against Wikimedia offices, POTUS threats, school shootings, and terrorism threats published on our platform, as well as persecution of volunteers by hostile governments. Other core work tied to the ISP role includes legal case support, partnering with ArbComs and the stewards, and administration of sensitive volunteer functionary and staff access rights on the platform.
The team handles four types of investigations: a) long-term volunteer harassment and conduct investigations into volunteers that are beyond the capacity of volunteer functionaries or affiliates. This category also includes mediation support for the Affiliations Committee (AffCom); b) HR investigations into online workplace harassment targeting staff; c) handling sustained security and privacy challenges in close collaboration with Legal and Security; and d) supporting staff investigations and the vetting of all Foundation hires and appointments in collaboration with Talent & Culture and Legal.
Wikimedia Foundation T&S Policy: T&S maintains a standing subteam (4 FTE) that focuses on Foundation initiatives to structurally improve community health under the Board's three investment priorities, outlined in November 2017 when it adopted the Wikimedia 2030 strategy. These programs include the Universal Code of Conduct (UCoC), digital security training, building preventive anti-harassment resources and event safety material for offline spaces, and functionary support for the volunteer Technical Code of Conduct Committee, which serves Wikimedia's technical on- and offline spaces.
Wikimedia Foundation T&S Product: T&S partners on key organizational initiatives such as user privacy, security, and data management; provides technical program management for the Anti-Harassment Program; and routinely advocates for volunteer functionaries' technical needs.
Wikimedia traditionally falls under the community-reliant branch of the three standard approaches to trust and safety. This contrasts with the industrial approach taken by platforms such as Facebook, Twitter, and YouTube, which outsource trust and safety work to thousands of contractors or employees who assess content against minutely defined checklists, and with the artisanal approach taken by companies such as Medium and Vimeo, where all such work is done by employees with little automation and more latitude for interpretation.
There are challenges and advantages to this model that could be explored at length; here we focus on several worth highlighting:
- Our model recognizes and respects the ability and right of volunteers to shape projects in ways that work best in their context.
- We are partners in a complex ecosystem. Staff must protect that ecosystem carefully, intervening where necessary and deferring to the local level when volunteers can handle a matter in a way that adequately protects them, other users, and our platform.
- In order for our volunteers to exercise self-governance, they need robust product and engineering support. Wikimedia must balance its resources to enhance the platform for readers and content contributors while also ensuring that functionaries and T&S staff have effective, efficient tools, as mandated by the Board's May 2020 statement.
- In order for our volunteers to exercise self-governance, they need to understand the framework of Foundation policies enabling self-governance, which are often not issued in their language and are frequently translated by volunteers. Wikimedia is not yet where it needs to be in terms of online training infrastructure and modules. We also have not yet found an effective system for measuring how policies are interpreted and applied across our hundreds of language projects. These are both gaps we continue to work to address.