User:CLo (WMF)/enwiki reporting system research

Written Nov 2018.

As we begin work on building a new reporting system, it is important to understand what kinds of reporting systems and paths already exist on-wiki. It will also greatly help us to have a broad understanding of current best practices and knowledge on reporting systems, online anti-harassment, and online crisis management.

Our research currently focuses on English Wikipedia. Formal reporting systems are generally only necessary at scale, and because of English Wikipedia's age and size, it has developed many different reporting systems, making it a good case study.

Terminology

Involved users

For the purposes of reporting system work, I have identified three main categories of users who might engage with any potential reporting system.

Moderators

These are users who receive, deliberate on, and act upon reports. Moderators may or may not be administrators or members of other user groups, and the position is not always a named one. An administrator who receives and reads noticeboard reports, and concludes them by administering formal sanctions such as blocks, is a moderator. Equally, an influential editor with no user rights beyond those of any other editor may act as a moderator on a smaller project, receiving informal reports and acting upon them by engaging in conversation to reach a conclusion.

Reporters

These are users who bring reports to moderators. These users can be, but are not necessarily, the direct targets of misconduct or abuse; they could be bystanders, or be accused of misconduct themselves.

Accused users

These are users who have been named as engaging in misconduct or abusive behaviours. Accused users may pull in friends or supporters if reported in a publicly-accessible space.

These categories are not mutually exclusive. A moderator might be a target of harassment, making them a reporter, yet also be accused of misconduct and abuse of their power by others. Reporters bringing a case to a formal noticeboard may have their own behaviour scrutinized, becoming accused of misconduct themselves in the process. Part of the difficulty of designing a reporting system will be accounting for this overlap when thinking about who might engage with it.

Additionally, these groups do not exist in one-to-one proportion. The number of moderators will be far smaller than the number of accused users or reporters; one reporter may report multiple users, and an accused user may be reported by multiple people. Any given report may also be handled by any number of moderators.
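To make this overlap concrete, here is a minimal sketch of a data model in which roles are combinable flags and a report relates many users to many users. It is illustrative only: the names and structure are hypothetical, not drawn from any existing Wikimedia system.

```python
from dataclasses import dataclass, field
from enum import Flag, auto

class Role(Flag):
    # Roles are flags rather than exclusive categories:
    # a single user can hold several at once.
    MODERATOR = auto()
    REPORTER = auto()
    ACCUSED = auto()

@dataclass
class User:
    name: str
    roles: Role = Role(0)  # no roles by default

@dataclass
class Report:
    # Many-to-many relationships: a report may be raised by several
    # reporters, name several accused users, and be handled by any
    # number of moderators.
    reporters: list["User"] = field(default_factory=list)
    accused: list["User"] = field(default_factory=list)
    moderators: list["User"] = field(default_factory=list)

# A moderator who is harassed becomes a reporter; if others then contest
# their use of the tools, they are simultaneously an accused user.
alice = User("Alice", Role.MODERATOR | Role.REPORTER | Role.ACCUSED)
assert Role.REPORTER in alice.roles
```

Nothing here is prescriptive; it simply restates the point that role overlap and many-to-many report handling would need to be first-class in any design.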

Formal and informal systems

Current reporting systems on Wikimedia projects can be categorized as formal or informal. Formal systems are codified and supported by policy, code, or both, and are designed to facilitate reporting. Examples of formal reporting systems include noticeboards and ArbCom. Informal systems are the various networks of communication, relationships, code and policy that can be repurposed for reporting. Examples of informal systems include private off-wiki correspondence, or the use of project-affiliated social spaces (e.g. IRC) to report misconduct.

Formal systems can be useful because they provide an obvious, structured way to bring a case to a moderator's attention. Ideally, formal reporting systems ease the burden of reporting while upholding important community values in how reports are handled, such as maintaining good faith toward involved parties and committing to transparency in how reports are processed and documented. However, these systems can fail because of their rigidity and slower speed, making them unsuitable for emergencies, acute abuse, or particularly complex cases. The specific way a formal reporting system is set up on a given project may also leave it open to abuse or manipulation.

Informal systems are much faster and more flexible, as they leverage existing relationships to reach moderators who can act on an informal report. Ideally, informal systems allow reporters to bring cases to moderator attention discreetly, especially complex cases that require speed and discretion. Because they are not bound by the structures of formal systems, they can accommodate more edge cases, involve different methods of mediation, and reach more nuanced resolutions. However, informal systems are opaque to those without extensive knowledge of, or deep involvement in, the wiki community. Their existence can also make both involved parties and observers uneasy: because they exist, by definition, outside the "official" system, they are assumed to be less legitimate and not governed by the same community values.

Formal and informal systems are not mutually exclusive, and ideally complement each other's strengths and weaknesses. The balance of formal-to-informal reporting system use will differ by project, based on its policies, available moderator labour, and values. Because misconduct is a social issue, I am extremely skeptical that informal systems will ever disappear, or that they should. Formalizing previously informal networks can also be costly, causing these reporting paths to lose much of the flexibility and speed that made them advantageous in the first place.

Lastly, whether a system is formal or informal, it ultimately hinges on trust: a system must be trusted by all parties in order to function. If moderators do not trust the system, they will not engage with it, and no reports will be acted upon. If reporters do not trust a system, they will not raise reports through it, rendering it useless. If accused users do not trust the system, no outcome rendered through it will be understood as legitimate, making it more likely that accused users will fight whatever outcomes the system produces, causing further harm to all involved rather than settling the case as we would hope.

Additional considerations

In addition to these reporting systems, there are three major considerations at play: policy, values and labour. Policy comprises the social and operational rules put into place on any given project, such as English Wikipedia's "standard" 31-hour block length used for a variety of infractions. These are the guidelines, best practices, procedures, and rules governing engagement with reporting systems, both formal and informal, on a project. They may be clearly documented or only loosely defined norms; policy does not have to be written down to be policy.

Values are the moral and social values prized by a given community, which shape the design of reporting systems. These values also provide a frame through which the community understands the authority and legitimacy of a reporting system in all aspects, from its perceived efficacy to the validity of the outcomes it generates. For example, the Wikimedia community highly prizes transparency. For reporting systems, this is interpreted as publicly viewable processes, outcomes, and identities of the involved users. Transparency in this case is not just a design consideration put into place to achieve a certain kind of efficiency or mode of operation, but a value the entire system is expected to strive for. Because the current reporting system aligns with a certain dominant interpretation of "transparency", it broadly engenders trust from its users. However, we know that the same commitment to transparency is harmful to, and chills the participation of, users who are not properly served by the system as it stands. Our current conundrum is that whatever product we make must be able to claim adherence to these equally important values even as we change key features; otherwise it will not be trusted enough to function.

Lastly, labour concerns both who performs the work necessary to keep these systems running and the conditions under which that labour is performed. Administrator retention is a topic of enormous concern to administrators and functionaries, and burnout is a persistent spectre hanging over those who perform this mediating and moderating work. The volunteer nature of these roles means there is no financial compensation, and little to no training or support. Depending on the size of the project, it can also be isolating work. And because of the fluidity of these categories and the value placed on objectivity across Wikimedia, a moderator who is themself a target of abuse may find it difficult to find help, especially if they are one of only a few administrators, or the only active administrator, on their project.

En-wiki reporting systems

[File:Enwiki Reporting system workflow draft.svg, a diagram of some of the current reporting systems on English Wikipedia.]

On English Wikipedia, our current target of study, all easily found reporting systems, and most of the formal reporting system, are public; the majority of public reporting spaces require the reporter to notify the accused user as a condition of use, which could deter reporters from bringing new cases. Informal reporting systems are de facto private channels, yet their opacity means they are accessible only to experienced editors.

The diagram details 21 different reporting and deliberation spaces on English Wikipedia, though even this list is not exhaustive. Some of the striking features of the current system, sketched in code after the list below, include:

  • Most of the easiest-to-access reporting spaces require the reporter to notify the accused user directly, as a condition of use.
  • The vast majority of private reporting systems are also among the hardest to find for a newer user.
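As a purely illustrative sketch (the attribute names, example entries, and discoverability ratings below are hypothetical, not taken from the diagram), these features can be expressed as a small taxonomy of reporting spaces:

```python
from dataclasses import dataclass

@dataclass
class ReportingSpace:
    # Hypothetical attributes for comparing reporting spaces.
    name: str
    formal: bool
    public: bool
    must_notify_accused: bool
    discoverability: int  # illustrative scale: 1 (hidden) to 5 (prominent)

# A few example entries; the actual diagram lists 21 spaces.
spaces = [
    ReportingSpace("Administrators' noticeboard/Incidents",
                   formal=True, public=True,
                   must_notify_accused=True, discoverability=5),
    ReportingSpace("ArbCom (private correspondence)",
                   formal=True, public=False,
                   must_notify_accused=False, discoverability=2),
    ReportingSpace("Off-wiki correspondence",
                   formal=False, public=False,
                   must_notify_accused=False, discoverability=1),
]

# The two tensions noted above: prominent public spaces force the reporter
# to confront the accused, while private channels are the hardest to find.
confrontational = [s.name for s in spaces
                   if s.public and s.must_notify_accused and s.discoverability >= 4]
hidden_private = [s.name for s in spaces
                  if not s.public and s.discoverability <= 2]
print(confrontational)  # easy to access, but require notifying the accused
print(hidden_private)   # private, but hard for newer users to find
```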