Movement Strategy/Recommendations/Iteration 2/Community Health/12

Investing in equity-centered technologies

Q 1 What is your Recommendation?

To enhance equity within the movement, we recommend investing in technologies that support people using diverse languages, people with disabilities, people with legitimate fears for their security, and people with varied levels of access, so that all of them can fully participate in the Wikimedia movement.

Concrete examples of the technologies deemed necessary include but are not limited to:

  1. easily accessible incident reporting and support systems that give everyone an equal ability to articulate concerns and have them addressed effectively, regardless of language or location, while respecting privacy,
  2. collaboration and dialogue software that gives everyone an equal ability to articulate perspectives influencing communities and the movement,
  3. tools and mechanisms that allow for safe selection processes for roles of authority, as well as for making community-wide policy decisions on projects, including the option of non-public voting when individuals are elected,
  4. industry best practices related to content moderation, to free functionaries, who also fulfill important social roles in communities, from handling content that is politically sensitive or psychologically damaging, such as child abuse material.

All of these technologies should be developed so as to be operational on mobile devices as well as on desktop.

Furthermore, the evaluation processes for assessing proposed investments in technology should be amended to include equity parameters. This can be done relatively easily, as the movement is structurally a reuser of technology rather than a major technological innovator itself. This role enables it to observe emerging technologies in other use cases, evaluate them through the lens of creating and curating freely licensed knowledge embedded in equity perspectives, and reach an informed investment decision.

Q 2-1 What assumptions are you making about the future context that led you to make this Recommendation?

The social dynamics of a global online-centered movement present an ongoing challenge that requires continuous professional resources to adapt the ecosystem to new threats and emerging technologies. However, decisions on how to handle individual users or content should ultimately be made by humans for the foreseeable future.

Volunteers should not be handling classes of concerning material that pose considerable risks to their wellbeing and attract regulatory scrutiny. Most major platforms, including community-centric sites like Reddit, routinely provide users with a safe, non-public path to articulate concerns, and Wikimedia has to adhere to these basic standards.

The Wikimedia movement is largely a reuser of technological innovation and usually not a technological pioneer. Technologically, we are decades behind other existing platforms, and there is no reason to believe the movement could change this within the recommendation’s time horizon. There are new challenges that can only be addressed with new technological innovation, and there is an urgency to adapt as a technological follower.

In the summer of 2019, mobile traffic overtook desktop traffic, yet the projects have been built through, and for, desktop-centric paths. Maintaining the desktop focus is a growing burden both on existing communities, who complement their traditional desktop use with mobile devices in their daily lives, and on newcomers, who are often entirely reliant on mobile devices and who encounter the desktop-centric structure as an early key hurdle to becoming active contributors of content and views.

Q 2-2 What is your thinking and logic behind this recommendation?

The technology currently used in the movement bears the imprint of being developed for a smaller, non-diverse community where a command of English could be taken for granted. If we want to move to greater equity, we have to adopt and introduce systems that facilitate safe participation by a community that is diverse in all aspects. This means allowing for the use of a multitude of languages and cultures as well as providing essential guarantees for safety and privacy, especially for groups or individuals that may have reasons to fear abuse or harassment.

We should also ensure that contributors' exposure to content that can be considered harmful is minimised. Wikimedia is the last major platform that relies on users, in our case volunteers, to identify both known and novel concerning content. Since no functionary volunteers in order to handle already-known harmful material, requiring them to do so is unnecessary, and the working group thinks it is time to align the platform with established best practices and to contribute continuously to developing them, enhancing protection for volunteers and readers alike.

Q 3-1 What will change because of the Recommendation?

This recommendation aims to clarify the contribution technology can make to a thriving movement and the role of the ecosystem in this field. It will improve the individual health and wellbeing of contributors and renew outdated practices.

The incident reporting system section advocates aligning the Wikimedia platforms with the well-established reporting practices offered across the digital ecosystem by both community-centric and user-centric platforms. It does not suggest changing the Wikimedia moderation model from the current community-reliant approach to an industrial model as mapped in Caplan (2018). Instead, it recommends renewing the existing approach by building on the pre-work done through the user reporting system consultation on Meta in early 2019 and leading it to its necessary conclusion: offering a reliable, safe, and non-public community reporting path for concerns and adjudication. The development of a non-public reporting path was also strongly supported by CHWG survey respondents, who described being targeted for harassment after reporting incivility and even content issues such as copyright violations, and who said that this risk of reprisal discouraged them from taking action against interpersonal conflict as well as legal content issues.
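
To make this concrete, the following is a minimal sketch, in Python, of the access control a non-public reporting path implies. All names and fields here are hypothetical; neither the Meta consultation nor this recommendation prescribes a data model.

  from dataclasses import dataclass, field
  from datetime import datetime, timezone
  from enum import Enum
  from typing import Optional
  import uuid

  class Visibility(Enum):
      PRIVATE = "private"          # reporter and assigned handler only
      FUNCTIONARY = "functionary"  # also the local functionary team

  @dataclass
  class IncidentReport:
      """Hypothetical record for a non-public incident report."""
      reporter: str                # stored for follow-up, never shown publicly
      description: str
      language: str                # allows routing to handlers who speak it
      visibility: Visibility = Visibility.PRIVATE
      report_id: str = field(default_factory=lambda: uuid.uuid4().hex)
      created_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))
      assigned_to: Optional[str] = None

  def can_read(report: IncidentReport, user: str, functionaries: set[str]) -> bool:
      """Only the reporter, the assigned handler, and (for FUNCTIONARY-level
      reports) the functionary team can read a report; this restriction is
      what makes reprisal harder than on public noticeboards."""
      if user in (report.reporter, report.assigned_to):
          return True
      return report.visibility is Visibility.FUNCTIONARY and user in functionaries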

The collaboration and dialogue software section takes a similar approach: the platforms' communication software needs to be renewed to make a credible offer to broader ranges of contributors, notably those relying on mobile. The need to update the mobile editing interface was also described by survey respondents.

Source: stats.wikimedia.org/v2

The safe selection mechanism for functionaries (a group which, as defined in the recommendation “Privacy and security for everyone”, includes administrators) recognizes the pressures editing communities are under and their struggle to recruit people prepared for, and capable of, helping to carry the burdens of self-organization. The easiest way to implement it would be to overhaul the SecurePoll extension already serving various on-wiki elections.
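
As a rough illustration of the behavior non-public voting implies, here is a short Python sketch. It is not SecurePoll itself (SecurePoll is a MediaWiki extension written in PHP), and the class and method names are invented for this example: eligibility is checked against a voter roll, a later ballot replaces an earlier one, and only aggregate totals are ever published.

  from collections import Counter

  class SecretBallotPoll:
      """Illustrative secret-ballot poll: individual ballots are kept
      non-public, and only per-candidate totals are released."""

      def __init__(self, candidates: set[str], voter_roll: set[str]):
          self.candidates = candidates
          self.voter_roll = voter_roll
          self._ballots: dict[str, str] = {}  # voter -> choice, never published

      def cast(self, voter: str, choice: str) -> None:
          if voter not in self.voter_roll:
              raise PermissionError("voter is not on the roll")
          if choice not in self.candidates:
              raise ValueError("unknown candidate")
          self._ballots[voter] = choice  # a later ballot replaces an earlier one

      def tally(self) -> dict[str, int]:
          """Publishable result: totals per candidate, no voter identities."""
          return dict(Counter(self._ballots.values()))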

The working group sees reforming the selection mechanism as one key component in addressing this problem, alongside the proposed customizable training and certification infrastructure (see Recommendation: “Building an inclusive global community”), the improved tools addressing abuse (see Recommendation: “Privacy and security for everyone”), the code of conduct renewing the shared foundations of behavior across communities (see Recommendation: “Joint set of rules”), and the self-care micro grants and term limits built into “Redefining power structures” and “Aligning resource allocation with community health goals”. Only together will those six components give communities the opportunity to become sustainably resilient: by recruiting from broader pools of talent, by enabling more flexible use of the tools across both desktop and mobile paths, and by building functionary teams that serve local communities adequately, judged both by the communities' own needs and by their ability to protect the communities against regulatory pressure on the platform provider, which needs to be able to continuously demonstrate to public policy makers that exemptions for Wikimedia from regulation targeting the likes of Facebook and Google remain justified.

The section on content moderation and industry best practices related to classes of concerning content, such as child protection material and politically sensitive material, extends the principle of the movement being a technology reuser in pursuit of the charitable-educational mission. The platform industry has had well-established hash-checking mechanisms for more than a decade to identify and deal with child protection material, and it has more recently applied the same template to terrorism-related content as well.
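
As a minimal sketch of the hash-checking principle, the Python snippet below checks an upload against a set of hashes of already-classified material, so that exact copies can be blocked without a volunteer ever seeing them. The toy hash set is invented for illustration; production systems such as PhotoDNA use perceptual hashes so that re-encoded copies still match, whereas plain SHA-256, used here for simplicity, only catches byte-identical files.

  import hashlib

  def matches_known_material(file_bytes: bytes, known_hashes: set[str]) -> bool:
      """Check an upload against a database of hashes of already-classified
      material, so exact copies can be handled automatically without a
      volunteer having to look at them."""
      return hashlib.sha256(file_bytes).hexdigest() in known_hashes

  # Toy usage: in production the hash set would come from a shared industry
  # database rather than being built locally like this.
  known = {hashlib.sha256(b"previously classified file").hexdigest()}
  assert matches_known_material(b"previously classified file", known)
  assert not matches_known_material(b"a harmless photo", known)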

Q 3-2 Who specifically will be influenced by this recommendation?

  • WMF,
  • project communities,
  • individual editors.

Q 4-1 Could this Recommendation have a negative impact/change?

The shift to offering a safe, non-public pathway for adjudicating concerns, while aligned with best practices across platforms generally, represents a significant cultural shift for some Wikimedia communities. Not all current users, including functionaries, may therefore support this shift, and some may re-evaluate their current level of engagement.

Q 4-2 What could be done to mitigate this risk?

The envisioned incident reporting system allows communities considerable levels of customization and would, for the time being, complement existing processes. It is also supported through improved training offers (see Recommendation: Building an inclusive global community) and enhanced functionary selection mechanisms.

Q 5 How does this Recommendation relate to the current structural reality? Does it keep something, change something, stop something, or add something new?

It requires building or modifying technological platforms to include these capabilities.

Q 6-1 Does this Recommendation connect or depend on another of your Recommendations? If yes, how?

Yes, it would depend on “Building an inclusive global community”. For example: if a user files a gender discrimination concern through the envisioned reporting system, its appropriate handling depends on adequately trained functionary evaluators (see Recommendation: Building an inclusive global community).

Q 6-2 Does this Recommendation connect or relate to your Scoping Questions? If yes, how?

No

Q 7 How is this Recommendation connected to other WGs?

Technology Working Group, Advocacy, Resource Allocations, Partnerships