
Strategy/Wikimedia movement/2018-20/Transition/Discuss/Provide for Safety and Inclusion


This is an open discussion space for defining potential implementation priorities from the Provide for Safety and Inclusion recommendation.

Code of Conduct

  • Support Ad Huikeshoven (talk) 20:25, 22 November 2020 (UTC); actually, I prefer the Contributor Covenant version 2 to this concoction. What should already have happened is ratification by the Board. The discussion about how we are all going to enforce any guideline in this respect should have started before proposing any specific guideline. A code of conduct is just paper without enforcement.
  • Oppose Most of this year's enwiki arbitration candidates have expressed concerns about the drafts, especially the one submitted to the Board. Not just the candidates; many other editors also have concerns about, and/or are against, implementation of the Code during the feedback phase. We should not make this a high or top priority if the communities remain divided about this "initiative" and if the impact of the Code could be tremendous and irreparable. Furthermore, I'm unsure whether the Code is effective in bringing and holding communities together as promised. George Ho (talk) 01:14, 23 November 2020 (UTC)
  • Oppose this edition, tailored by the WMF to cover up the disturbed personalities they use to enforce their strategies in smaller wikis. User:PEarley (WMF) has deleted and omitted Workplace bullying / mobbing / moral harassment, as well as Culture of fear and Toxic workplace, from the code of conduct. ManosHacker talk 14:00, 24 November 2020 (UTC)
  • Oppose. Too controversial; wait until the WMF has rebuilt the trust of the community. I don't oppose a code of conduct; I oppose a code of conduct dictated by the WMF, with arbitrary, heavy-handed, centralised enforcement. MER-C (talk) 19:37, 24 November 2020 (UTC)
  • Oppose. Benjamin (talk) 06:46, 28 November 2020 (UTC)
  • Oppose. The WMF did not ask whether this was something that was desired. While the WMF has insisted that the Code is intended only for smaller wikis which have not had time to develop conduct policies of their own, the Code itself does not say that. Nor is there an opt-out mechanism for communities which already have their own established policies and processes and would prefer those. Those deficiencies are fatal, and to be blunt, given the level of trustworthiness the WMF has shown, they seem a backdoor to pulling another Fram incident and pointing to this. Seraphimblade (talk)
  • Oppose The Foundation is the poster child of "the road to hell is paved with good intentions". It actively sabotaged the community's efforts to clean up pervasive abuse on Azwiki, royally screwed up in the Framban case, doubled down on that screwup, and is now trying to ram this dysfunctional Code of Conduct initiative down our throats. Alsee (talk) 19:36, 2 December 2020 (UTC)
  • Oppose. Conduct policy on Wikimedia wikis is set by the communities, not by the WMF. The WMF is not authorized to establish such a policy. (Additionally, the organization clearly does not know how, in practice, to put together an appropriate policy.) --Yair rand (talk) 07:20, 14 December 2020 (UTC)

Private incident reporting


Create pathways for users to privately report incidents, either technical or human, including harassment, to have them addressed effectively and with appropriate urgency regardless of language or location.

  • Neutral, partly Oppose. Of course, such pathways would help in certain situations. In other situations, though, they could mean that too many communication problems in the Wikimedia communities are no longer addressed openly in free discussion, but are instead "solved" by "punishing" the party to the conflict who first called the WMF for help. Similar situations have already happened, and I don't want to see them happen more often. A way to privately report "human incidents, including harassment" should never be a standard way of dealing with conflicts between Wikimedians. It should only be used (1) in situations that really mean acute personal danger for one of the parties – as is already possible via emergency(_AT_)wikimedia.org – or (2) (maybe) in very small wikis. I use the plural "communities" because one can distinguish a global community, often called the "movement", from the many communities of the individual Wikipedia language versions, Wikidata, etc. See also [1]. --DerMaxdorfer (talk) 16:49, 2 December 2020 (UTC)
  • Oppose private reporting until the Foundation stops violating its own commitment, and the global community consensus, that it cease abusively meddling in routine content disputes. Alsee (talk) 20:00, 2 December 2020 (UTC)
  • Thank you both for your useful feedback. Based on the results from here and the Global Conversations, there is now a dedicated part of the space to discuss the highly prioritized initiatives for global implementation in 2021-22 (which do not include this initiative). You're invited to continue the conversation there if interested. --Abbad (WMF) (talk) 16:05, 17 December 2020 (UTC)

Baseline of community responsibilities


Establish a baseline of community responsibilities for safeguarding and maintaining a healthy working atmosphere for both online and offline Wikimedia involvement, along with procedures for reporting and follow-up.

  • ..

Develop a safety assessment and execution plan


Research and develop an adaptable safety assessment and execution plan, as well as a rapid response and support infrastructure. This would ensure that on- and offline contributors have resources readily available and accessible to mitigate harm caused by their Wikimedia activities, including:

  • psychological support (e.g., therapists, counselors, mediators),
  • technical support (e.g., anonymization mechanisms),
  • legal assistance (e.g., list of partner lawyers, facilitation of legal representation at a local level),
  • a fast-track escalation path in life-threatening situations,
  • procedures for reacting to large scale challenges, such as socio-political unrest and environmental threats,
  • training and opportunities to raise awareness and build response capacity for community safety and conflict management,
  • ..

Advocacy - local capacity development


Develop local capacity to advocate for the legal and regulatory frameworks our communities need to make our projects thrive.

  • ..

Built-in platform mechanisms for safety


Establish built-in platform mechanisms aimed at the safety of contributors in their contexts (e.g., anonymization mechanisms). These would be assessed for opportunities and risks in particular projects or contexts, including the risk of harassment and vandalism.