Strategy/Wikimedia movement/2018-20/Working Groups/Diversity/Recommendations/1
Recommendation # 1: Code of Conduct
Q1. What is your Recommendation?
That all stakeholders/participants in any and all activities be required to sign an agreement to adhere to a code of conduct.
Q2. What assumptions are you making about the future context that led you to make this Recommendation?
Numerous studies over time have indicated that problems with unhealthy behavior and harassment exist in the movement. Groups marginalized by society as a whole are often specifically excluded by being left out of the power structures. Even those operating in good faith may alienate others by strict adherence to rules and policies.
Q3. What will change because of the Recommendation?
1. It is anticipated that the change will create a more welcoming environment across all platforms and facilitate the removal of disruptive people who refuse to abide by the Code. A clear message that the Foundation has zero tolerance for unacceptable behavior in any endeavor associated with it will reinforce equitable participation and inclusion, bolstering dynamic, positive growth.
2. Platform civility policies are inadequate to deal with the behavioral issues, and the hostility and distrust that exist between various stakeholders make it impossible to surmount the policy gaps and protect the community and the Foundation.
3. These types of problems are internet-wide, and the quasi-anarchy of having no clear policy or definition escalates both the inability to take action and the distrust that arises when action is taken.
4. An ombudsman position would need to be created to act as a liaison between the communities and the WMF in dealing with these types of issues.
Q4. How does the Recommendation relate to the current structural reality? Does it keep something, change something, stop something, or add something new?
All of the above. It expands on the existing “Safe space policy”; changes it to more accurately reflect interaction in all venues in which the Foundation and its stakeholders/participants engage; attempts to eliminate barriers to inclusion and to open new relationships; and adds a layer of safety and a healthy atmosphere so that all participants are free to share their diverse knowledge.
Q4a. Could this Recommendation have a negative impact/change?
Those who do not wish to comply could leave the platform, as this is a major shift from the past practice of each project developing its own “rules of engagement”. The argument for maintaining project-specific conduct guides is that they can be culturally sensitive.
That argument is somewhat weakened because language plurality in the larger projects already spans broad geographic areas and great cultural diversity. There will be resistance to change, but the need to develop an overriding universal organizational policy, to which anyone involved with a Wikimedia brand must adhere, is imperative. The policy should be developed with transparency to the various project ArbComs to avoid the pitfalls that have occurred in the past (see links in Q6). Stakeholders will have the ability to add platform-specific rules, but not to remove any provision of the universal organizational policy.
Q4b. What could be done to mitigate this risk?
It is unclear why retaining toxic behaviors would be of value; thus there do not appear to be risks in need of mitigation.
Q5. Why this Recommendation? What assumptions are you making?
The Wikipedia community has struggled with how to deal with overly aggressive editors and has either a) been ineffective in preventing bullying and harassment, b) applied disproportionate remedies that do not fit the offense, or c) applied a grade-school playground “group detention” solution that sometimes punishes the victims equally with the aggressors. Can we handle harassment? The recent Fram debacle (BuzzFeed News) has shown that this is not just a problem that en.WP struggles with; there have been problems on other language Wikis as well (German, Chinese, English, Belgian, etc.).
To be effective and to address the roles and responsibilities of community leaders as well as Foundation leaders, the policy must be drafted by experts and vetted by the legal department. It must make clear who handles each type of behavioral issue. Leaving behavioral issues to policy designed by non-experts (untrained volunteers and staff) risks jeopardizing the rights of both the accused and the accuser, and could have serious legal ramifications for the Foundation.
Q6. How is this Recommendation connected to other WGs?
Related to the Community Health working group, and impacts Technology, as an interface would have to be developed for users to acknowledge their compliance.
Q7. Does this Recommendation connect or depend on another of your Recommendations?
Q8. What is the timeframe of this Recommendation in terms of when it should be implemented? 2020, 2021, etc. Does it have an urgency or priority? Does this timeframe depend on other Recommendations being implemented before or after it?