Talk:Community health initiative/Archive/2018
|Please do not post any new comments on this page. This is a discussion archive first created on 01 January 2018, although the comments contained were likely posted before and after this date. See current discussion.|
Auditing Report Tools
The Wikimedia Foundation’s Anti-Harassment Tools team is beginning research into how harassment reports are made across the internet, with a particular focus on Wikimedia projects. We are planning four major audits.
Our first audit focuses on reporting on English Wikipedia. We found 12 different ways editors can report harassment, which we divided into two groups: on-wiki and off-wiki reporting. On-wiki reporting tends to be highly public, while off-wiki reporting is more private. We’ve decided to focus on 4(ish) spaces for reporting, broken into two buckets: ‘official ways of reporting’ and ‘unofficial ways of reporting.’
Official Ways of Reporting (all are maintained by groups of volunteers, some more ad hoc than others, e.g. AN/I)
- Noticeboards: 3RR, AN/I, AN
- ArbCom email listserv
- We’ve already started user interviews with ArbCom
Unofficial Ways of Reporting:
- Highly followed talk page (such as Jimmy Wales)
Audit 2 focuses on other Wikimedia projects such as Wikidata, Meta and Wikimedia Commons. Audit 3 will focus on other open-source organizations and projects, such as Creative Commons and GitHub. Audit 4 will focus on social media companies and their reporting tools, such as Twitter, Facebook, etc. We will focus on how these companies interact with English-speaking communities and on their policies for those communities, specifically because policies differ from country to country.
Auditing Step by Step Plan:
- Initial audit
- Write up of findings and present to community
- This will include design artifacts like user journeys
- On-wiki discussion
- Synthesize discussion
- Takeaways, bullet points, feedback then posted to wiki for on-wiki discussion
- Move forward to next audit
- Parameters for the next audit, drawn from community feedback and from technical/product constraints
We are looking for feedback from the community on this plan. We anticipate gaining a deeper understanding of the current workflows on Wikimedia sites so we can begin identifying bottlenecks and other potential areas for improvement. We are focusing on what works for Wikimedians while also learning what other standards and ways of reporting exist elsewhere on the web.
- Just noticed this project through mention on The Guardian. Where is the feedback page?
- As a 7 year editor til I finally lost it at constant Wikipedia harassment in 2014 and was booted. So glad to see this is happening. I'm too behind still on personal projects, plus another wiki I work on, to try again. But I use Wikipedia every day, so glad for everything that improves its content! Carolmooredc (talk) 05:50, 16 March 2018 (UTC)
Critical essay about the Community health initiative
|Community health initiative/Archive/2018|
Stanford prison effect
Prima facie, the Community health initiative ("CHI") places a substantial burden on Wikipedia administrators to both investigate and enforce anti-harassment policies on the Wikipedia project. However, as the Stanford prison experiment demonstrated, those who perceive themselves to be in positions of authority are likely to abuse it, especially in the absence of effective oversight from higher-level authorities. The entire CHI proposal assumes that administrators will not themselves be the individuals engaging in harassing behaviours. The proposal does not set out what a Wikipedia user with a grievance against an administrator who abuses their position of authority can do to raise their complaint and seek a remedy; this creates the impression that a reasonable person has no avenue of redress for abuse of power, and it strengthens the perception of unfettered authority in those who would abuse it.
Absence of victim support provisions
Finkelhor et al. (2000) found that 31% of online users reported being very or extremely upset, 19% very or extremely afraid, and 18% very or extremely embarrassed by online harassment. Ybarra (2004) found a positive relationship between electronic bullying and clinical depression. At the extreme end of the scale, electronic bullying can be a predominant factor in suicide. This suggests that those who are subjected to electronic harassment are likely to need emotional support after an incident; however, the CHI proposal does not contain any provisions to help victims of harassment and bullying on Wikipedia find a pathway to assistance with treatment or management of the potential harm. This creates a reactive system which does little to address the negative impact of bullying on users.
Isolation and internalisation of abuse regulation
Those who experience harassment which causes them distress generally have the right to report it to the relevant policing authorities. In many countries, including the United Kingdom, there are robust systems in place to deal with harassment, including the provision of support services. Using established judicial procedures mitigates the risk of a Stanford prison effect because of the oversight at multiple levels within these systems. However, the CHI proposal does not mention the creation of machinery that would engage with legitimate judicial process; instead it presupposes a system of "internal abuse regulation" similar to that deployed by the Roman Catholic Church in response to high levels of abuse of children by its membership.
Moving the burden of proof
To those well versed in law, a statement which advocates shifting the burden of proving innocence for a criminal offence such as harassment onto the accused is alarming. The principle of ei incumbit probatio qui dicit, non qui negat (“the burden of proof is on the one who declares, not on one who denies”) is enshrined in Article 11 of the Universal Declaration of Human Rights.
The issue of defining harassment
Harassment is, in most jurisdictions, a criminal offence, especially when it causes alarm or distress. As a result, various sources of law, depending on the jurisdiction, define what constitutes harassment. For there to be order in the enforcement of Wikipedia harassment policies, the term itself must first be unambiguously defined. The question of who or what defines the term is very important: establishing an interpretation that is exclusive to the application of Wikipedia policy risks moving Wikipedia from an encyclopedia to a non-authoritative dictionary, or even a pseudo-court. If the interpretation is to be sought elsewhere, then the question of where becomes relevant: will it be British case law, US case law, a literal definition provided by Collins Dictionary, etc.?
Risk of becoming a pseudo-court
As mentioned above, for Wikipedia to undertake self-regulation with regard to harassment, it must first position itself as an authority to define the term or select an outside source to rely on. It will then be incumbent on administrators to apply an interpretation of harassment when deciding whether a person is guilty of it, thereby becoming judges as to whether a person has committed a criminal offence. While no competent court would accept the judgement of a Wikipedia administrator in determining the guilt of an alleged offender, this still risks creating the impression among Wikipedia users that administrators exercise excessive powers which extend to ruling on criminal matters.
Putting Administrators at risk
Another important element which seems to have been omitted from the Community health initiative proposal is how Wikipedia administrators may be protected from harm inflicted on them in revenge for carrying out their duties regarding harassment. The World Wide Web is a vast expanse which exists beyond, below, around and above Wikipedia, and those who harass others online have access to all manner of tools which can be used to attack another person. The wise administrator would be all too aware of this, and that knowledge itself may influence their decisions about whether and when to act, especially when there are no systems of protection to fall back on.
Those who have engaged in serious harassing behaviours have demonstrated mens rea and actus reus in committing a criminal offence, and there is little to nothing preventing the same person from escalating the severity of their criminality, especially on a platform which proposes internal solutions to resolving the conduct that offer little more than the deterrent of a ban from Wikipedia.
This problem can also arise between those with authority on the Wikipedia project. For example, if Administrator A notices Administrators B, C and D harassing user E, then Administrator A may be afraid to act if Administrators B, C and D appear to have the power to do them harm, such as damaging their reputation. As there are no clearly defined internal checks and balances mentioned in the CHI, this situation is more likely to occur.
Hey Wikipedia. I've written the collapsed essay regarding this CHI proposal. I hope that it offers another perspective on some issues which can be discussed further. Thank you. Dogs curiosity (talk) 21:35, 7 August 2018 (UTC)
- Too much deja vu all over again! I'm having flashbacks! Wikipedia just a tiny corner of a much bigger problem. Sigh... Carolmooredc (talk) 21:55, 7 August 2018 (UTC)
- Hello Dogs curiosity, thanks for your interesting essay. There is a variety of topics to think about. One point I want to respond to is support for people. The Trust and Safety team, which is part of the Community health initiative, has developed a page with Mental health resources. In particular I recommend Samaritans' Supporting someone online page as a useful resource. As for some of your other points, I will come back to them when I have time to give a more thorough reply. SPoore (WMF) (talk) , Trust and Safety Specialist, Community health initiative (talk) 22:21, 7 August 2018 (UTC)