Community health initiative
Helping the Wikimedia volunteer community to reduce the level of harassment and disruptive behavior on our projects.
The Wikimedia Foundation's Community Tech and Community Engagement teams have begun a multi-year project of research, product development, and policy growth to help the Wikimedia volunteer community reduce the level of harassment and disruptive behavior on our projects.
This initiative addresses the major forms of harassment reported in the Wikimedia Foundation's 2015 Harassment Survey, which cover a wide range of behaviors: content vandalism, stalking, name-calling, trolling, doxxing, discrimination, and anything else that targets individuals for unfair and harmful attention.
This will result in improvements both to the tools in the MediaWiki software (see Anti-Harassment Tools) and to the policies of the communities suffering most from disruptive behavior (see Policy Growth & Enforcement). For our efforts to be successful, these improvements must be made with the participation and support of the volunteers who will be using the tools (see Community input).
Background
Harassment on Wikimedia projects
On Wikipedia and other Wikimedia projects, harassment typically occurs on talk pages (article, project, and user), noticeboards, user pages, and edit summaries. Edit warring and wiki-hounding can also be forms of harassment. Conduct disputes typically originate from content disputes, such as disagreements about the reliability of a source, neutrality of a point of view, or article formatting and content hierarchy. These disputes can become harassment at the point when an editor stops thinking of the dispute as a discussion about ideas and starts associating the opponent with the idea, turning the other editor into an enemy who needs to be driven off the site. This unhealthy turn is more likely to happen when the content is closely associated with identity (gender, race, sexual orientation, national origin) because it's easy for the harasser to think of the target as a living representation of the opposing idea. This is especially true when the target is a member of a historically disadvantaged group and has disclosed information about their identity during the course of their time on the projects.
The English-language Wikipedia community, like most other projects, has drafted conduct policies for its community to follow, including policies on civility, harassment, personal attacks, and dispute resolution. The spirit of these policies is well-intentioned, but enforcement is difficult given deficiencies in the MediaWiki software and the ratio of contributors to active administrators. The dispute resolution processes encourage users to attempt to resolve issues between themselves before bringing the situation to the attention of administrators on the Administrators' Noticeboard, and eventually to ArbCom in extreme situations.
Online harassment is a problem on virtually every web property where users interact. In 2017, the Pew Research Center found that 41% of American adults had experienced online harassment. In 2015 the Wikimedia Foundation conducted a Harassment Survey with 3,845 Wikimedia user participants to gain a deeper understanding of harassment occurring on Wikimedia projects. 38% of respondents recognized that they had been harassed, while 51% had witnessed others being harassed. In 2016-17, Jigsaw and Wikimedia researchers used machine learning techniques to evaluate harassment on Wikipedia in the Detox research project. They found that only 18% of all identified attacks on English Wikipedia resulted in a block or warning, that 67% of attacks come from registered users, and that nearly 50% of all attacks come from contributors with over 100 annual edits.
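To illustrate the general technique (this is not the Detox pipeline itself), the sketch below trains a toy text classifier to score comments for personal attacks. The training comments, labels, and model choice are hypothetical stand-ins; the actual research used large crowd-labeled corpora of Wikipedia comments and more sophisticated models.

```python
# Minimal sketch of ML-based comment scoring, in the spirit of Detox.
# All data below is hypothetical and for illustration only.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Hypothetical labeled comments: 1 = personal attack, 0 = not an attack.
comments = [
    "Thanks for fixing the citation formatting!",
    "You are an idiot and should stop editing.",
    "I disagree; the source does not support that claim.",
    "Get off this site, nobody wants you here.",
]
labels = [0, 1, 0, 1]

# Bag-of-words features feeding a linear classifier.
model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
model.fit(comments, labels)

# Score a new comment: estimated probability that it is a personal attack.
new_comment = "Go away, you clearly don't belong here."
attack_probability = model.predict_proba([new_comment])[0][1]
print(f"Estimated attack probability: {attack_probability:.2f}")
```

In practice, such scores are only a triage signal for human reviewers, not a verdict; the Detox work measured attack prevalence at scale rather than automating enforcement.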
In Wikimedia's 2017 Community Engagement Insights Report, 31% of all 4,500 survey respondents reported feeling unsafe in a Wikimedia online or offline space at some time during their tenure, 49% of 400 users had avoided Wikimedia spaces because they felt uncomfortable, and 47% of 370 users indicated that in the past 12 months they had been bullied or harassed on Wikipedia. Furthermore, 60% of people who reported a dispute to functionaries said their issue was "not at all resolved", and 54% called their interaction with functionaries "not at all useful."
This research is illuminating and one of the impetuses for this Community Health Initiative, but it is only the beginning of the research we must conduct for this endeavor to be successful.
Community requests for new tools
The Wikimedia community has long struggled with how to protect its members from bad-faith or harmful users. The administrative toolset that project administrators can use to block disruptive users from their projects has not changed since the early days of the MediaWiki software. Volunteers have asked the Wikimedia Foundation to improve the blocking tools on a number of occasions, including:
- a 2013 survey of the German Wikipedia community
- as part of the Inspire Campaign on the Gender Gap, held in 2015, community members expressed a desire for better reporting systems for targets of harassment. This has been a recurring theme in surveys, consultations, and workshops on the topic.
- in the 2015 Community Wishlist survey
- in numerous IdeaLab grant proposals. (See our list of which proposals relate to our work.)
- and in a number of Phabricator requests, including "Send a cookie with each block" (November 2015), "Allow User agent (UA)-based IP Blocks" (May 2015), and "Throttle account creation and email sending per browser as well as IP address" (July 2015). See this ticket for coordination of many blocking-tools-related tasks.
- In the 2017 Community Engagement Insights Report 84% of 300 users requested better reporting tools, 77% requested better noticeboards, 74% requested better blocking tools, and 75% requested better wiki policies.
In preparing for this initiative, we've been discussing issues with the current tools and processes with active administrators and functionaries. These discussions have surfaced several key areas where admins and functionaries see immediate needs: better reporting systems for volunteers, smarter ways to detect and address problems early, and improved tools and workflows related to the blocking process. These conversations will be ongoing throughout the entire process. Community input and participation will be vital to our success.
In January 2017, the Wikimedia Foundation received initial funding of US$500,000 from the Craig Newmark Foundation and craigslist Charitable Fund to support this initiative. The two seed grants, each US$250,000, will support the development of tools for volunteer editors and staff to reduce harassment on Wikipedia and block harassers. The grant proposal is available for review at Wikimedia Commons.
Goals
The goals of this initiative are to:
- Reduce the amount of harassing behavior that occurs on Wikimedia projects.
- Fairly resolve a higher percentage of incidents of harassment that do occur on Wikimedia projects.
Potential measures of success
There are challenges to measuring harassment but we still want to be sure our work has an impact on the community. Current ideas include:
- Decrease the percentage of identifiable personal attack comments on English-language Wikipedia, measured via Detox, Sherloq, or a similar system.
- Increase the confidence of English Wikipedia administrators in their ability to make accurate and efficient decisions in conduct disputes, measured via focus group or consultation.
- Decrease the percentage of non-administrator users who report seeing harassment on Wikipedia, measured in a follow-up to the 2015 Harassment Survey.
Annual and quarterly goals
- Fiscal year 2017-2018
Community input
Gathering, incorporating, and discussing community input is vital to the success of this initiative. We are building features for our communities to use; if we design in a vacuum, our decisions will assuredly fail.
The plans presented in the grant, on this page, and elsewhere will certainly change over time as we gather input from our community (including victims of harassment, contributors, and administrators), learn from our research, and learn from the software we build. Community input includes, but is not limited to:
- Socializing our goals
- Generating, refining, validating, and finalizing ideas with community stakeholders
- Conversations about freedom of expression vs. political correctness. It's very important that this project is seen as addressing both the kinds of abuse that everyone agrees on (obvious sockpuppet vandalism, death threats) and the kinds of abuse that people will differ over (gender, culture, etc.). The project will not succeed if it's seen as only a "social justice" power play.
Over the course of this initiative we plan to communicate with the community via regular wiki communication (talk pages, email, IRC) in addition to live-stream workshops, in-person workshops at hack-a-thons and Wikimanias, and online community consultations. At the moment, the best place to discuss the Community health initiative is on Talk:Community health initiative.
Anti-Harassment Tools
In short, we want to build software that empowers contributors and administrators to make timely, informed decisions when harassment occurs. We have identified four focus areas where new tools could be beneficial in addressing and responding to harassment:
Detection
We want to make it easier and more efficient for editors to identify and flag harassing behavior. We are currently exploring how harassment can be prevented before it begins, and how minor incidents can be resolved before they snowball into larger uncivil problems. A sketch of the kind of check such filters perform follows the list below.
- AbuseFilter performance management, usability, and functionality improvements
- Reliability and accuracy improvements to ProcseeBot
- Anti-spoof improvements to pertinent tools
- Features that surface content vandalism, edit warring, stalking, and harassing language to wiki administrators and staff
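As a rough illustration of the detection idea, the sketch below flags edits whose added text matches simple patterns. AbuseFilter actually uses its own server-side rule language; the Python function and the pattern list here are hypothetical stand-ins for demonstration only.

```python
# Illustrative sketch of the kind of check an edit filter performs.
# Not AbuseFilter's rule language; patterns below are hypothetical.
import re

# A community-maintained list of phrases that warrant review (made up here).
FLAGGED_PATTERNS = [
    re.compile(r"\byou (are|'re) an? (idiot|moron)\b", re.IGNORECASE),
    re.compile(r"\bget off (this|the) (site|wiki)\b", re.IGNORECASE),
]

def flag_edit(added_text: str) -> list[str]:
    """Return the patterns that the added text matches, for admin review."""
    return [p.pattern for p in FLAGGED_PATTERNS if p.search(added_text)]

matches = flag_edit("You are an idiot. Get off this site.")
if matches:
    print("Edit flagged for review:", matches)
```

A real filter would run before an edit is saved and could warn, tag, or disallow it; the point of the sketch is only that early, automated flagging lets humans intervene before a dispute escalates.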
Reporting
According to Detox research, harassment is underreported on English Wikipedia. No victim of harassment should abandon editing because they feel powerless to report abuse. We want to provide victims with improved ways to report instances of harassment that are more respectful of their privacy, and less chaotic and stressful than the current workflow. Currently the burden of proof is on the victim to prove their own innocence and the harasser's fault; we believe the MediaWiki software should perform the heavy lifting.
- A new harassment reporting system that doesn't place the burden of proof on or further alienate victims of harassment.
Evaluating
Proficiency with MediaWiki diffs, histories, and special pages is imperative for admins to be able to analyze and evaluate the true sequence of events in a conduct dispute. Volunteer-written tools such as the Editor Interaction Analyzer and WikiBlame help, but current processes are time-consuming. We want to build tools that help volunteers understand and evaluate harassment cases and inform the best way to respond. A sketch of a simple interaction timeline follows the list below.
- A robust interaction timeline tool, which will allow wiki administrators to understand the interaction between two users over time, and make informed decisions in harassment cases.
- A private system for wiki administrators to collect information on users’ history with harassment and abuse cases, including user restrictions and arbitration decisions.
- A dashboard system for wiki administrators to help them manage current investigations and disciplinary actions.
- Cross-wiki tools that allow wiki administrators to manage harassment cases across wiki projects and languages.
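To illustrate what an interaction timeline might do, the sketch below interleaves two users' edits into one chronological view. The data structures and field names are hypothetical; a real tool would pull revision metadata from the MediaWiki API or database rather than hard-coded lists.

```python
# Illustrative sketch of an interaction timeline: merge two users' edits
# into one chronological view. Data below is hypothetical.
from datetime import datetime
import heapq

user_a_edits = [
    {"user": "UserA", "page": "Talk:Example", "time": datetime(2017, 3, 1, 9, 0)},
    {"user": "UserA", "page": "Example", "time": datetime(2017, 3, 2, 14, 30)},
]
user_b_edits = [
    {"user": "UserB", "page": "Talk:Example", "time": datetime(2017, 3, 1, 9, 5)},
    {"user": "UserB", "page": "Example", "time": datetime(2017, 3, 2, 15, 0)},
]

# Each per-user list is already sorted by time, so a heap-based merge
# produces the combined timeline without re-sorting everything.
timeline = heapq.merge(user_a_edits, user_b_edits, key=lambda e: e["time"])
for edit in timeline:
    print(edit["time"], edit["user"], edit["page"])
```

Presenting both users' actions side by side in time order is what lets an admin see who followed whom to which pages, which is the core question in most wiki-hounding cases.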
Blocking
We want to improve existing tools, and create new tools if appropriate, to remove troublesome actors from communities or from certain areas within them, and to make it more difficult for someone who is blocked from the site to return.
Some of these improvements are already underway as part of the 2016 Community Wishlist. See Community Tech/Blocking tools for more information.
- Per-page and per-category blocking tools to enforce topic bans, which will help wiki administrators redirect users who are being disruptive without completely blocking them from contributing to the project; this will make wiki admins more comfortable taking decisive action in the early stages of a problem. (A sketch of a per-page block check follows this list.)
- Tools that allow individual users to control who can communicate with them via Echo notifications, email, and user spaces.
- Make global CheckUser tools work across projects, improving tools that match usernames with IP addresses and user agents so that they can check contributions on all Wikimedia projects in one query.
- Sockpuppet blocking tools.
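To illustrate how a per-page block might be enforced, here is a minimal sketch of the check an edit pipeline could run before saving. MediaWiki's eventual partial-block implementation will differ; the data structure and function names here are hypothetical.

```python
# Minimal sketch of a per-page block check for enforcing a topic ban.
# Names and storage are hypothetical; a real implementation would live
# in the MediaWiki permission layer and persist blocks in the database.
blocked_pages = {
    "UserA": {"Example", "Talk:Example"},  # pages UserA may not edit
}

def may_edit(user: str, page: str) -> bool:
    """Allow the edit unless the user is blocked from this specific page."""
    return page not in blocked_pages.get(user, set())

print(may_edit("UserA", "Example"))        # False: covered by the topic ban
print(may_edit("UserA", "Other article"))  # True: editing elsewhere is fine
```

The design point is that the block is scoped: the disruptive user keeps access to the rest of the project, which lowers the stakes of intervening early.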
Policy Growth & Enforcement
In addition to building new tools, we want to work with our largest communities to ensure their user conduct policies are clear and effective and the administrators responsible for enforcing the policies are well-prepared.
Beginning with English Wikipedia, a large community that offers a wealth of data, we will provide contributors with research and analysis of how behavioral issues on English Wikipedia are a) covered in policy, and b) enforced in the community, particularly on the noticeboards where problems are discussed and actioned. We will also research alternative ways of addressing specific issues, evaluate their effectiveness, and identify approaches that have found success on other Wikimedia projects. This will help our communities make informed changes to existing practices.
Schedule & prioritization of work
The team is currently working on the Interaction Timeline, and a demo should be available for testing in November. Before December, we will build tools that allow users to restrict which user groups can send them direct emails. We're also discussing with users how to improve AbuseFilter performance management and how to build better blocking tools.
Our projects are currently prioritized on the Anti-Harassment Phabricator workboard in the 'Epic backlog' column. We invite everyone to share their thoughts on our prioritization in Phabricator tickets, on this page's talk page, or by sending us an email.
Projects are prioritized by the product manager, taking into account:
- Readiness — What is designed, defined, and ready for development? Are there any blockers?
- Value — What will provide the most value to our users? What will solve the biggest problems, first? Has our research identified any exciting opportunities? Have our previous features identified any new opportunities or problems?
- Feasibility — What can we accomplish given our time frame and developer capacity? Are we technically prepared? Is there external developer support that will accelerate our work?
- Support — What has received support from the users who participate in the current workflows? What ideas have momentum from people currently affected by harassment on Wikipedia?
March 2017 - June 2017
- Hire and onboard the new members of the team.
- Begin new research projects to help the team gain a deeper understanding of the community's current needs and problems.
- Hold discussions with the volunteer community as a whole, as well as targeted discussions with wiki functionaries who handle disruptive behavior, people who have experienced and witnessed harassment on our wikis, and other important stakeholders.
July 2017 - June 2018
- Focused research on the Harassment Reporting System.
- Work with the community on potentially creating a new user group for volunteers who want to work specifically on harassment cases. The community advocate will work with the Support and Safety team to provide training and support for these volunteers.
- Designing, building, releasing, and iterating on:
- usability & performance improvements to AbuseFilter
- accuracy improvements to anti-spoof functionality across pertinent tools
- a robust interaction history tool, similar to the edit interaction analyzer
- a private system for admins to discuss and evaluate incidents of harassment
- per-page blocking tools
- cross-project improvements to CheckUser tools
- robust sockpuppet identification and blocking tools
July 2018 - June 2019
- Designing, building, releasing, and iterating on:
- the Harassment Reporting System
- tools that surface brewing situations of harassment and vandalism to community leaders before they become large-scale incidents
- a dashboard system for wiki administrators to help them manage current investigations and disciplinary actions
- cross-wiki tools that allow wiki administrators to manage harassment cases across wiki projects and languages
- Internationalizing all tools we've built to date
See also
- Community health initiative/Links for more links.
- Community health initiative/Notes for team member notes.
- Community health initiative/The Team for a list of WMF staff working on this initiative.
- Phabricator workboard
- Research:2015 Harassment Survey
- Inspire Campaign on "Addressing harassment" (June 2016)
- Board Statement on Healthy Community Culture, Inclusivity, and Safe Spaces (November 2016)
- "Wikipedia:List of administrators/Active". 2017-02-13.
- "Wikipedia:Harassment § Dealing with harassment.". 2017-02-12.
- Duggan, Maeve (2017-07-11). "Online Harassment 2017". Pew Research Center: Internet, Science & Tech. Retrieved 2017-07-25.
- "Algorithms and insults: Scaling up our understanding of harassment on Wikipedia – Wikimedia Blog". Retrieved 2017-02-13.
- "Wikimedia Foundation receives $500,000 from the Craig Newmark Foundation and craigslist Charitable Fund to support a healthy and inclusive Wikimedia community – Wikimedia Blog". Retrieved 2017-02-13.
- "File:Wikimedia Foundation grant proposal - Anti-Harassment Tools For Wikimedia Projects - 2017.pdf - Meta" (PDF). meta.wikimedia.org. Retrieved 2017-02-14.