Community health initiative/Wikimania 2017


Poster

Leaflets

  • Status: DONE
  • Owner: Trevor, packed & ready to go.

Roundtable

  • Status: On Track
  • Owner: Sydney & Trevor
  • Submission: Building_a_Better_Dispute_Resolution_System
  • Format
    • We'll need to set context about our team and the roundtable
      • 5 minutes
    • Talk to a global audience to get requirements for our work. (problems too?)
      • e.g. "The reporting system must not cause further pain to victims of harassment. The reporting system must not allow for false reports."
    • Allow attendees to self-organize into groups about a specific topic:
      • Reporting
      • Evaluation
      • Remedies
      • Prevention
    • Ask someone from each group to take notes, then provide a 2-3 minute summary of what they discussed.
  • Things to do
    • Sydney to write discussion prompts & instructions for small groups
    • Sydney to set up Etherpad
    • TBD printouts.
  • Script
    • Hello everyone! I’m Trevor Bolliger, a product manager on the Anti-Harassment Tools team at the Wikimedia Foundation. I’m here today with my colleagues [names]. My team is part of the Community Health Initiative, which is a Foundation-led effort to provide resources to Wikimedia volunteers to reduce the amount of harassment and disruptive behavior on their wikis. We have fliers for everybody so you can read more about our work. We’re excited to be here at Wikimania because we want to make sure our work is on the right track and we’re fixing real problems.
    • Over the next two years, my team will be building a better dispute resolution system. But we won't be doing it alone. We know that your input, and the input of other users on Wikimedia wikis, will be absolutely vital to our success. We don't want to design and build in a vacuum.
    • We're just getting started on this work. So, during this roundtable, we’d love to hear from you about the biggest bottlenecks and time-consuming parts of the existing dispute resolution workflows, and your ideas on how to address them. My goal is for you to leave with inspiration for how to better resolve disputes — tools to use, methods for engaging with users, tactics for keeping discussions civil. My other goal is for my team to gain a more robust understanding of how our software can empower you to be more efficient.
    • Here's how we see dispute resolution currently working on most wikis: (draw on a whiteboard if possible.)
      • There are some deterrents to disputes — wiki policies on civility, social norms of etiquette, the collaborative nature of a wiki where anyone can join any conversation, the transparency and openness of every talk page. But disputes still occur.
      • When a user wants to report a dispute, they usually have several options. They can write on an admin's talk page. They can post on a public noticeboard, or for serious incidents they can email their problem to a functionary committee. Their options are either entirely transparent to the world, or handled entirely in private.
      • When an admin or other community leader wants to investigate a dispute so the problem can be properly resolved, the process is manual. To understand the full picture, they have to read through multiple discussions on multiple pages and cobble together a timeline of what happened before making a confident decision.
      • When a decision has been reached, the community has few options for how to remedy the situation. Blocks and bans are the two most common types of responses: blocks are overly strict (but definitely sometimes necessary), while bans are easily skirted and impossible to actually enforce.
    • And we think we will be able to build solutions to address every step of this funnel.
      • We think there may be things we can build that prevent small-scale incidents from boiling over into large-scale disputes.
      • We also think that neither email nor wiki pages are the most appropriate tools to report harassment — some disputes shouldn't be open for the world to see, but an email-only system is also sure to fail. Additionally, the onus shouldn't fall on the victim to prove they were harassed. The reporting system shouldn't open anyone up to additional pain.
      • We believe the software should do the heavy lifting in dispute investigations. We want to build tools to help volunteers understand and evaluate harassment cases, and inform the best way to respond.
      • We want to improve existing tools and create new tools to remove troublesome actors from communities or from specific areas within them, and to make it more difficult for someone who's blocked from the site to return.
    • There are a lot of aspects to dispute resolution, so we’re going to ask you to cluster into small groups and select one of four topics: 
      1. How to prevent harassment before it occurs
      2. How disputes are reported to community leaders
      3. How disputes are evaluated by community leaders
      4. Remedies to users found in violation of a policy
    • When you form a group, introduce yourselves and which wiki you’re most active on. Then as a group select one of these topics.  I’d also be very grateful if someone from each group could take notes — on paper or on wiki.

Lecture

Lightning Talk

  • Status: Ready!
  • Owner: Trevor
  • Submission: Anti-Harassment_Tools_on_Wikimedia_communities
  • Slides
  • Script
    • Hello everyone! I’m Trevor Bolliger, a product manager on the Anti-Harassment Tools team at the Wikimedia Foundation. Thank you for having me here; this is my first Wikimania and so far I’m having a great, productive time.
    • My Anti-Harassment Tools team is part of the Community Health Initiative, which is a Foundation-led effort to provide resources to Wikimedia volunteers to reduce the amount of harassment and disruptive behavior on their wikis.
    • We believe that the current software available on Wikimedia projects is not sufficient when users want to report harassment, or when admins want to investigate a conduct dispute. We believe the software should do the heavy lifting so users can spend most of their time doing what's actually important — building the encyclopedia.
    • The tools we will build fall into four focus areas, each as important as the last. 
    • The first area is Detection. We want to make it easier and more efficient for editors to identify and flag harassing behavior. We are currently exploring how harassment can be prevented before it begins, and how minor incidents can be resolved before they snowball into larger uncivil problems. Our team is working on improvements to AbuseFilter so more filters can be enabled to monitor blatant wrongdoing.
    • The second area is Reporting. No victim of harassment should abandon editing because they feel powerless to report abuse. Currently the burden of proof is on the victim to prove their own innocence and the harasser's fault. We want to provide users with improved ways to report incidents, ways that are more respectful of their privacy and less chaotic and stressful than the current workflows. My team’s analyst Caroline Sinders is currently analyzing Admin Noticeboards and is giving a lecture tomorrow titled “foobar” that you should all attend.
    • The third area is Evaluation. We want to make it easier and less time-consuming for administrators to understand the sequence of events in user disputes so they can make more accurate and timely decisions. This month I am starting a research project into the Edit Interaction Analyser and we already have some exciting ideas for how to make it even more powerful.
    • Our fourth and final focus area is Blocking. We want to improve existing tools and create new tools, if appropriate, to remove troublesome actors from communities or from specific areas within them, and to make it more difficult for someone who's blocked from the site to return. In September, our team will release the first iteration of the Mute feature, which allows users to specify other users from whom they do not want to receive on-wiki notifications. (A conceptual sketch of the mute check, for planning purposes, follows this script.)
    • There's a lot of work to do, and we need your help. Wiki community participation will be vital to the success of our work, not just at Wikimania but also online. We want to make sure we’re building useful tools that solve real problems, and we won’t know that without you.
    • To be a part of our work, find me, find our Community Health Initiative poster, or visit us on Meta wiki at “Community Health Initiative” or CHI.
    • Thank you!
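
A minimal conceptual sketch of the Mute behavior described in the script above, for planning purposes only. This is illustrative Python rather than the actual MediaWiki/Echo implementation; the mute_lists structure and should_notify function are hypothetical names.

```python
# Conceptual sketch only: not the actual MediaWiki/Echo implementation.
# The mute_lists structure and should_notify function are hypothetical.

# Hypothetical per-user mute lists: recipient -> set of muted usernames.
mute_lists = {
    "Alice": {"Mallory"},
}

def should_notify(recipient: str, sender: str) -> bool:
    """Deliver a notification only if the recipient has not muted the sender."""
    return sender not in mute_lists.get(recipient, set())

print(should_notify("Alice", "Mallory"))  # False: notification suppressed
print(should_notify("Alice", "Bob"))      # True: delivered as usual
```

In concept the check stays this simple: consult the recipient's mute list before delivering a notification triggered by another user.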

Wikiconference workshop

  • Status: Ready!
  • Owner: Trevor
  • Submission: Building_MediaWiki_software_to_enforce_Page_&_Topic_bans
  • Length: 50 minutes
  • Script
    • OPENING
      • Hello everyone! I’m Trevor Bolliger, a product manager on the Anti-Harassment Tools team at the Wikimedia Foundation. I’m here today with my colleagues [names]. 
      • My team is part of the Community Health Initiative, which is a Foundation-led effort to provide resources to Wikipedians to reduce the amount of harassment and disruptive behavior on their wikis. We have fliers for everybody so you can read more about our work.
      • We’re excited to be here at Wikiconference North America because we want to make sure our work is on the right track and we’re fixing real problems. 
      • During this workshop, we’d love to hear from you about how we can build functionality into the MediaWiki software to allow communities to enforce page or topic bans. 
      • My goal for today is to leave with notes and information so my team can build the right tools. This workshop will not be the end of our consultation — we’ll continue to discuss these topics on Wikipedia and Meta Wiki so we’re certain we’re reaching a wide audience.
      • First, I’m going to set some context about bans and blocks. Bans are formal prohibitions from editing on Wikipedia. Blocks are the technical method used to actually disallow users from publishing changes to a page. I think of bans like telling a child “you’re grounded” — words that can be easily violated. Blocking is akin to locking a child in their bedroom — effective, yet harsh.
      • There are several common types of bans but today we’re only focusing on Page bans and Topic bans. Page bans prohibit the user from editing a single, specific page — ZZ Top. Angela Merkel. The 1989 Kansas City Chiefs. Topic bans prohibit the user from editing any page within a broad topic — music, German politics, American sports. (A conceptual sketch of how page and topic blocks could be modeled follows at the end of this section.)
      • There are many ways we can build these tools so we’re glad you’re here to help us talk through the considerations.
      • I want to get started by having a discussion with the entire room about our general opinions of bans. Do they work? Are they effective countermeasures to disruptive behavior? 
        • Call on a few people.
        • Potential other questions for group discussion:
          • Would building page blocks or topic blocks make them more effective?
        • Wrap up honestly. Mention consensus or lack thereof.
      • Thank you everybody. We have a few more discussion questions, but I think they'll be best handled in smaller groups. If everyone could split into groups of 5 or so people. Introduce yourselves. We’ve printed up some lists of questions for each group to discuss. Please take notes! Either on paper or on the Etherpad link on the paper I’ve provided.  Members of my team will be walking around, asking questions and offering to help. Don’t worry about covering all the questions.
      • Any questions right now?
    • WHILE WALKING AROUND
      • When you walk up to a group, introduce yourself and ask which question they’re talking about. Ask if there’s agreement. Throw out an idea of your own, but try to stimulate the conversation.
    • AT THE END
      • Alright everybody, we have a few minutes left so let’s start winding down.
      • Can one person from each group share a summary of what they discussed?
      • I’d like one person from each group to share the most contentious topic you discussed. Which topics need further discussion on wiki to reach consensus?
      • OK, thank you everybody! You can follow this project on meta wiki at this URL.
  • Etherpad link: https://etherpad.wikimedia.org/p/PageBlocking
  • Printouts link: https://docs.google.com/document/d/1VIvMwqNnahY-57h3Oz5qV-FMA0qcBWFCnjoaE_sEOLI/edit
  • Meta page: Community health initiative/Page or topic blocking
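
For the workshop discussion, here is a minimal conceptual sketch of how page and topic blocks could be modeled, assuming bans are stored as simple records and checked before an edit is saved. This is illustrative Python, not MediaWiki code; the Ban record, can_edit function, and PAGE_TOPICS mapping are hypothetical names.

```python
# Conceptual sketch only: not MediaWiki code. The Ban record, can_edit
# function, and PAGE_TOPICS mapping are hypothetical names for discussion.
from dataclasses import dataclass

@dataclass
class Ban:
    user: str
    scope: str   # "page" for a page ban, "topic" for a topic ban
    target: str  # a specific page title, or a topic name

# Hypothetical mapping from pages to the topics they belong to.
PAGE_TOPICS = {
    "ZZ Top": {"music"},
    "Angela Merkel": {"German politics"},
    "1989 Kansas City Chiefs": {"American sports"},
}

def can_edit(user: str, page: str, bans: list) -> bool:
    """Return False if an active page or topic ban covers this user and page."""
    for ban in bans:
        if ban.user != user:
            continue
        if ban.scope == "page" and ban.target == page:
            return False
        if ban.scope == "topic" and ban.target in PAGE_TOPICS.get(page, set()):
            return False
    return True

# Example: a user topic-banned from "music" cannot edit "ZZ Top",
# but can still edit "Angela Merkel".
bans = [Ban(user="ExampleUser", scope="topic", target="music")]
print(can_edit("ExampleUser", "ZZ Top", bans))         # False
print(can_edit("ExampleUser", "Angela Merkel", bans))  # True
```

A real implementation would also have to handle expiry, logging, and how pages map to topics (categories, manually curated lists, or something else), which are exactly the kinds of questions for the small groups.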

General Preparedness

  • Elevator pitch — Everyone should come prepared with their own way to describe the Community Health Initiative and our Anti-Harassment Tools team’s work.
  • Come up with a list of expected questions and be prepared to respond
    • "How are you funded?"
      • Our five-person team has been funded by a grant from the Newmark Foundation specifically for work on Anti-Harassment Tools. The rest of the Community Health Initiative is prioritized and funded through regular annual planning processes.
    • "Which languages and community are we working with?"
      • Sydney
    • "What is happening with Detox?"
      • Caroline
    • "How are you capturing data?"
      • Caroline
    • "How can I get and stay involved?"
      • Point them to our meta page — "Community health initiative" or "CHI"
    • "Can you review my block?"
      • No, that's not what we do.
    • "I've been harassed on Wikipedia"
      • Try to point people to SuSa.
    • "I am being harassed at Wikimania"
      • Point people to SuSa.

Panels we should attend

From https://wikimania2017.wikimedia.org/wiki/Programme. Items in bold are CHI-hosted events.

Thursday, August 10

Friday, August 11

Saturday, August 12

Sunday, August 13