Community health initiative

Helping the Wikimedia volunteer community reduce the level of harassment and disruptive behavior on our projects.

The Wikimedia Foundation's Community Tech and Community Engagement teams are engaged in a project of research, product development, and policy development to help the Wikimedia volunteer community reduce the level of harassment and disruptive behavior on our projects.

This initiative addresses the major forms of harassment reported in the 2015 Harassment Survey, which cover a wide range of behaviors: content vandalism, stalking (obsessive, unwanted attention), name-calling, trolling, doxxing (publishing private information), discrimination, and any behavior that targets individuals for unfair and harmful attention.

This will lead to improvements both in the MediaWiki software tools (see Anti-harassment tools) and in the policies in force on the communities that suffer most from this disruptive behavior (see Policy Growth & Enforcement). For our efforts to succeed, these improvements need to be made with the participation and support of the volunteers who will use the tools (see Community input).

Current and recent projects

The team is building the Interaction Timeline, a tool that shows a chronological history of two users on the pages they have both edited. This tool will help administrators investigate harassment cases. A first stable version should be ready by the end of February 2018.
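
As a rough illustration of the underlying idea (not the actual implementation of the Interaction Timeline), the public MediaWiki API can already be used to merge two users' contributions into one chronology limited to the pages both have edited. The endpoint and user names below are placeholders; the sketch ignores pagination, namespaces, and log entries.

  # Minimal sketch, assuming the standard MediaWiki action API (list=usercontribs).
  # Illustrative only; the real Interaction Timeline tool is not built this way.
  import requests

  API = "https://en.wikipedia.org/w/api.php"  # placeholder endpoint; any wiki's api.php works

  def contribs(user, limit=500):
      """Fetch up to `limit` recent contributions for one user."""
      params = {
          "action": "query",
          "format": "json",
          "list": "usercontribs",
          "ucuser": user,
          "uclimit": limit,
          "ucprop": "title|timestamp|comment|ids",
      }
      return requests.get(API, params=params).json()["query"]["usercontribs"]

  def interaction_timeline(user_a, user_b):
      """Chronological list of both users' edits, limited to pages both have edited."""
      a, b = contribs(user_a), contribs(user_b)
      shared = {e["title"] for e in a} & {e["title"] for e in b}
      merged = [e for e in a + b if e["title"] in shared]
      # API timestamps are ISO 8601 strings, so a lexicographic sort is chronological.
      return sorted(merged, key=lambda e: e["timestamp"])

  for edit in interaction_timeline("ExampleUserA", "ExampleUserB"):
      print(edit["timestamp"], edit.get("user", "?"), edit["title"], edit.get("comment", ""))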

The team is also discussing which blocking tools and improvements we should build next. We want to provide more precise and sophisticated tools that allow administrators to respond better to harassers. Join the discussion on the talk page!

We are also beginning research into a harassment reporting system, a major project for 2018.

Recent projects include adding anti-spoof improvements and performance measurements to AbuseFilter, building and releasing two User Mute features, and adding a user preference that allows users to restrict which user groups can send them direct emails.

Background

Harassment on Wikimedia projects

On Wikipedia and other Wikimedia projects, harassment typically occurs on talk pages (article, project, and user), noticeboards, user pages, and edit summaries. Edit warring and wiki-hounding can also be forms of harassment. Conduct disputes typically originate from content disputes, such as disagreements about the reliability of a source, neutrality of a point-of-view, or article formatting and content hierarchy. These disputes can become harassment at the point when an editor stops thinking of this as a discussion about ideas, and starts associating the opponent with the idea — turning the other editor into an enemy, who needs to be driven off the site. This unhealthy turn is more likely to happen when the content is closely associated with identity — gender, race, sexual orientation, national origin — because it's easy for the harasser to think of the target as a living representation of the opposing idea. This is especially true when the target is a member of a historically disadvantaged group, and has disclosed information about their identity during the course of their time on the projects.

The English-language Wikipedia community (and most other project communities) has drafted conduct policies for its members to follow, including policies on civility, harassment, personal attacks, and dispute resolution. The spirit of these policies is well-intentioned, but enforcement is difficult given deficiencies in the MediaWiki software and the ratio of contributors to active administrators.[1] The dispute resolution processes encourage users to attempt to resolve issues between themselves before bringing the situation to the attention of administrators on the Administrators' Noticeboard, and eventually to ArbCom for extreme situations.[2]

Harassment Survey 2015 - Results Report

Online harassment is a problem on virtually every web property where users interact. In 2017, the Pew Research Center concluded that 41% of all internet users have been the victim of online harassment.[3] In 2015 the Wikimedia Foundation conducted a Harassment Survey with 3,845 Wikimedia user participants to gain a deeper understanding of harassment occurring on Wikimedia projects. 38% of respondents said they had been harassed, while 51% had witnessed others being harassed. In 2016-17, Jigsaw and Wikimedia researchers used machine learning techniques to evaluate harassment on Wikipedia in the Detox research project. They found that only 18% of all identified attacks on English Wikipedia resulted in a block or warning, that 67% of attacks come from registered users, and that nearly 50% of all attacks come from contributors with over 100 annual edits.[4]

In Wikimedia's 2017 Community Engagement Insights Report, 31% of the 4,500 survey respondents reported feeling unsafe in a Wikimedia space (online or offline) at some point during their tenure, 49% of 400 users had avoided Wikimedia because they felt uncomfortable, and 47% of 370 users indicated that in the past 12 months they had been bullied or harassed on Wikipedia. Furthermore, 60% of people who reported a dispute to functionaries said their issue was "not at all resolved", and 54% called their interaction with functionaries "not at all useful."

This research is illuminating and is one of the impetuses for this Community health initiative, but it is only the beginning of the research we must conduct for this endeavor to be successful.

Community requests for new tools

The Wikimedia community has long struggled with how to protect its members from bad-faith or harmful users. The administrative toolset that project administrators can use to block disruptive users from their projects has not changed since the early days of the MediaWiki software. Volunteers have asked the Wikimedia Foundation to improve the blocking tools on a number of occasions.

In preparing for this initiative, we've been discussing issues with the current tools and processes with active administrators and functionaries. These discussions have resulted in requested improvements in several key areas where admins and functionaries see immediate needs — better reporting systems for volunteers, smarter ways to detect and address problems early, and improved tools and workflows related to the blocking process. These conversations will be ongoing throughout the entire process. Community input and participation will be vital to our success.

External funding

Grant proposal for "Anti-Harassment Tools for Wikimedia Projects"

In January 2017, the Wikimedia Foundation received initial funding of US$500,000 from the Craig Newmark Foundation and craigslist Charitable Fund to support this initiative.[5] The two seed grants, each US$250,000, will support the development of tools for volunteer editors and staff to reduce harassment on Wikipedia and block harassers. The grant proposal is available for review on Wikimedia Commons.

Goals

  • Reduce the level of abusive behavior on Wikimedia projects.
  • Fairly resolve a greater percentage of harassment incidents on Wikimedia projects.

Potential measures of success

There are challenges to measuring harassment but we still want to be sure our work has an impact on the community. Current ideas include:

  • Decrease the percentage of identifiable personal attack comments on English-language Wikipedia, measured via Detox, Sherloq, or a similar system.
  • Increase the confidence of admins with their ability to make accurate decisions in conduct disputes, measured via focus group or consultation.
  • Decrease the percentage of non-administrator users who report seeing harassment on Wikipedia, measured in a follow-up to the 2015 Harassment Survey.

Annual and quarterly goals

For commentary on our quarterly progress, see Community health initiative/Quarterly updates.

Community input

Gathering, incorporating, and discussing community input is vital to the success of this initiative. We are building features for our communities to use — if we design in a vacuum our decisions will assuredly fail.

The plans presented in the grant, on this page, and elsewhere will certainly change over time as we gather input from our community (including victims of harassment, contributors, and administrators), learn from our research, and learn from the software we build. Community input includes, but is not limited to:

  • Socializing our goals
  • Generating, refining, validating, and finalizing ideas with community stakeholders
  • Conversations about freedom of expression vs. political correctness. It’s very important that this project is seen as addressing the kinds of abuse that everyone agrees about (obvious sockpuppet vandalism, death threats) and the kinds of abuse that people will differ over (gender, culture, etc.). The project will not succeed if it’s seen as only a “social justice” power play.[6]

Over the course of this initiative we plan to communicate with the community via regular wiki communication (talk pages, email, IRC) in addition to live-stream workshops, in-person workshops at hack-a-thons and Wikimanias, and online community consultations. At the moment, the best place to discuss the Community health initiative is on Talk:Community health initiative.

Anti-harassment tools

In short, we want to build software that empowers contributors and administrators to make timely, informed decisions when harassment occurs. We have identified four areas in which new tools could help address and respond to harassment.

Detection

We want to make it easier and more efficient to identify and flag abusive behavior. We are currently asking how harassment can be prevented before it occurs, and how minor incidents can be resolved before they escalate into larger civility problems.

Potential features:

  • Performance management, usability, and functionality improvements to AbuseFilter.
  • Reliability and accuracy improvements to ProcseeBot.
  • Anti-spoof improvements to the relevant tools.
  • Features that surface cases of content vandalism, edit warring, stalking, and abusive language to wiki administrators and staff (see the sketch after this list).
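
To make the "surfacing" idea in the last bullet concrete, here is a deliberately crude sketch that polls recent changes through the public MediaWiki API and flags edit summaries containing words from a small watch list. It is a toy heuristic for illustration only, not AbuseFilter, ProcseeBot, or the Detox models; the endpoint and word list are placeholder assumptions.

  # Toy detection sketch: flag recent edit summaries that match a small watch list.
  # Purely illustrative; real detection tools are far more sophisticated.
  import requests

  API = "https://en.wikipedia.org/w/api.php"   # placeholder endpoint
  WATCH_WORDS = {"idiot", "stupid", "liar"}    # placeholder word list

  def recent_changes(limit=50):
      """Fetch the most recent changes with title, user, edit summary, and timestamp."""
      params = {
          "action": "query",
          "format": "json",
          "list": "recentchanges",
          "rcprop": "title|user|comment|timestamp",
          "rclimit": limit,
      }
      return requests.get(API, params=params).json()["query"]["recentchanges"]

  def flagged(changes):
      """Yield changes whose edit summary contains a watched word."""
      for rc in changes:
          summary = rc.get("comment", "").lower()
          if any(word in summary for word in WATCH_WORDS):
              yield rc

  for rc in flagged(recent_changes()):
      print(rc["timestamp"], rc.get("user", "?"), rc["title"], "->", rc["comment"])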

Reporting

According to the Detox research, harassment is under-reported on English Wikipedia.[4] No victim of harassment should stop editing because they feel unable to report it. We want to provide victims with better ways to report that are more respectful of their privacy, less chaotic, and less stressful than the current workflow. Today the burden of proof is on the victim, who must demonstrate their own innocence and the harasser's guilt, whereas we believe the MediaWiki software should do the heavy lifting.

Potential features:

Evaluation

It is imperative that administrators be proficient with diffs, page histories, and special pages in order to analyze and evaluate the actual sequence of events in a conduct dispute. Some volunteer-built tools, such as the Editor Interaction Analyzer and WikiBlame, help, but the current processes are time-consuming. We want to build tools that help volunteers understand and evaluate harassment cases and that inform how best to respond to them.

Potential features:

  • A robust interaction timeline tool, which will allow wiki administrators to understand the interaction between two users over time, and make informed decisions in harassment cases.
  • A private system for wiki administrators to collect information on users’ history with harassment and abuse cases, including user restrictions and arbitration decisions.
  • A dashboard system for wiki administrators to help them manage current investigations and disciplinary actions.
  • Cross-wiki tools that allow wiki administrators to manage harassment cases across wiki projects and languages.

Blocking

We want to improve existing tools, and create new tools where appropriate, to remove troublesome actors from communities or from certain areas within them, and to make it more difficult for someone who is blocked from the site to return.

Some of these improvements are already being productized as part of the 2016 Community Wishlist. See Community Tech/Blocking tools for more information.

Potential features:

  • Per-page and per-category blocking tools to enforce topic bans, which will help wiki administrators redirect users who are being disruptive without completely blocking them from contributing to the project; this will make wiki admins more comfortable taking decisive action in the early stages of a problem (a conceptual sketch follows this list).
  • Tools that allow individual users to control who can communicate with them via Echo notifications, email, and user spaces.
  • Make global CheckUser tools work across projects, improving tools that match usernames with IP addresses and user agents so that they can check contributions on all Wikimedia projects in one query.
  • Improved blocking tools, including sockpuppet blocking tools.
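
As a minimal sketch of what the per-page and per-category blocks mentioned above could look like as data, the example below models a block as a record scoped to specific pages or categories and checked before an edit is saved. The class, field names, and check are illustrative assumptions, not MediaWiki's actual partial-block implementation or schema.

  # Conceptual sketch of a per-page / per-category ("partial") block record.
  # Names and fields are hypothetical, chosen only to illustrate the idea.
  from dataclasses import dataclass, field
  from datetime import datetime, timezone
  from typing import List, Optional, Set

  @dataclass
  class PartialBlock:
      user: str
      pages: Set[str] = field(default_factory=set)        # exact page titles
      categories: Set[str] = field(default_factory=set)   # category names
      expiry: Optional[datetime] = None                    # None = indefinite

      def active(self, now: Optional[datetime] = None) -> bool:
          now = now or datetime.now(timezone.utc)
          return self.expiry is None or now < self.expiry

      def applies_to(self, title: str, page_categories: List[str]) -> bool:
          return title in self.pages or bool(self.categories & set(page_categories))

  def may_edit(user: str, title: str, page_categories: List[str],
               blocks: List[PartialBlock]) -> bool:
      """True unless an active partial block covers this user and this page."""
      return not any(
          b.user == user and b.active() and b.applies_to(title, page_categories)
          for b in blocks
      )

  blocks = [PartialBlock(user="ExampleUser", pages={"Example article"})]
  print(may_edit("ExampleUser", "Example article", [], blocks))  # False: blocked on this page
  print(may_edit("ExampleUser", "Another article", [], blocks))  # True: may edit elsewhere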

Prioritization of work

Our projects are currently prioritized on the Anti-Harassment Phabricator workboard in the 'Epic backlog' column. We invite everyone to share their thoughts on our prioritization on Phabricator tickets, on this page's talk page, or by sending us an email.

Projects are prioritized by the product manager, taking into account:

  • Readiness — What is designed, defined, and ready for development? Are there any blockers?
  • Value — What will provide the most value to our users? What will solve the biggest problems, first? Has our research identified any exciting opportunities? Have our previous features identified any new opportunities or problems?
  • Feasibility — What can we accomplish given our time frame and developer capacity? Are we technically prepared? Is there external developer support that will accelerate our work?
  • Support — What has received support from the users who participate in the current workflows? What ideas have momentum from people currently affected by harassment on Wikipedia?

Policy Growth & Enforcement

In addition to building new tools, we want to work with our largest communities to ensure their user conduct policies are clear and effective and the administrators responsible for enforcing the policies are well-prepared.

Beginning with English Wikipedia, a large community that yields a wealth of data, we will provide contributors with research and analysis of how behavioral issues on English Wikipedia are a) covered in policy, and b) enforced in the community, particularly on the noticeboards where problems are discussed and actioned. We will also provide research on alternative ways of addressing specific issues, evaluate their effectiveness, and identify approaches that have found success on other Wikimedia projects. This will help our communities make informed changes to existing practices.

See also

References

  1. "Wikipedia:List of administrators/Active". 2017-02-13. 
  2. "Wikipedia:Harassment § Dealing with harassment.". 2017-02-12. 
  3. Duggan, Maeve (2017-07-11). "Online Harassment 2017". Pew Research Center: Internet, Science & Tech. Retrieved 2017-07-25. 
  4. a b "Algorithms and insults: Scaling up our understanding of harassment on Wikipedia – Wikimedia Blog". Retrieved 2017-02-13. 
  5. "Wikimedia Foundation receives $500,000 from the Craig Newmark Foundation and craigslist Charitable Fund to support a healthy and inclusive Wikimedia community – Wikimedia Blog". Retrieved 2017-02-13. 
  6. "File:Wikimedia Foundation grant proposal - Anti-Harassment Tools For Wikimedia Projects - 2017.pdf - Meta" (PDF). meta.wikimedia.org. Retrieved 2017-02-14.