Community Wishlist Survey 2021/Archive/Harrassment and micro agression notification


Harassment and microaggression notification

  • Problem: continuing and recurrent microaggressions towards editors happen daily on all projects.
  • Who would benefit: all editors would benefit from a friendlier atmosphere, especially underrepresented communities. People might start to pay more attention to the way they formulate their ideas and advice if they knew that a simple procedure existed to track and report such behavior.
  • Proposed solution: a notification procedure, available in the sidebar on all projects in all languages, with which editors affected by rude language, derogatory terms and microaggressions could generate a notification, possibly by mailing reports directly to an OTRS queue, with a copy to Trust and Safety, and recording them in a non-public database which could be consulted by selected members of each community (see the sketch after this list).
  • More comments:
  • Phabricator tickets: https://phabricator.wikimedia.org/T268366
  • Proposer: Nattes à chat (talk) 19:36, 20 November 2020 (UTC)[reply]
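To make the proposed flow more concrete, here is a minimal sketch in Python of one possible routing: the report is composed as an email to an OTRS queue with a copy to Trust and Safety, and recorded in a non-public database readable only by selected community members. Every address, table name and field below is a hypothetical placeholder chosen for illustration; no such interface exists today.

# Hedged sketch of the proposed notification flow. All addresses and the
# database schema are assumptions for illustration only.
import sqlite3
from datetime import datetime, timezone
from email.message import EmailMessage

OTRS_QUEUE = "reports@example.org"      # hypothetical OTRS queue address
TRUST_AND_SAFETY_CC = "ts@example.org"  # hypothetical Trust and Safety copy

def build_notification(reporter, diff_url, description):
    """Compose the email that the sidebar link would generate."""
    msg = EmailMessage()
    msg["From"] = reporter
    msg["To"] = OTRS_QUEUE
    msg["Cc"] = TRUST_AND_SAFETY_CC
    msg["Subject"] = f"Conduct report concerning {diff_url}"
    msg.set_content(description)
    return msg

def record_report(db_path, reporter, diff_url, description):
    """Store the report in a non-public database consultable by selected members."""
    with sqlite3.connect(db_path) as db:
        db.execute(
            "CREATE TABLE IF NOT EXISTS reports "
            "(received TEXT, reporter TEXT, diff_url TEXT, description TEXT)"
        )
        db.execute(
            "INSERT INTO reports VALUES (?, ?, ?, ?)",
            (datetime.now(timezone.utc).isoformat(), reporter, diff_url, description),
        )

if __name__ == "__main__":
    diff = "https://xx.wikipedia.org/w/index.php?diff=12345"  # placeholder diff link
    report = build_notification("editor@example.org", diff, "Derogatory language in an edit summary.")
    record_report("reports.db", "editor@example.org", diff, "Derogatory language in an edit summary.")
    print(report)

The sketch is only meant to show that the proposal has two separable parts: routing the notification to the people who handle it, and keeping a private record that trusted community members can later consult.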

Discussion

  • I support this, or a mechanism like this, to make harassment / abuse / discrimination reporting much more accessible and to make sure that the reports are documented and go through the proper channels. RachelWex (talk) 21:18, 20 November 2020 (UTC)[reply]
  • I support this as a generic 'report' button next to every comment (maybe as part of Discussion Tools) that allows for quick reporting of offensive language to admins/other users. HappyMihilist (talk) 05:55, 21 November 2020 (UTC)[reply]
  • I'd expect the signal-to-noise ratio to be a problem. Probably the most common source of complaints would be people upset that their edit was (correctly) reverted and/or that someone tried to explain what was wrong with their editing. Users who are on the road to being blocked often complain that everyone and every comment is somehow abusive towards them. Alsee (talk) 06:24, 21 November 2020 (UTC)[reply]
    • Yup, which is why I propose that the complaints would not be put in a public database. However, although such a system might be distorted and used against the people who complain in the first place, the problem is that there is no procedure to moderate microaggressions efficiently in our communities, and this is driving people away by creating a toxic climate. There is a contradiction between affirming that everybody is welcome and can edit, and allowing such toxic behaviors to thrive. Nattes à chat (talk) 10:38, 21 November 2020 (UTC)[reply]
  • I strongly support such a mechanism, which needs not only a technical solution but also better professional and volunteer resources (including training). Kvardek du (talk) 11:58, 21 November 2020 (UTC)[reply]
  • Strong support. It needs more details (to be discussed, I guess). I think an OTRS system would be good, and a team of trustworthy volunteers should handle notifications in their native language(s), maybe through cooperation between the T&S team and a volunteer team on OTRS. Also, we need to find a way to avoid the "background noise" that Alsee described above. - .Anja. (talk) 16:33, 21 November 2020 (UTC)[reply]
  • +1. Thank you for this proposal--JMGuyon (talk) 17:58, 21 November 2020 (UTC)[reply]
  • +1, one of the hardest unsolved problems on this platform. Microaggressions rarely warrant or result in warnings or punitive measures, but they have an obvious negative impact on community editing. — Shushugah (talk) 21:25, 21 November 2020 (UTC)[reply]
  • I think such a system will clash with the existing reporting venues on each wiki (i.e. you'll see complaints about a user ending up both on OTRS and on en:WP:ANI and its equivalents) unless it can be configured to redirect complaints to a configurable venue. Jo-Jo Eumerus (talk, contributions) 08:55, 23 November 2020 (UTC)[reply]
  • Strongly against. There's no scientific validation for the concept of microaggression and no way of establishing standards; it's a tool to further politicize and corrode WP. We need less administrative overhead and minder control, be it self-appointed or community-led, not more. --Tickle me (talk) 21:47, 24 November 2020 (UTC)[reply]
  • Strongly oppose this Orwellian proposal. Wikipedia does not need to become even more bureaucratic and, as an open project, should not have a secret database of user behavior. What's next? Mandatory behavior modification and reeducation camps for editors? Disagreements between contributors, including their behavior, can and should be resolved in public on talk pages. --Ehn (talk) 03:26, 25 November 2020 (UTC)[reply]
  • Procedurally - I don't see how this can happen without a discussion with the community. I think this is basically a policy proposal and should be disqualified on those grounds. I am also concerned about having another shadow system of "governance" that would result in w:en:WP:FRAM all over again - where the evidence received cannot be reviewed or responded to by the alleged perpetrator. --Rschen7754 06:26, 25 November 2020 (UTC)[reply]
  • +1 --El Pantera (talk) 11:32, 25 November 2020 (UTC)[reply]
  • +1. We need some action about the continuous low-grade harassment that passes for discussion and provides cover for incivility. We need a trained response team to deal with anti-social, annoying users that harm the project, i.e. Fram and his enablers. Slowking4 (talk) 02:20, 27 November 2020 (UTC)[reply]
    • Hello Slowking4, may I respectfully request that you review your comment and revise it so that it focuses on questions and possible concerns about the proposal? Tagging other users as "anti-social" is inappropriate. I am requesting this revision so that current and future participants feel that they can participate in the discussion without having to make or receive unfriendly remarks or personal attacks such as this. Happy new year in advance. T CellsTalk 19:27, 3 December 2020 (UTC)[reply]
"anti-social" is appropriate. that was not a personal attack, that was an accurate description of that banned admin's behavior. it is important to give examples of which behavior is uncivil yet tolerated by a clique of admins. see also [1] Slowking4 (talk) 20:57, 7 December 2020 (UTC)[reply]
  • Oppose. I think the cost would outweigh the benefit. There would be a massive number of meaningless complaints that would effectively stifle any actual action against more major problems. People are imperfect, and generally, pointing fingers doesn't make things much better. Mulstev (talk) 07:00, 28 November 2020 (UTC)[reply]
  • I am against it. At most, if the reports were public and referred to arbitration committees, why not. Making the reports private would, in my opinion, prevent an important step back from the dispute. --TechAcquisitor (talk) 22:24, 28 November 2020 (UTC)[reply]

I totally and utterly oppose this proposal. I think if people actually properly understood what it meant, it would have been deleted. We would not tolerate a proposal which discriminated against Wikipedians for being gay or black. We wouldn't even allow such a proposal. Yet, the scientific and medical evidence shows unequivocally that this proposal is functionally identical. It discriminates against people for states of being, in exactly the same way as homophobia and racism discriminate. I am one of those discriminated against. This is scientist-phobic behaviour. It discriminates against me for being a scientist. It discriminates against people for being scientific as a personal characteristic, like sexuality or skin colour. Scientific people tend to be blunt and logical. This sometimes upsets people who aren't like that. As you can see from the way I am saying this, I am outspoken. I don't make personal attacks on Wikipedia, but sometimes it is necessary to tell someone they are wrong quite directly. An example would be where I edited the Astrid Peth article and told a Dutchman that I didn't think he should be debating the meaning of words in the Welsh language. He clearly didn't speak it and I am fluent; it is the native language of my homeland. If you want good articles you need competent writers. This will really damage scientific editing. It is also a very America-centric view. There seems to be a rise in "offence culture" in the USA. Incidentally, I should say that there is a political divide on this. I don't match that. I am not right wing at all. If I lived in the USA I would be a Bernie Sanders man. I wouldn't describe some people as "snowflakes" as some in the USA do. The evidence on the effect of microaggressions is, very charitably expressed, poor at best. This is like creationism versus evolution or flat vs round earth. The evidence clearly shows what is true. Get an expert. This is a subtle medical matter and you don't decide these by a vote. Reject this proposal. It is bad. Here is an article on the science from one of the websites run by the journal Nature, one of the leading science journals in the world: https://www.natureindex.com/news-blog/scientists-are-curious-and-idealistic-but-not-very-agreeable-compared-to-other-professions I don't think secret, Spanish Inquisition type lists are a good idea either. Neilj (talk) 12:43, 29 November 2020 (UTC)[reply]

  • "Microagression" is a neologism for rudeness. Rudeness is unfortunate and I certainly don't support or practice it, but it is also entirely subjective and frequently boils down to a clash between styles of communication, such as factual vs. non-factual. Trying to politicize and legislate rudeness would be a very slippery slope toward censorship, which is why it should not be done. Instead, what we should do is deal with instances of rudeness the way people have always done: through communication. If a particular user's behavior becomes disruptive, there are already mechanisms to address that. Silver hr (talk) 17:58, 30 November 2020 (UTC)[reply]
  • This is a good proposal. Any emails to this system that aren't legitimate could be sent a templated message back -- it isn't that time-consuming. Meanwhile, legitimate emails could be dealt with. I specifically like the idea of a system where people could just ask for opinions on what to do or say (OTRS does this already to a certain extent, but it isn't exactly what it's meant for). Sam-2727 (talk) 03:10, 1 December 2020 (UTC)[reply]
  • Opposed. In contested areas of the project, this will become just another way that editor warriors try to get their opponents. Since each and every report will have to be assessed by a human, "success" of the proposal will mean a permanent long backlog. There are already mechanisms for reporting editors whose behavior is unacceptable, and I think it is a good thing that making such a report takes some effort and allows some time to cool down. Zero0000 (talk) 09:01, 1 December 2020 (UTC)[reply]

Hi all. Thanks for the proposal and the discussion. I work with the Community Tech team and the Anti-Harassment Tools team so I'm in a position to speak to this. Specifically, I will archive this wish as out of scope for Community Tech. However, I can report that the Anti-Harassment Tools team has been researching how to build a per-wiki configurable system of reporting, managing reports, and sharing results of those reports of harassment issues of various degrees. We (the Anti-Harassment Tools team) hope to work on this in 2021. To that end, there will be a lot of community consultation, research reports, UX testing, and other ways to get involved in that conversation. I can't address some of the opposition here because it appears to be directed specifically at the notion of "microaggression." Our goal on the team is to build a tool that allows each community to manage harassment in the way they see fit with strong support for best practices. Again, this wish is out of scope for the Community Tech Wishlist but what I see as its spirit will be worked on by the Anti-Harassment Tools team hopefully in 2021. --AEzell (WMF) (talk) 00:38, 2 December 2020 (UTC)[reply]