
Grants:IdeaLab/Community review for potentially toxic users

Community review for potentially toxic users
Implement a flagging system that identifies potentially toxic users and allows the community to review flagged users, up to once a year
country: United States
theme: social factors
idea creator: Lonehexagon
this project needs: contact email, developer, advisor, project manager, researcher, volunteer
created on: 18:22, 28 July 2018 (UTC)

Project idea

What Wikimedia project(s) and specific areas will you be evaluating?

Is this project measuring a specific space on a project (e.g. deletion discussions), or the project as a whole?
English Wikipedia, focusing on efforts to reduce problematic behavior and harassment toward legitimate editors, especially new editors who don't know what recourse they have.

Describe your idea. How might it be implemented?

Provide details about the method or process of how you will evaluate your community or collect data. Does your idea involve private or personally identifying information? Take a look at the Privacy Policy for Wikimedia’s guidelines in this area.
I propose a flagging system that would trigger a community review of users who receive many flags within a certain period of time. The idea is to spread the power widely so that no single group of users can use the flagging system to target particular users. This would only affect registered user accounts.

Some possible ideas:

  • Allow users to be reported directly for harassment, rule-breaking, following other users around and reverting their changes, or other extremely rude behavior. If a user is flagged by enough people, perhaps 20 different users within a certain time period, that would trigger a community review of the account, up to once a year per user (a rough sketch of this threshold logic follows the list).
  • If an admin is reported multiple times within a certain time period, have it trigger a community review just like a review for a new admin. This could happen up to once a year per admin.
  • Anybody who reports a user wouldn't be allowed to vote in that user's community review. They also wouldn't be allowed to report the same user more than once.
  • IP users would be completely unaffected. To prevent abuse, they would not be allowed to flag or vote, but they also could not be flagged themselves.
  • Other signals that might trigger a review: heavy swearing in discussions (which often accompanies rude behavior), or reverting large numbers of edits (especially repeatedly reverting the edits of one particular user).
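
To make the threshold and eligibility rules above concrete, here is a minimal sketch of how the flag counting could work, written in Python purely for illustration. It is not an existing MediaWiki feature: the 90-day flag window and all class, function, and variable names are assumptions, while the 20-reporter threshold and the one-review-per-year limit come from the example numbers in the list.

    # Minimal sketch of the proposed flag-threshold logic. Illustrative only:
    # the window length and all names are assumptions, not a real extension.
    from dataclasses import dataclass, field
    from datetime import datetime, timedelta

    FLAG_THRESHOLD = 20                    # distinct reporters needed to trigger a review
    FLAG_WINDOW = timedelta(days=90)       # assumed "certain time period" for counting flags
    REVIEW_COOLDOWN = timedelta(days=365)  # at most one review per user per year

    @dataclass
    class Flag:
        reporter: str       # registered username doing the flagging
        target: str         # registered username being flagged
        timestamp: datetime
        reason: str

    @dataclass
    class FlagTracker:
        flags: list = field(default_factory=list)
        last_review: dict = field(default_factory=dict)  # username -> datetime of last review

        def add_flag(self, flag: Flag, registered_users: set) -> bool:
            """Record a flag; return True if it triggers a community review."""
            # IP users can neither flag nor be flagged; only registered accounts count.
            if flag.reporter not in registered_users or flag.target not in registered_users:
                return False
            # Each reporter may flag a given user at most once.
            if any(f.reporter == flag.reporter and f.target == flag.target for f in self.flags):
                return False
            self.flags.append(flag)
            return self._should_trigger_review(flag.target, flag.timestamp)

        def _should_trigger_review(self, target: str, now: datetime) -> bool:
            # No more than one review per user per year.
            last = self.last_review.get(target)
            if last is not None and now - last < REVIEW_COOLDOWN:
                return False
            # Count distinct reporters within the window, not raw flag volume.
            recent_reporters = {f.reporter for f in self.flags
                                if f.target == target and now - f.timestamp <= FLAG_WINDOW}
            if len(recent_reporters) >= FLAG_THRESHOLD:
                self.last_review[target] = now
                return True
            return False

        def eligible_to_vote(self, voter: str, target: str) -> bool:
            # Anyone who flagged a user cannot vote in that user's review.
            return not any(f.reporter == voter and f.target == target for f in self.flags)

Counting distinct reporters, rather than raw flag volume, and barring reporters from voting are what spread the power around so that no single group can weaponize the system.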

I would love to involve people who are familiar with successful flagging systems, so they can share their ideas and experiences. I've heard there was a user review process on Wikipedia that was shuttered in 2014, though from what I understand there was no way to take any action after a review was over. I believe there need to be potential consequences for users who are found to regularly engage in toxic behavior, but it's critically important to be thoughtful about this in order to prevent abuse or overreach.

There should be warnings. Perhaps after someone receives a warning, they would have to take a short quiz about proper editing practices before their account is unlocked, or something along those lines. The main idea is to stop the bad behavior through education and to keep the editors if possible. This is about reform and improving the community, not punishment and banning users.

Are there experienced Wikimedians who can help implement this project?

If applicable, please list groups or usernames of individuals who you can work with on this project, and what kind of work they will do.
I volunteer myself and would love to get help from anyone who loves Wikipedia and would like to improve the community experience.

How will you know if this project is successful? What are some outcomes that you can share after the project is completed?

This project would be successful if it triggered a significant number of community reviews that resulted in an action, such as an official warning or possibly a ban, for users with particularly bad or recurrent detrimental behavior. Even better would be if users who go through the process and receive a warning stop getting flagged, indicating that they have become better community members.

How would your measurement idea help your community make better decisions?

After you are finished measuring or evaluating your Wikimedia project, how do you expect that information to be used to benefit the project?
I expect that reviewed users would be more likely to try to be civil, join collaborative discussions, stop patrolling pages and reverting every edit they don't like (regardless of its legitimacy and sourcing), and stop following particular users around and harassing them.

Do you think you can implement this idea? What support do you need?

Do you need people with specific skills to complete this idea? Are there any financial needs for this project? If you can’t implement this project, can you scale down your project so it is doable?
A flagging system would need to be implemented. A Community Review section of the site would need to be created, similar to Articles for Deletion (AfD). There needs to be a way to track whether a user has been reviewed before, just as repeat AfD nominations of the same article are tracked. There would also need to be a way to apply some kind of action to users found to be toxic, perhaps only after more than one review; otherwise the reviews would be meaningless.
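
As a rough illustration of the tracking piece, review history could be stored per user along the following lines. This is a hypothetical sketch, not an existing tool; the names and the "escalate only after more than one review" rule are assumptions drawn from the paragraph above.

    # Hypothetical sketch of per-user review tracking, analogous to how repeat
    # AfD nominations of the same article are numbered. Illustrative only.
    from dataclasses import dataclass, field
    from datetime import datetime
    from enum import Enum

    class Outcome(Enum):
        NO_ACTION = "no action"
        WARNING = "official warning"
        BAN = "ban"

    @dataclass
    class ReviewRecord:
        closed_on: datetime
        outcome: Outcome

    @dataclass
    class ReviewHistory:
        by_user: dict = field(default_factory=dict)  # username -> list of ReviewRecord, oldest first

        def record(self, username: str, review: ReviewRecord) -> None:
            self.by_user.setdefault(username, []).append(review)

        def review_count(self, username: str) -> int:
            """Number of prior reviews, e.g. to title a page '... (2nd review)'."""
            return len(self.by_user.get(username, []))

        def may_escalate(self, username: str) -> bool:
            """Allow actions stronger than a warning only after more than one review."""
            return self.review_count(username) > 1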

Get Involved

About the idea creator

I've been editing Wikipedia for a few years, and while it has mostly been a delight to write, share information, and participate in discussions, I've had a few negative experiences with just a few users that almost caused me to quit. I ended up walking away from those situations. Trying to make legitimate changes when someone is willing to revert all of your edits without being willing to participate or discuss is just too stressful. There are articles I know contain a great deal of incorrect information, but I'm too scared to try to edit them because certain users watch those pages and revert every edit that doesn't fit their point of view, no matter the evidence to the contrary. Since this almost made me quit, even though I love editing Wikipedia, I'm sure some of these users are causing potential new editors to quit out of frustration. I feel this is especially common on topics related to race and religion, and on articles about women or women's issues.

Participants

Endorsements

  • Support I think this is actually one possible application of the EPR (Earned Public Reputation) system that I proposed separately. I don't like the focus on "toxic" users, but if someone has earned a negative EPR, then this could be a mechanism for helping them improve their contributions. If the "toxic" user doesn't want to change, well, that's covered, too. Here's a link to that idea: https://meta.wikimedia.org/wiki/Grants:IdeaLab/Multidimensional_EPR_(Earned_Public_Reputation) Shanen (talk) 22:00, 28 July 2018 (UTC)

Expand your idea

Would a grant from the Wikimedia Foundation help make your idea happen? You can expand this idea into a grant proposal.

Expand into a Rapid Grant

No funding needed?

Does your idea not require funding, but you're not sure about what to do next? Not sure how to start a proposal on your local project that needs consensus? Contact Chris Schilling on-wiki at I JethroBT (WMF) (talk · contribs) or via e-mail at cschilling(_AT_)wikimedia.org for help!