Grants:IdeaLab/Cumulate Likes and Unlikes to automatise harassment limitations/bn
What kind of problem are you trying to solve?
Repeated harassment by abusive users on particular articles and pages.
What is your solution?
There is already a suspension and banning system in place for article editing, but it is operated by people and decided manually through dedicated discussions. A better system would let a user earn "Like" and "Unlike" votes with short comments, submitted by any reader or editor. If the pace of "Unlike" votes for a single user exceeds a certain threshold, that user could be limited by restricting their editing rights for a set time period, with the period potentially increasing if the "Unlikes" keep accumulating on some or all of the pages they edit. A user's score should also be visible to everyone, so that people know whether the user is trustworthy. There would of course still be discussions to choose the type of limitation, but this method of detection and automation could graduate the limitations, relieve patrollers, and act as a self-limiting check on disruptive behavior. Such a framework would deter abusive users from acting on impulse, because they would fear the consequences of a bad reputation.
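The threshold-and-escalation logic described above could be sketched as follows. This is only an illustration of the proposed rule, not an existing MediaWiki feature; all names, thresholds, and durations here are hypothetical.

```python
from datetime import datetime, timedelta

# Hypothetical parameters -- the actual values would be decided by the community.
UNLIKE_THRESHOLD = 10          # Unlikes within the window that trigger a restriction
WINDOW = timedelta(days=7)     # sliding window over which the pace of Unlikes is measured
BASE_RESTRICTION = timedelta(days=1)

def restriction_for(unlike_times, now, prior_restrictions=0):
    """Return the editing restriction to apply, or None if below the threshold.

    unlike_times: datetimes at which the user received Unlike votes.
    prior_restrictions: how many times the user was already restricted;
    each repeat doubles the duration, modeling the proposal's idea of
    increasing the limitation when Unlikes keep accumulating.
    """
    recent = [t for t in unlike_times if now - t <= WINDOW]
    if len(recent) < UNLIKE_THRESHOLD:
        return None
    return BASE_RESTRICTION * (2 ** prior_restrictions)
```

A first offense at 12 Unlikes in a week would yield a one-day restriction; a third offense at the same pace would yield four days. The exact escalation curve is a design choice, not something the proposal pins down.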
For the user rating system to function correctly, there would need to be two types of rating scores.
- An overall user rating score showing the total number of Likes and Unlikes the user has accumulated over their lifetime.
- A per-article, single-page "Like" and "Unlike" score based on each edit the user commits to that page's history.
Making decisions based only on a user's overall score is not an accurate way of judging a user, because the user may have earned many Likes from a large body of outstanding writing and editing, yet still be biased about a single article (for whatever reason) and manipulate that page to their liking. That is why the second, more specific, page-level rating is needed, so that others can restrict the targeted abusive behavior.
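The two rating levels could be modeled as in the sketch below. The class and method names are illustrative assumptions, not an existing API; the point is only that the same vote feeds both tallies, so a contested page can stand out even when the lifetime score is strong.

```python
from collections import defaultdict

class UserRating:
    """Tracks the two rating levels the proposal describes:
    an overall lifetime tally, and a per-page tally keyed by page title.
    """
    def __init__(self):
        self.overall = {"likes": 0, "unlikes": 0}
        self.per_page = defaultdict(lambda: {"likes": 0, "unlikes": 0})

    def vote(self, page, like):
        """Record one Like (like=True) or Unlike (like=False) on a page."""
        key = "likes" if like else "unlikes"
        self.overall[key] += 1
        self.per_page[page][key] += 1

    def page_score(self, page):
        """Net Likes minus Unlikes for a single page."""
        counts = self.per_page[page]
        return counts["likes"] - counts["unlikes"]
```

A user with hundreds of lifetime Likes can still show a sharply negative `page_score` on one disputed article, which is exactly the signal the per-page system is meant to surface.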
One problem this "Like" and "Unlike" functionality could create is that the tool itself may become a method of harassing other users. To address this, a user could request a review of their rating score: if enough people find that the Unlikes the user received were incorrectly assigned and were themselves an act of abuse, they can vote to remove them. In general, though, the rating system should be accurate enough, unless Wikipedia ends up with more abusive people than honest ones.
- To limit and minimize abuse from users who harass others or make edits unrelated to the subject of the article.
- Viewers and editors should have a way of assessing whether a user is trustworthy.
About the idea creator
- Excellent idea which works well on many websites.
- This is a great idea. It is superior to the system we have now, where a person can continually wipe clean their talk page, often under the guise of "archiving", to hide their complaints, conflicts and misdeeds. Anticla rutila (talk) 06:54, 3 June 2016 (UTC)
Expand your idea
Would a grant from the Wikimedia Foundation help make your idea happen? You can expand this idea into a grant proposal.