Research:Online Community Conduct Policies
This research project summarizes conduct policies from online communities, with an eye to their relevance to the Wikimedia community. The focus is on volunteer-led communities that curate or manage user-created content, but the project will also cover other kinds of communities, such as gaming sites and other user-submitted-content platforms.
Online communities, including those in the Wikimedia movement, have struggled to form conduct policies that are fair to users, inclusive of cultural differences, welcoming to marginalized groups, and practically enforceable. As communities age and their practices evolve, there is much knowledge and experience to be shared. This project is an attempt to start collating and summarizing that knowledge.
Benefits to the Wikimedia community
These reports give us insight into how other websites are handling some of the same problems that our communities face, so that we can evaluate and potentially learn from their techniques (as well as their mistakes). Wikimedia communities often form their policies and practices from the ground up; by analyzing both the policies of these communities and how they developed, this resource can help inform that process.
How to engage with this project
Each report has a section for "Strengths" and "Weaknesses"; users are encouraged to use these sections as discussion areas for the policies and approaches summarized in the report. Suggestions for changes, criticisms, and ideas/sources for expansion are welcome on the report talk pages.
Policies being studied
Arts & entertainment communities
Full analysis at Research:Online Community Conduct Policies/DeviantART
DeviantART is a website where users can submit and collate their artwork in a variety of media. Users can comment on, favorite, and download others’ work; can follow other users’ submissions; and can gather various works into their own collections, which can in turn be followed and featured on the site. The site's Etiquette Policy emphasizes that while DeviantART does not impose “quality standards” on work or contributions, it does have a number of behavioral expectations of the users themselves. Behavior considered "prohibited" in the Etiquette Policy includes racism, spamming, flooding, and failure to supply a model release form for photography work when requested; "discouraged" behaviors include personal arguments, other volatile conversations, and accusations against other users.
All users have the ability to block other users from interacting with them; more severe or widespread misbehavior can be reported to the DeviantART Help Desk, where behavioral issues are handled by a seven-member paid Community Operations team. The members of this team have the ability to suspend or ban user accounts and to remove content from the website, though they generally do so only in the most severe cases.
Full analysis at Research:Online Community Conduct Policies/Goodreads
Goodreads is a user-submitted book review and rating community, launched in 2007. The website was acquired by online retailer Amazon in 2013. The primary activity on the site is composing book reviews, or rating titles through a “five star” ratings system. Users can also create user-curated lists of titles called “Shelves”. User interaction happens on the site’s “Q&A” sections, where users can post questions about a specific title, and other users can answer them. In addition, the Goodreads Groups feature allows people to form groups around specific topics, genres, or authors, and interact on their Group’s discussion boards. Users can also interact with authors who have accounts on the site through the “Ask the Author” feature. Goodreads staff perform the content and behavior moderation roles at the site, with a heavy reliance on user reports through its prominent flagging buttons.
Full analysis at Research:Online Community Conduct Policies/Pinterest
Pinterest describes itself as a “visual bookmarking tool”. Founded in 2010, it is essentially a virtual pinboard onto which users can clip, save, categorize, and share images and links. It is a primarily visual medium; while users can comment on pins, there are no centralized discussion forums. The site bars hate speech, incitement to violence, self-harm, impersonation, harassment, and cyberbullying. The majority of its conduct policies specify that they protect only non-public individuals, and each of its policies also includes discussion of what types of content do and do not fall afoul of the policy.
All users have the ability to follow and unfollow content boards and categories, as well as to block other users from interacting with them; more severe or widespread misbehavior can be reported to Pinterest staff, who have the ability to remove pins and sanction user accounts.
Full analysis at Research:Online Community Conduct Policies/4Chan
4chan is a discussion forum and image/meme-sharing site, started in 2003. 4chan posters were instrumental in making or spreading many of the popular memes of the mid-2000s, including “lolcats”, “rickrolling”, and videos like “Chocolate Rain”. The site has been criticized for hosting pornography, images of graphic violence, and overtly racist discussions; it has also played a role in online “cybermobbing” and harassment incidents, and has drawn criticism over several incidents in which its users engaged in criminal activity. 4chan relies on volunteer moderation and takes a notably "hands-off" approach to user conduct. Due to the site's architecture, problematic posts are often "pushed" or "archived" away from public view quickly - a form of automatic moderation.
Full analysis at Research:Online Community Conduct Policies/Metafilter
Metafilter is a privately owned internet link aggregation-and-discussion site (it describes itself as a “community weblog”). The site is considered atypical, even anachronistic, among social media sites in a number of ways: it has been privately owned since its founding in 1999, its user experience and visual design have not substantially changed since then, and membership in the site is not free.
Metafilter’s seven-person moderation staff prefers to review behavior in context and thus the site provides only general guidelines for user behavior. The intention of this approach is not to have no behavioral expectations of users, but rather to allow the moderation team to judge cases individually and use their experience and judgment about what action will serve best in a given situation. Despite policy being largely unspoken, user behavior on the site is nevertheless quite actively moderated, both socially by site users (using flags and peer pressure) and technically by moderation staff (using bans and comment edits/deletion). Metafilter is unusual in explicitly allowing banned users to create new accounts and rejoin the site without having to appeal the previous ban or disclose their account history to the community.
Full analysis at Research:Online Community Conduct Policies/Quora
Quora is a question-and-answer website where questions are asked, answered, edited and organized by its community of users. The site relies on a founding principle of "be nice, be respectful", a policy which prohibits personal attacks, harassment, racial slurs, and hate speech. Users report content which violates site policies through a "contact us" form. Third-party reporting of problematic content is encouraged, and reports are reviewed and acted upon by paid Quora staff moderators. These moderators have a variety of tools available to them to deal with problematic users and content, including user-side (blocks from editing, bans from the site) and content-side (editing, hiding, and deleting content in questions and answers).
Full analysis at Research:Online Community Conduct Policies/Reddit
Reddit is a forum-style community of contributors who share interesting posts and memes, discuss current events and politics, and post breaking news stories. It was founded in 2005 and purchased by the magazine publisher Condé Nast in 2006. Reddit is moderated by volunteers and has a small paid staff based in San Francisco. Content is organized into “subreddits” on specific topics, and subreddits can have their own content and behavioral guidelines.
Full analysis at Research:Online Community Conduct Policies/Tumblr
Tumblr is a microblogging platform with heavy reliance on social networking among blogs and users. The site's policies prohibit (among other things) hate speech, harassment, doxxing, and content glorifying self-harm. Users have the ability to block one another in cases of individual-targeted misbehavior; more serious conduct violations are reported to the site's Trust & Safety team via a webform. The site does not provide any information about how this team judges or handles these reports, though an external document written with the cooperation of Tumblr staff claims that all reports made via this method are reviewed individually and that subsequent actions may include content deletion or account suspension.
Riot Games (League of Legends)
Riot Games provides the software and community management for the PC-based game League of Legends (LoL). LoL is an online multiplayer “arena”-based game and was, in 2012, the most played PC game in North America and Europe. It has had up to 7.5 million players online concurrently. Riot Games has been named by Danielle Citron, a noted online harassment expert, as having “much success” in curbing abuse on its platform.
League of Legends utilizes a peer-review conduct system called the "Tribunal". It allows players to report other players for violations of the game's policies, and to have those actions reviewed by a designated team of fellow players. The platform has reported a significant reduction in harassing speech and recidivism among its user base.
Full analysis at Research:Online Community Conduct Policies/Rust
Rust is a relatively new programming language, created in 2006 by a Mozilla employee. Though the language and its development are sponsored by Mozilla, it is administered by an active and diverse community largely made up of volunteers. Decisions are generally made through a “Request for Comment” process that occurs on GitHub.
The Rust community has a dedicated Moderation Team, which stands on equal footing with other, more technical teams such as the Core, Language Design, and Compiler teams. The Moderation Team is charged with enforcing Rust's Code of Conduct, which sets out the project's welcoming of users of all types, the behavioral expectations all users are expected to abide by, and the process by which problematic behavior is handled.