Controversial content/Board report

  • Report from the Controversial Content Working Group to the WMF Board, March 2011

Background

The Controversial Content Working Group was tasked with examining the recommendations of the Harris report on controversial content in greater detail, summarizing Board feedback on the issue, and making a recommendation on the report to the full Board. The Harris report was developed in public and received a great deal of community input. The Board discussed the report in October 2010.

The membership of the working group went through two iterations: Jan-Bart, Kat, and Phoebe (Fall-Winter 2010); then Phoebe, Matt, Bishakha, and Jimmy (Winter-Spring 2011). Our consultants, Robert and Dory Harris, also participated. The working group collected and summarized Board member feedback, examined the recommendations in light of that feedback, gathered some new ideas that were not articulated in the report, and asked WMF technical staff to mock up a potential user image hiding feature. The group met several times by conference call.

Recommendations to the Board

Overall, we recommend that the Board issue a statement to the community that includes both principles and actionable items, and that it direct the WMF to develop a personal image hiding feature, as described below. We aim to take a respectful and constructive tone, fulfill expectations of leadership, and promote useful, actionable items. In detail, we recommend that the Board do the following:

  • Issue a statement of principles: The Harrises included several principles in their report that had broad Board support; it seems appropriate to issue a framing statement of principles. Our suggested statement combines several principles articulated in the Harris report, including recommendations 1 & 2 (no changes to text) and 11 (the principle of least astonishment), and adds language that mirrors the original Board resolution on the subject.
  • Issue recommendations for continued community action: We suggest urging the community to continue actively reviewing and curating (especially controversial) content. This is a re-wording of recommendations 4, 5 & 6 (reviewing sexual images) that is more inclusive of all kinds of controversial content and that recognizes that content curation is part of ongoing work on all projects. We frame this as a continued call to action.
  • Issue a recommendation on personality rights: Additionally, we recommend that the community strengthen the guideline Photographs of identifiable people and apply it more rigorously, perhaps elevating it to policy. This guideline is relatively non-controversial (part of it was incorporated, with consensus, into the recent failed sexual content policy), but it has rarely been applied. We feel that more rigorous application of this guideline is a worthy end in itself for all sorts of images: we should make more of an effort to respect the privacy of image subjects. (As a practical matter, we are particularly worried about images uploaded from Flickr. We have no good way of tracking down the authors of these images, of confirming that their subjects are willing to be exposed to a global audience, or even of confirming that the authors meant to use a free license; these photos are often among our most questionable.) Note: this was not part of the Harris report.
  • Develop an image hiding feature: We asked the WMF to mock up an image hiding system, as described at mw:Personal image filter, following recommendations 7 & 9 (image hiding feature & NSFW button). We support the direction of this feature and recommend that we develop and roll it out; a rough, illustrative sketch of how such a filter might behave for readers follows this list. We came to consensus on the following points of discussion:
    • The interface and the category system should use absolutely neutral language whenever possible -- not "controversial" or "objectionable" or the like.
    • We are focused on reader choice and on including all potentially controversial content (violent, sexually explicit, etc.) in the category scheme -- not solely sexual content.
    • The interface link for readers to change their preferences should be clear and easily discoverable (above the fold), as well as easy to use; otherwise we will miss a good deal of our intended audience.
  • Recommend continued development and discussion: Finally, we suggest broadly encouraging the development of more tools for Commons, in good faith, as necessary for the health of the project; and encouraging continued discussion of the scope of Commons (perhaps the hardest and most broadly important of all of these questions). The working group does not have a specific recommendation on the question of the scope of Commons.
  • Take limited action on the controversial recommendations: Recommendations 3 (WikiJunior); 4, 5 & 6 (reviewing specific categories of images with specific exclusions/guidelines); and 10 (no third-party tagging systems) were the most controversial recommendations from the report among the Board. We advise referring WikiJunior to the community at this time and rewording the reviewing recommendations as described below. We have no specific recommendation on 10; there seem to be a few reasonable stances on this issue (related to not supporting censorship on the one hand and supporting free distribution of data and metadata on the other). The Harrises have offered further clarification if the Board wishes to discuss this.
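
As a concrete illustration of the points above, the following is a minimal, hypothetical sketch of how a category-based personal image filter might behave on the reader's side. It is not the design described at mw:Personal image filter: the category labels, the per-image data attribute, and the storage key are invented for this example only. The sketch tries to show the properties we agreed on: reader choice, neutral labelling, and the fact that a hidden image is only collapsed behind a placeholder, never removed.

    // Illustrative sketch only (TypeScript): a minimal client-side "personal image
    // filter". The category names, the data attribute, and the storage key are
    // assumptions made for this example; they do not describe the actual
    // mw:Personal image filter design.

    // Categories a reader may choose to hide; labels are deliberately neutral and
    // descriptive ("sexually-explicit", "violence"), not "objectionable" or the like.
    type FilterCategory = "sexually-explicit" | "violence";

    interface ReaderPreferences {
      hiddenCategories: FilterCategory[];
    }

    const STORAGE_KEY = "personal-image-filter"; // hypothetical storage key

    // Preferences live in localStorage so the filter works for unregistered
    // readers as well as logged-in ones.
    function loadPreferences(): ReaderPreferences {
      try {
        const raw = window.localStorage.getItem(STORAGE_KEY);
        return raw ? (JSON.parse(raw) as ReaderPreferences) : { hiddenCategories: [] };
      } catch {
        return { hiddenCategories: [] };
      }
    }

    function savePreferences(prefs: ReaderPreferences): void {
      window.localStorage.setItem(STORAGE_KEY, JSON.stringify(prefs));
    }

    // Collapse any image whose declared categories overlap with the reader's
    // hidden categories. Nothing is removed: one click restores the image, so no
    // content is permanently denied to any reader (compare Recommendation 8).
    function applyFilter(prefs: ReaderPreferences): void {
      const images = document.querySelectorAll<HTMLImageElement>("img[data-filter-categories]");
      images.forEach((img) => {
        const categories = (img.dataset.filterCategories ?? "").split(" ");
        const shouldHide = categories.some((c) =>
          prefs.hiddenCategories.includes(c as FilterCategory)
        );
        if (!shouldHide) return;

        // Replace the image with a neutral placeholder; clicking it restores the image.
        const placeholder = document.createElement("button");
        placeholder.textContent = "Image hidden by your settings (click to show)";
        placeholder.addEventListener("click", () => placeholder.replaceWith(img));
        img.replaceWith(placeholder);
      });
    }

    // Example: a reader opting to hide sexually explicit images.
    savePreferences({ hiddenCategories: ["sexually-explicit"] });
    applyFilter(loadPreferences());

In practice such a filter would hook into the existing Commons category system rather than a per-image attribute, and the control for changing these preferences would need to sit prominently in the reader interface, per the points above.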

We would like to thank Robert and Dory Harris for all of their hard work over the past year, and the staff, particularly Brandon Harris, for mocking up the image filtering system.

Specific actions on each Harris Recommendation

Recommendation 1

That no changes be made to current editing and/or filtering regimes surrounding text in Wikimedia projects. Current editorial procedures, including the procedure to modify those regimes if needed, are adequate, in our opinion, to balance the twin Wikimedia principles of access to information and service to readers.

  • Proposed Action: Issue a statement of principles
Recommendation 2

That, therefore, no changes or filters be added to text on current Wikimedia projects to satisfy the perceived needs of children.

  • Proposed Action: Issue a statement of principles
Recommendation 3

That, however, the Foundation investigate the creation of a “WikiJunior” version of the Wikipedias, aimed at children under the age of 12, either as a stand-alone project or in partnership with existing and appropriate educational institutions.

  • Proposed Action: Take limited action on this, refer WikiJunior to the community
Recommendation 4

That Commons editors and administrators review the application of the existing Commons policy on educational scope to images of nudity in Commons, where breasts, genital areas (pubis, vulva, penis) and/or buttocks are clearly visible, and the intent of the image, to a reasonable person, is merely to arouse, not educate, with a view to deleting such images from Commons.

  • Proposed Actions:
    • a) Issue recommendations for continued community action, rewording recommendation
    • b) Take limited action on this
Recommendation 5

That historical, ethnographic and art images be excluded from such a review, and be considered in virtually all cases to be in scope.

  • Proposed Actions:
    • a) Issue recommendations for continued community action, rewording recommendation
    • b) Take limited action on this
Recommendation 6

That consideration be given by Commons editors and administrators to adopt policies of active curation within categories of images deemed controversial (sexual, violent, sacred) which would allow for restriction of numbers of images in a category, active commissioning of images deemed needed, but absent from the category (line drawings of sexual positions, eg.), such policies not necessarily to be applied to images not deemed controversial.

  • Proposed Actions:
    • a) Issue recommendations for continued community action, rewording recommendation
    • b) Take limited action on this
Recommendation 7

That a user-selected regime be established within all WMF projects, available to registered and non-registered users alike, that would place all in-scope sexual and violent images (organized using the current Commons category system) into a collapsible or other form of shuttered gallery with the selection of a single clearly-marked command (“under 12 button” or “NSFW” button).

  • Proposed Action: Develop an image hiding feature
Recommendation 8

That no image be permanently denied any user by this regime, merely its appearance delayed, to prevent unexpected or unintentional exposure to images.

  • Proposed Action: Issue a statement of principles
Recommendation 9

That a series of additional user-selected options be created for other images deemed controversial to allow registered users the easy ability to manage this content by setting individual viewing preferences.

  • Proposed Action: Develop an image hiding feature
Recommendation 10

That, by and large, Wikimedians make the decisions about what is filterable or not on Wikimedia sites, and consequently, that tagging regimes that would allow third-parties to filter Wikimedia content be restricted in their use.

  • Proposed Action: No specific recommendation
Recommendation 11

That the "principle of least astonishment," the notion that content on Wikimedia projects be presented to readers in such a way as to respect their expectations of what any page or feature might contain, be elevated to policy status within the projects as a fundamental principle governing relationships with readers.

  • Proposed Action: Issue a statement of principles