Talk:2010 Wikimedia Study of Controversial Content

From Meta, a Wikimedia project coordination wiki

Thanks, and a tiny suggestion

Robert/Dory/Steven: Thanks for posting this today! I mentioned it in IRC office hours too, and I think someone's going to post about it to foundation-l (possibly one of you). A very tiny suggestion: For ease of use and commenting, I think it's worth attaching a short title/descriptor to each of the principles. E.g., instead of "One," the first principle could be labelled "One: Wikimedia is committed to intellectual freedom." I think this would make it easier for people to discuss the principles on the talk page, because it'll be easier to see what they're talking about :-) Thanks Sue Gardner 22:03, 16 September 2010 (UTC)

Have done so. Thanks for the suggestion Robertmharris 22:07, 16 September 2010 (UTC)
"We are not completely finished our work, because the feedback ..." Have not? are not ... with ...? --Xeeron 22:15, 20 September 2010 (UTC)

A predictable disappointment

In his history of Rome, Lactantius wrote,

"Now this was a circumstance in the bad disposition of Diocletian, that whenever he determined to do good, he did it without advice, that the praise might be all his own; but whenever he determined to do ill, which he was sensible would be blamed, he called in many advisers, that his own fault might be imputed to other men."[1]

Whatever the time taken for the study, to me it is simply summarized — Wikipedia will oppose censorship, except when it is "necessary" — and necessity will no longer be determined only by law, but simply by counting complaints. The summary speaks of a balance between "openness" and "service", but what we should be serving is knowledge, not censorship. It speaks of a rising tide of censorship, but only to join it.

While the study describes complaints as being made by individuals, it is impossible to prevent coordination. We should expect that under this policy, for each type of content on Wikipedia, there will be a Facebook group dedicated to its removal, and their complaints will add up to the level of controversy demanded. Except, of course, when such protests are coordinated more formally by such well-respected groups as the Basij.

The question that remains unanswered, then, is what level of censorship is demanded, for content that is legal in the United States? Some have said that something as simple as hiding an image of Muhammad or a leper in a show/hide box would be enough, though I never believed it would stop there. Others would demand removal of the material from the edited page, but leave it available to contributors who examine the historical versions. But traditionally, sites and countries that embrace censorship rapidly decide that of all the things to censor, the most important of all is the list of things that have been censored. It follows that "RevDeletion", already exploding in popularity, will likely conceal the history, the discussions and debates, and leave it unknown to users in what way the article has been skewed.

So far, the outcome has been predictable — but now it meets the individual contributors, who are known for an unaccountable stubbornness. Wnt 23:18, 16 September 2010 (UTC)

I have to largely agree with Wnt. I really don't want to be on the discussion board where we are gonna settle all this. It's gonna be a minefield. Still, I look forward to the next installments of this. TheDJ 12:12, 17 September 2010 (UTC)
I don't see how you can draw these conclusions from what was written. The principles Robertmharris has proposed explicitly say that Wikimedia projects do not exist to please other institutions and organizations, whether they are the Basij or Burning Man. And if we are serving the needs of individuals, I don't expect that he will recommend removing any content, as long as it is even minimally in line with the mission. I agree the devil is in the details (an objective definition of "controversial" seems very difficult to me) but for now, he's asked us to debate the principles -- do you disagree with them so far? NeilK 19:45, 17 September 2010 (UTC)
I agree with the principle of openness, but the principle of public service is not properly defined. Wikipedia's role in public service is to provide information. It is true that, unjustly, there are still a remarkable variety of legal obstacles to doing this fully; but this is no excuse to give ground when it is in our power to avoid doing so.
Where public service to an audience is concerned, it isn't really Wikipedia's responsibility to make a site that is acceptable in Australia and Saudi Arabia. The material is all freely available - anyone can try to pare it down to what a given country and audience will put up with. Maybe Wikimedia can even help in that. But that's not an obligation that should be laid against the free contributors who create the original, full version of the content. Wnt 02:23, 18 September 2010 (UTC)

capitalisation in sub-headings

before thinking about responding more substantially, I thought I'd ask about how the authors feel about small 'gnoming' type edits - I haven't spotted any typos or anything thus far, but feel it might be best to remove the Unnecessary Capitalisation from the sub-headings - thoughts? Privatemusings 00:22, 17 September 2010 (UTC)

I don't know how Robert feels about gnoming, but re: the caps... It's actually normal (as in, standard across style guides) for headings to use title case. Wikimedia is actually the odd duck out in not doing so. Just FYI. Steven Walling (talk) 17:37, 17 September 2010 (UTC)

Done 71.198.176.22 10:31, 21 September 2010 (UTC)

I have reverted those edits. As stated on the page clearly, please do not make significant changes (such as changing major style), unless there's clear consensus here with you and Robert or Dory. Steven (WMF) 19:55, 21 September 2010 (UTC)
How is title case major? 71.198.176.22 03:13, 25 September 2010 (UTC)

First thoughts

The proof will be in the pudding, as they say. The status quo is working great from my point of view, so the real question is what "changes from the status quo" will ultimately be recommended.

I, however, am cautiously optimistic at this stage because of this statement:

"Wikimedia Projects serve the Information Needs of Individuals, Not Groups"

This is, I think, a really helpful observation. It looks to me like the start of a path out of this mess, because it points the way towards solutions that I would classify as "having our cake and eating it too".

Because on every pageview, there is one unique individual who can 'censor' absolutely anything they want without any risk of compromising our projects' intellectual freedom-- and that person is the individual reader.

There's nothing wrong with Jimbo going around and deciding which images he doesn't want on his own computer's screen. But if he can decide for me which images are to be allowed on my screen, Wikipedia isn't Wikipedia anymore.

I think it's a very good sign to be remembering that each reader is their own person. Each reader has, to borrow the Enlightenment phrase, a 'god-given right' to decide for themselves what information is too controversial for their computer screen. No one else can responsibly make that decision "for" another person, only the individual reader can make that decision for themselves.

So long as we stick to empowering the individual reader, there is plenty of room to "Have Our Cake and Eat It Too".

In fact, there is so much room that the only conceivable reason we shouldn't be able to "Have Our Cake And Eat It Too" would be if someone in a high place decided that all cake is inherently immoral and started an anti-bakery campaign to purge all cake-related information.

--Alecmconroy 06:00, 17 September 2010 (UTC)

are you sure that's the only conceivable reason, Alec? :-) Privatemusings 07:27, 17 September 2010 (UTC)
lol not in the slightest-- i'm not done conceiving of all conceivably conceivable reasons. Besides, I'm always one step behind and I never really know what's going on. I only found out about the May porn purge when I read the headlines on the frontpage of Slashdot announcing our new 'change of policy' to the larger world. --Alecmconroy 08:39, 17 September 2010 (UTC)
I would like to interpret it that way, but the succeeding sentences make it clear that they're talking about counting individual complaints rather than using individual preferences.
I should note that I can readily envision methods of individual control. Two I've suggested elsewhere are:
Allow the user to set in his preferences (or maybe Monobook.js?) a personal list of blacklisted figures. Give users an easy way (in the optional script interface, not in the page) to "blacklist this figure" and thereby add it to the list. Allow users to "harmonize this list" with like-minded users on Wikipedia. In this way a central "imam" could keep a quite up to date list of all such objectioned figures, for each group of like-minded individuals, and all the others could take from him.
Make it easier for outside organizations to serve a historical version of a Wikipedia page in a frame, which has none of the annotations or links to other versions. Allow them to submit "self-reverting edits" to put a version on the record here which has removed objectioned content, without censoring the page for everyone else, which they can then access by this interface. Encourage private companies/organizations to run a stripped-down Wikipedia server (under some other distinguishable trademark) which serves "safe" history versions for each article, by their standards.
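The first suggestion above can be sketched in a few lines. This is a hypothetical illustration only — none of these function names are a real MediaWiki API, and an actual implementation would live in a user script (e.g. Monobook.js) rather than Python — but it shows the key property: filtering happens per reader, and the shared page is never changed.

```python
# Hypothetical sketch of a reader-side image blacklist.
# Nothing here is a real MediaWiki interface; it only illustrates
# the idea that each reader filters their own view of a page.

def merge_blacklists(personal: set[str], shared: set[str]) -> set[str]:
    """'Harmonize' a personal list with a like-minded group's shared list."""
    return personal | shared

def visible_images(page_images: list[str], blacklist: set[str]) -> list[str]:
    """Return only the images this particular reader has not blacklisted."""
    return [img for img in page_images if img not in blacklist]

# Example: one reader hides certain figures; the page itself is unchanged.
personal = {"Muhammad_miniature.jpg"}
shared = {"Dance_of_death.jpg"}          # taken from a harmonized group list
page = ["Muhammad_miniature.jpg", "Kaaba.jpg", "Dance_of_death.jpg"]

print(visible_images(page, merge_blacklists(personal, shared)))
```

Only the unblacklisted image remains on this reader's screen; every other reader still sees the full page, which is the point of the proposal.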
Unfortunately, I don't think this is about not offending users with what they find on their own screens. I think this is a capitulation to groups who can't stand the idea that there's a Muhammad cartoon or a naked woman anywhere where anyone else can find it. Because otherwise this would have been written differently. This is all about reaffirming the superiority of some people's sense of control over other people's freedom of inquiry. Wnt 17:28, 17 September 2010 (UTC)

Tangential second thought-- 'NOTCENSORED' as resistance to 'extra-institutional pressure'

Quoting:

"Wikimedia has been less willing than others to accede to extra-institutional pressure to change its content. (This, to us, is the true meaning of the oft-quoted “Wikimedia does not censor”)"

This synthesis of all the different discussions over 'NOTCENSORED' and its interpretations is really interesting and helpful, and it's another valuable insight.

We do have to be aware of what 'extra-institutional' entails in the unique case of Wikimedia, where anyone can edit. NOTCENSORED doesn't remind us to resist pressure from 'extra-institutional groups'. Rather, NOTCENSORED reminds us to reject pressure to achieve 'extra-institutional objectives'.

That is, it's not about who you are- there's no "in-group" and there's no "alien other". Overall, we try really really hard to make that true. "Who you are" doesn't matter-- it's your ideas that matter. (Indeed, we resisted pressure from Jimmy, and he was the most popular Who in all of Whoville.)

It's not about resisting an extra-institutional "who", it's about an extra-institutional "why". It's not that we reject pressure from extra-institutional entities, rather we reject pressures to meet extra-institutional objectives.

That is, when someone wants us to censor something, they essentially want us to elevate their extra-institutional objective (e.g. Religious piety) above our institutional objective to freely share the world's information. --Alecmconroy 08:33, 17 September 2010 (UTC)

I like your thinking here Alec, very inspiring. TheDJ 12:15, 17 September 2010 (UTC)

Billions of hours?

It's of course not central to the argument being made in the third paragraph, but I am curious about the sources for the statement that "hundreds of thousands of individuals have voluntarily given billions of hours of their time to the projects over the years". In his book "Cognitive Surplus", Clay Shirky gives the rough estimate (arrived at with the help of researcher Martin Wattenberg) of 100 million hours for all Wikipedias by spring 2008. This is hard to reconcile with the "billions", even adjusting for the work that has been done since then, and adding the contributions to other Wikimedia projects (Wikipedia has more than four times as many editors as all other projects combined, see [2] p.4).

Also, while it is certainly true that many of the most dedicated volunteers have spent several thousand hours, I would be surprised to learn that this is the average time for a Wikimedia contributor.
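A quick back-of-the-envelope check makes the gap explicit. The figures below are only the approximations quoted in this thread; the contributor count of 300,000 is an assumed reading of "hundreds of thousands", and "billions" is taken at a lower bound of two billion.

```python
# Sanity check of the "billions of hours" claim, using only
# approximate figures quoted in this discussion.

contributors = 300_000          # assumed: "hundreds of thousands of individuals"
claimed_hours = 2_000_000_000   # lower bound for "billions of hours"
shirky_estimate = 100_000_000   # Shirky/Wattenberg, all Wikipedias, spring 2008

avg_claimed = claimed_hours / contributors   # hours each contributor would need
avg_shirky = shirky_estimate / contributors  # average implied by Shirky's figure

print(f"claimed average: ~{avg_claimed:,.0f} hours per contributor")
print(f"Shirky average:  ~{avg_shirky:,.0f} hours per contributor")
```

Under these assumptions the "billions" claim implies well over six thousand hours per contributor — roughly three years of full-time work each — versus a few hundred hours under the Shirky/Wattenberg estimate, which supports the doubt expressed above.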

Regards, HaeB 12:19, 17 September 2010 (UTC)

You are correct. I was actually quoting a figure I had read (I will attempt to recover the source) attempting to determine the value of the voluntary time Wikimedians had spent on the projects, which was estimated to be in the billions of dollars. Thanks for noting the discrepancy. Robertmharris 12:44, 17 September 2010 (UTC)
It hasn't been corrected yet. Regards, HaeB 23:22, 23 October 2010 (UTC)
Any word on the actual figure here? I've been citing the Shirky/Wattenberg figure in my presentations but it's really a back-of-the-envelope estimation and I'd love a more accurate cite. WBTtheFROG 19:15, 10 February 2011 (UTC)

Unexplained aspects and implied principles

Thanks for all your work putting this together. I am interested by your principles, although teasing out implied principles from the existing ones can be a perilous endeavour. You're in the same boat as people who would like to guess whether the American Constitution has an implied right to privacy.

I think you are going to make people very uneasy with this three-part unveiling. You are asking people to agree to new principles that have as-yet unexplained consequences. But the question that is uppermost in some people's minds is whether Wikimedia is now going to censor content. In the meantime I urge everyone else to suspend judgment whenever possible, or be bold and suggest revisions.

Anyway, I'm confident that you are trying to make a positive contribution -- and yet, you're introducing a few new concepts, and it's not clear to me where these are going:

  • The notion of public service. While Wikimedia does benefit the public, that is not evidence that we have a principle of public service. For example, there are many software projects which are open source for purely pragmatic reasons. They happen to benefit the public, but that was a side benefit rather than anyone's explicit goal.

    Wikipedia often acts as a focal point for many people pursuing selfish goals. For example, opposing sides of the Lilliput/Blefuscu dispute might have their own agendas in editing the article on their conflict. Admins could enter the fray just to keep things orderly. And from this opposition eventually emerges an article that is reasonably balanced and verifiable. But nobody has to be concerned with the public per se for that to work.

    I am personally very sympathetic to the notion that most contributors do act with the public good in mind, whether they know it or not. But I think this needs more background and justification to be accepted as a principle of Wikimedia projects.

    I am guessing that you are making this principle explicit to oppose the current Wild West ethos on Commons, which, being contributor-driven, has no answer to the question of how many pictures of users' penises are actually enough or how visible such collections should be. Right now we only consider the contributor, and the software that stores their contributions. If we introduce a third element -- the public's needs -- then we may do things differently. And we also open the difficult question of intuiting what those needs are. So I think that's admirable and worth pursuing, but let's be careful.

  • The notion of controversy. You reject "objectionable" as too muddled, and replace it with "controversy" which you suggest could be objectively determined.

    But, in order to accept that principle, we need a sense of what that determination process could be. Caveat: even if you propose one, it's not likely to be adequate on the first pass - it's rare that individuals can out-think entire communities that want to game such a standard. So, I have to suspend my judgment on whether this principle is even worth following, until you (and like-minded people in the community) can show that there is a way to follow it.

NeilK 20:55, 17 September 2010 (UTC)


Neil -- thanks for your observations. Let me just make two points in answer to them. First, we were not trying to be coy, or to increase suspense in releasing the draft study in three parts, merely trying to reduce the amount of information people had to absorb at any one time. However, to make things clearer, we're planning to release all the recommendations of the study in one go early next week, with explanations of the rationale for those recommendations, so everyone can see the whole picture. We realize that people need to know the totality of what we're recommending before they can respond effectively. Then, for part 3, we'll provide a fuller and more comprehensive explanation of the thinking behind what we've suggested.

Secondly, we understand the value of having the community subject our definitions and proposed processes to a careful scrutiny -- it's one of the reasons we created this page in the first place. Robertmharris 02:40, 18 September 2010 (UTC)

the epistemological problem

This is an interesting effort - I think wikipedia could use something like this - but I think you've missed the central epistemological problem. The issue here is not really about whether some material is 'objectionable' or 'controversial', though that's where the more severe problems tend to manifest. This is a more fundamental issue about the difference between providing knowledge (in the pedagogical sense of the word) as opposed to mere information. Knowledge is a kind of information, mind you, but it's a restricted subset - the vast majority of information in the world has no pedagogical value whatsoever.

Openness is important, yes - that concept taps into deep-seated fears about political censorship (where some idea is suppressed because it violates the preconceptions of a politically powerful entity). We all know what happened to Galileo, Solzhenitsyn, and Salman Rushdie, and we all firmly believe that ideas should not be censored because they oppose politically or socially entrenched structures. But I think we all also recognize that guarantees of political openness were never intended to protect every dumb thing someone might say or do, and that too much openness is acutely destructive: it allows free expressions of hate, gross invasions of personal privacy, and endless reams of trivial, salacious, offensive, or otherwise mindless material. This is why Wikipedia has policies like Wikipedia:BLP and Wikipedia:RS that explicitly curb certain kinds of expression on project - we recognize that just because something can be said doesn't mean that it should be said in a respectable encyclopedia, no matter how much people want it to be said.

In other words, we want the project to be open to all knowledge, but we do not want it to be open to all information, and the problem we face is distinguishing between the two.

This is a decision that teachers make every time they plan out a course, and if we adopted that model more explicitly a lot of these problems would disappear. When faced with explaining a subject over the course of X weeks to relative beginners, a teacher will naturally use the following conservative cost/benefit schema:

  • Choose a limited selection of the best sources to highlight particular aspects of the subject.
  • Keep things concise, simple, and clear, without extraneous explanations or argumentation (detailed investigation is for later courses)
  • Use whatever material is needed to explain the subject, regardless of whether it might offend someone.
  • Avoid material that is not needed, or that might distract from the subject by drawing attention to itself or by stimulating unwanted secondary discussion.

On project, however, people tend to ignore that last point. In particular, they tend to add material without considering whether it actually adds useful knowledge, or whether its tendency to distract the reader might actually interfere with the article as a whole. Sometimes they even add material specifically because it's a distraction (not necessarily in a vandalish way, but because they personally find it amusing or interesting or titillating, and assume therefore that it's a useful addition).

The key point here is that creating knowledge often requires limiting information so that the reader is not overwhelmed (intellectually or emotionally) by unneeded or distracting material. We don't want complete openness in the sense in which that phrase is often used; we want to protect material that is needed pedagogically to explain the subject. If some particular material becomes a point of controversy, then that is a distraction from the encyclopedic aims of the article, and we should (by default) pare back the material as best we can to remove the controversy without sacrificing the pedagogical need of explaining the subject. --Ludwigs2 18:42, 18 September 2010 (UTC)

Wikiversity is a much-neglected project that serves the goal of providing courses of content, in which material should be edited down to a standard level of detail. However, the purpose of Wikipedia is to provide a wide range of information, at all levels from beginner questions to summarizing current scientific research; likewise to provide the full range of emotional impact. The purpose of Wikimedia Commons is even broader - to provide a valuable resource of photographs for many purposes. I am not sure if your distinction between information and knowledge is meaningful, but if it is, Commons' purpose is almost entirely to provide information, not knowledge. Wnt 23:13, 18 September 2010 (UTC)
I wasn't talking about turning wikipedia into wikiversity. I was pointing out that wikipedia is a more effective resource the more that articles exclude unneeded and distracting material. The question of what is and isn't needed, obviously, is going to vary depending on the particular project you're talking about and particular context you are looking at, but even on commons (which is the most 'repository'-like project I know of), there are distinct criteria. As NeilK pointed out above, commons does not really need to have thousands of images of users' penises. It needs a few - images that are particularly suited for descriptive presentation on various articles, or that are particularly notable for one reason or another, or that are considered artistic presentations - but I suspect commons sysops delete and block wherever some user starts uploading dozens of pictures of his own genitalia (or if they don't, they should - this isn't chat roulette). Note that commons project scope specifies that media 'must be realistically useful for an educational purpose', which is precisely the point I'm trying to make. If we could extend that 'realistically useful' concept to other projects and start pruning away material that isn't, it would create some significant improvements and dispose of a lot of content disputes out of hand. --Ludwigs2 00:18, 19 September 2010 (UTC)
To me it seems like usually when someone on Wikipedia says that an article is too detailed, what he really means is that "ordinary people don't have the right to know that much". This is heard in the medical articles, for example, which certain people think should be written for "patients" rather than for an audience including patients, biologists, medical students, and probably more physicians than will ever be admitted. The way to deal with the problem of excessive detail for a general article is via "w:WP:summary style" - i.e. create a new article that covers the more advanced information. Wnt 19:46, 21 September 2010 (UTC)
Thoughtful analysis, Ludwigs. I think, as you suggested, the difficulty comes in getting down to knowledge without too many unintended side-effects. The process of cutting away information is fraught with the risk of creating administrative tyranny, bureaucracy, a biased evaluative regime, new rationales for discrimination, deeper entrenchment of old prejudices, pre-emptive cutting off of future prospects, denying locally desired ideas which don't have global appeal, rationalizing shortsightedness, and overemphasizing costs where proper curation can mitigate most excesses without requiring putting up new walls.
As some other commenters have mentioned, it's not only the benefits of inclusionism but the risks of deletionism (censorship) which weigh heavily here. It would always be great in theory to have just the knowledge you need and nothing else. But you have to build that edifice without making it a place no one wants to live. There is certainly some tension between the two, but the greater risk, at least historically, has been having too little information, not too much. It's a real tribute to the project (and the civilization) that we're at a point where we can reasonably consider intentionally ignoring lots of data.
Last, knowledge is an easily defined but variable concept; it depends on whether an individual user finds it useful. To me, a database of geolocation points for lighthouses is worthless, but for someone else it might be heaven. So the other problem with getting from information to knowledge is not just how but which? Ocaasi 11:19, 24 September 2010 (UTC)

The principles stated here have proved meaningless

After reading "Part Two", I find that the principles expressed here, however misguided, do not actually have any relevance. You talked about balancing openness and "public service", and having empirical methods of determining what is controversial. But Part Two gets the ball rolling by arbitrarily selecting sexual and violent content for suppression, subject to an undefined/unworkable category scheme. It talks about "curating" these categories on Commons - i.e. deleting the existing pictures and having someone contribute some line art to replace it. It establishes not an individually defined filter but a single-switch one-size-fits-all filter for "under 12" readers who, supposedly, will retain the right to shut it off. The whole outcome of this elaborate intellectual sophistry condenses to one simple achievement:

WMF is abandoning WP:NOTCENSORED.

That's all anyone needs to know or remember about this document. Wnt 20:18, 22 September 2010 (UTC)


Yes, the correct thing to do with recommendation 4 is to reject it, and to say instead that we shall treat images on these subjects exactly the same as on any other subject, and that the same criteria apply--and, in particular, that free images which any project considers appropriate are automatically appropriate for Commons. The distinction that images meant "to arouse" are pornography is based upon the 19th century assumption that sexual arousal is in some sense evil, dangerous, or in need of restriction. The contemporary view in some of the countries represented in Wikipedia is exactly the opposite. I am amazed that any group of internet-aware adults would not realize this. (And, fwiw, the selection of body parts listed seems an Anglo-American bias, and is probably not appropriate to all cultures--another reason why the WMF should not become involved in such matters.) If there is a distinction, it is with images meant only to entertain, rather than inform, and would apply equally to sexual and non-sexual content. The same is true with such matters as privacy issues: it applies equally to all content. Sexual behavior, after all, is at the basis of most human behavior and should, if anything, have a specially protected status.
As for recommendation 6, the premises are false. There is no overall special sensitivity within the viewing community for sexual images; there is a special sensitivity within some small portions of the viewing community in certain demographics to these images, but Commons serves them all. Even if a particular project thinks it necessary to its educational mission to have certain limitations, this must not apply to Commons.
Recommendations 7 & 8 are wrong also. The need to avoid surprise is adequately met by proper placement within articles. As for recommendation 9, the development of such tools is not appropriate for Wikipedia. It will inevitably lead to the use of the distinctions for the purpose of true censorship. If third parties wish to develop such tools, and provide for their use as a layer on top of Wikipedia, then we cannot prevent them, any more than we can prevent the use of censorship. Recommendation 11 is unnecessary--as even the recommendation says, this is already within article guidelines in some projects, and is in any case entirely up to the projects, not the foundation. (I will say that I think in the enWP, it could be more consistently followed.) This is exactly the sort of WMF interference with the individual projects that should not be permitted. DGG 02:15, 23 September 2010 (UTC)
If there is a distinction, it is with images meant only to entertain, rather than inform, and would apply equally to sexual and non-sexual content. No. It is a perfectly reasonable position to take that while we may be happy to provide "free entertainment" when it comes to butterflies or porcelain, we are not prepared to provide "free entertainment" in the area of porn. The idea that porn should be viewed as just another category of entertainment provided by our server, and that we should be as happy to provide "entertainment value" in that category as in any other that viewers might enjoy browsing through, doesn't work for me. I could write about different parts of the brain being involved in enjoying images of butterflies vs. watching porn, and how the learning mediated by an encyclopedic endeavour (as opposed to life) is thought to involve intellectual functions, but there is no need -- the idea that we should be just as happy if people use our site to enjoy porn as we are if they enjoy looking at butterflies is impractical, because it would profoundly damage the Foundation's standing in society. (Did you mean to post this on the Part Two talk page? Talk:2010_Wikimedia_Study_of_Controversial_Content:_Part_Two If so, feel free to transfer the thread including my response to that page.) --JN466 03:22, 23 September 2010 (UTC)
"The Foundation's standing in society" is measured differently by different people. To some here, allowing censorship damages the Foundation's standing. My concern is more specifically that if WMF adopts a greater degree of censorship than Google, it will cast doubt on whether open collaboration, copyleft models are capable of competing with copyrighted sources. It is true that Google has been manipulated into hiding some images, but at least their capitulation was only in one category.
The question of "if people use our site to enjoy porn" is already answered, and it remains answered so long as we maintain our appreciation of fine art. Given that children can surely find exemplary wankfodder in these hallowed halls so long as such fine art remains, where is the benefit (even if one accepts it as such) of removing some other images of lower quality? Wnt 04:05, 23 September 2010 (UTC)

Can the result be "nothing needs changing"?

I do not have an insider's perspective on all this. I have no idea about all the different things that have occurred over the years. I also don't know what the hot button issues of the moment are. What I do have is an opinion of Wikipedia, and all WMF projects, that is similar to that of the majority of the non-wikipedians that access these sites.

  • Wikipedia is a good thing
  • The people running Wikipedia are doing a good job
  • Result = nothing needs to be changed

Please note that when I say Wikipedia, I am referring to all WMF projects including Commons. This is how non-Wikipedians view all the related sites. I believe if the general population were polled, they would say that the study was unnecessary and nothing needs to be changed. Views to the contrary are most welcome. 64.40.62.120 07:29, 23 September 2010 (UTC)

I'm hoping Robert Harris will chime in on this. Is it possible for the result to be "nothing needs changing"? 64.40.62.120 08:34, 23 September 2010 (UTC)
I have a feeling the entire intent of this "Study" is to push for changing policies. However, I'm pretty confident the result will be "no change" in the end. WP:NOTCENSORED is a pretty strong foundation of Wikipedia policy. — The Hand That Feeds You:Bite 13:40, 23 September 2010 (UTC)

Comment on respect and censorship[edit]

I am not convinced by the simple statement that "Consequently, Wikimedia has been less willing than others to accede to extra-institutional pressure to change its content.... the true meaning of the oft-quoted “Wikimedia does not censor”". It suggests a "thin end of the wedge" mentality. This section needs stronger wording, a positive wording that states definitively that there is a stand, not merely a reluctance to change.

Reluctance is merely a degree of coaxing to change. I would rather see it worded differently and positively: Wikimedia has traditionally been against extra-institutional pressure to change its content as an integral part of its editorial neutrality. Where censorship does occur it is internally driven and highly minimalist, aimed at avoiding cases of clear and significant harm where the mission will not be seen internally as being compromised by the change. It is not as a rule driven by external pressures or cultural mores.

Respect for others and for "massed voices" raises a similar concern. Self-censorship out of respect can be a high human value - almost everyone has had the experience of not saying something out of a wish to avoid hurting someone. Self-censorship can also be an immensely harmful action, and is a dangerous tool to coax into use except upon good and clear cause. Sometimes the reason that "a new wave of... censorship... spread[s] throughout the world" is precisely because those who could champion radical openness and non-censorship decide to be 'respectful' when they should be taking a stand, and by acceding they help censorship to gain ground in the public eye. In the internet era almost every cause summons "massed voices".

Good service requires knowing where to take a stand, as well as where to show respect. It's important to do both, and important to emphasize that non-censorship has positive value and is upheld for good cause, not to mistakenly marginalize or minimize it as being "less willing to accede".

FT2 (Talk | email) 18:59, 23 September 2010 (UTC)

Let me try and respond to the several points raised here. First, to the question of whether nothing could change: yes, of course, that is a possibility. However, we have said in the first part of the study that we do not believe that would have been the proper thing for us to recommend, for the reasons we have outlined.
As to FT2’s suggestion that the section about Wikimedia’s commitment to openness be strengthened, I have no objection, and the suggested wording is a good start, except for the last sentence, which is unnecessary, and the use of the word “censorship” in “Where censorship does occur….” We have just finished saying that censorship = acceding to illegitimate extra-institutional pressures, and that we don’t do that, so we can’t go on to say that we will do it under certain circumstances. Perhaps “Where restrictions on access to content do occur…”
It opens up another discussion which is unavoidable, but not necessarily as relevant as people think, in my opinion, and that is what is the meaning of “censored” for Wikimedia projects. If “censored” is to mean “changing or deleting content”, then the policy “Wikipedia/Commons is not censored” is violated hundreds of thousands of times a day, when an edit is made, or reverted, an image deleted, or any of a dozen other perfectly legitimate, time-honoured, absolutely normal Wikimedia procedures followed. Clearly, “Wikipedia/Commons is not censored” does not mean that. Content management – deletion, retention, whatever – is the way the projects work. What, then, does it mean? My view would be close to what FT2 has suggested, that “Wikipedia/Commons is not censored” means that we have rules we have established for the inclusion of content on the projects and those rules are based on the principle of educational value. We recognize as legitimate discussions and decisions on content inclusion based on that principle. We refuse to recognize as legitimate any discussion or decisions based on any other value. What I would add to that statement is another, equally valid, equally fundamental, that says: notwithstanding the above, because we value respect for our audiences, we accept the principle that individual users of the projects have the right to tailor their viewing experience to their own needs. However, that right does not extend to deleting content from the projects, for themselves or for others.
As to the suggestion that there is some special sensitivity to sexual images in “certain demographics” that is not widely held in the “viewing community”: it resonates with the suggestion, often made, that proposed restrictions on the viewing of sexual images are an attempt by repressive forces in North America, illegitimately uncomfortable with expressions of sexuality, to force their views on the rest of the world. The evidence overwhelmingly suggests the opposite. Contemporary North American society is one of the most permissive in the world. Malaysia banned a concert by Avril Lavigne in 2008 (they later changed their minds, but rock concert promoters routinely avoid Muslim countries), Arab countries restrict even the mildest of sexual imagery, and the manager of eBay India was thrown in jail for a week because one porn video was offered for sale on his site. If anything, the rest of the world sees exactly the opposite when they view the major North-American-dominated sites of the Internet – they see a “biased” attempt to force extremely permissive views on sexuality on the rest of the world. We take no stand on these questions, although probably the world would think we lean towards the permissive. But we feel our recommendation around images of sexuality and violence strikes a balance and compromise between these two points of view, by focusing on the principle of individual choice as the determining factor in the use of the tools we are suggesting we offer our audiences.
Finally, the suggestion that the most effective way of combating increased internet censorship is to take a stand against it is worthy of consideration, and finding the balance between legitimate restrictions on use that serve our audiences, without unintentionally legitimizing other, more restrictive techniques, is a difficult one. Our feeling, to go back to our first point on whether anything should be done at all, is that the relatively complete openness of Wikimedia projects at the moment needs to be tempered in some way, because not to do so would encourage those forces whose basic values are not necessarily ours to redouble their censorious efforts, although the notion that we should be careful in how far we go is one we accept. Robertmharris 17:39, 25 September 2010 (UTC)
Censored is a label. We can describe what is meant by "censored" in a Wikimedia context, rather than applying a label that has multiple interpretations. For example as a starting point my wording above can be reworded to avoid the label entirely: "Where a policy of general removal, non-inclusion, or hiding of an entire category of content (whether generally or in a specific context) does occur, it is internally driven and highly minimalist...". In the context of Wikimedia's projects, that's probably a good definition of "censorship".
The other point is, I don't agree that our rules are based "on the principle of educational value". Two examples -
  1. In its role as an image bank Commons may have non-educational but useful material, and
  2. As I was reminded we may well host archive material that is relevant to our mission but not within current types of content we hold.
In essence the point here is that our mission is 'free knowledge' not merely 'education', and as such WMF hosting may include content that is valuable and within mission, but is not really 'educational material' (except in the broadest tautological sense that anyone can learn from anything). So arguing from a basis of "educational value" of certain content is flawed, as WMF's mission conceives of (and actually has) non-educational material being within scope as well.
FT2 (Talk | email) 00:45, 26 September 2010 (UTC)
see Commons:Project_scope/Summary#Must_be_realistically_useful_for_an_educational_purpose. Is it your view that this is inaccurate, or needs to be changed? Privatemusings 04:51, 26 September 2010 (UTC)
That's a good question. Look at the first part of the full Scope statement: "The expression "educational" is to be understood according to its broad meaning of "providing knowledge; instructional or informative"." The word "educational" can imply a narrower interpretation than is actually the case. When you read the actual scope statement it is to "provide knowledge" and the understanding is to be read in a broad sense not a narrow one. Providing knowledge is wider than education. A statement that its aim is to provide educational material may not fully reflect the actual scope of Commons to a reader of the study. To underline this, the project-wide scope statement for Wikimedia is not limited to "educational" at all, and users at Meta seem to be mindful of avoiding limiting statements that may hinder the range of "knowledge" we might provide in future that could fit into WMF's broader mission statement. (I also added a statement that the knowledge we provide will be selective, to the strategy draft for future). In summary it's important to note that our mission is providing "free knowledge", which in some cases may be distinct from providing "educational material". FT2 (Talk | email) 09:51, 26 September 2010 (UTC)
It's clear that Wikimedia would not have done as well in an Islamic country as in a country like the United States with a proud tradition of freedom of speech. Even so, one does occasionally encounter situations in which Americans have been more prone to suggest censorship than residents of other countries. For example, w:Gyaneshwari Express train derailment raised a debate about whether a Web page from the Indian government showing the bodies of the victims should be linked from Wikipedia (fortunately, freedom prevailed). Americans can become offended by images or films of cruel treatment of animals, if they are not accompanied by some condemnatory point of view, whereas Chinese enjoy very substantial freedom to use animals as they please. (An unconstitutional U.S. law to censor "w:crush videos" was thrown out. Oddly, the bill included no prohibition on mousetraps...) Wikileaks seems to thrive in Iceland, and of course various European countries like Denmark have traditionally enjoyed greater freedom in the realm of public nudity and inventive pornography.
The truly neutral position is for Wikimedia to do its utmost to include all content. There is no consistent "average" position on censorship worldwide, only a multitude of irrational local laws with no overarching logic. Wnt 07:06, 27 September 2010 (UTC)

"family-advocacy groups"?[edit]

Who are these "family-advocacy groups" with whom you talked? Many of the organizations in the United States that purport to be family-advocacy groups are in fact ideological and/or religious fanatics with an agenda I deem hostile to the interests of myself and the rest of my family, most emphatically including my daughter. --Orange Mike 04:10, 27 September 2010 (UTC)

I agree. "Family" is a euphemism in American politics, and I'm getting the feeling it's being used the same way here. I think you mean "conservative", which is another euphemism, but at least closer. - Peregrine Fisher 07:21, 27 September 2010 (UTC)
Two in particular with which I have been in contact (although my discussions with them are not yet completed) are Commonsense Media (http://www.commonsensemedia.org), a well-respected and well-used rating site for children's material, and the Family Online Safety Institute (http://www.fosi.org/), a similarly well-regarded "advocacy" group, which currently sponsors the ICRA system of content management online. Not everyone agrees with the neutrality of the ICRA system, and note that we have not suggested that such a system be applied to Wikimedia content (mainly because we start with the assumption that all Wikimedia content would qualify for the ICRA "context" tag "educational", meaning no content on Wikimedia projects could be or should be tagged by the ICRA system without that context tag being applied). Robertmharris 19:15, 27 September 2010 (UTC)

Teasing apart types of controversial content[edit]

Ah, I see Part II is out, but I have just digested Part I and some of the comments, so let me comment here. I may make similar comments over there if circumstances merit.

As far as I am concerned, there is a huge difference between a controversial image of Mohammed and a controversial pornographic image. How to separate them? First off, to make things easier, I'll speak here only of images, and I'll elide discussion of gross medical images or train crash images or graphic but non-pornographic sex pictures. Those are all interesting cases, but for now I'm just trying to peel off the pornography images. And I'm not trying to pick on our Muslim friends, maybe I could use Piss Christ or something. But let's keep it simple for now: Mohammed vs pornography.

An Encyclopedia is a Radical Idea[edit]

It would be hard to put this into policy, but to extend the idea of service a bit, I think a lot of Wikimedians would rally around this statement:

The basic values of Wikipedia may be assumed to stem from those of all proper encyclopedias, beginning with Denis Diderot's original encyclopedia -- the promotion of humanistic and scientific values, and the opposition to ignorance, superstition, and tyranny, commonly associated with the Enlightenment and the Age of Reason; and assumed to be aided by the free collection and dissemination of knowledge; and, from the basic social philosophy generally common to the peoples of the modern advanced world, well expressed by Thomas Jefferson's statement "I have sworn eternal enmity... against all forms of tyranny over the mind of man."

We are not just a valueless information database. We are against ignorance. We are against tyranny.

Given that, of course we give short shrift to objections to controversial content on grounds of religious sensibility, particularly its more superstitious aspects. We not only ignore that, we spit in its face. We revel in the opposition of ignorant and superstitious people.

Similarly for tyranny. If our coverage of Falun Gong or Tibet is "controversial" in China, well, let us double our coverage. Hu Jintao is not just another head of government like Angela Merkel. People like Hu Jintao are the absolute sworn blood enemies of projects like Wikimedia, and I hope and trust the sentiment is returned.

HOWEVER.

This has nothing to do with explicit sex pictures. This has nothing to do with pornography. Disseminating pornography is not a strike for human freedom. As a matter of fact, pornography is a form of "tyranny over the mind of man", since it's (without question) addictive and (in my opinion) destructive to the personality of users. But even if you dispute that: it's, at best, just people fucking (and sucking, and...). It's not part of any Great Cause. It's a dreary, if profitable, industrial business. Sure, there was a time when sexual freedom was an important and good cause. But Victoria died a century ago and more, and there was a sexual revolution, and sex won. We're way, way past this stuff having any beneficial effect on society anymore. I am not talking about Grandpa's Playboys and pitchers of nakkid wimmen. I am talking about explicit hardcore images which have the effect (if not the intent, but who knows) of degrading our sisters as well as ourselves.

So, you see, there's more than one kind of "controversial".

Harm[edit]

Another way to separate out the kinds of "controversial" material is the concept of harm.

It's not really going to harm a person to view an image of Mohammed (or Piss Christ or whatever). It might upset them, but they'll get over it. It's not going to harm them to read or view most other controversial material.

But sex pictures. And children. That's different.

Let me tell you something about a town near where I live. It's a quite wealthy suburb in one of the richest and most educated states in a rich and educated country. It is lousy with lawyers, doctors, executives, and the like. The great majority of the householders here have college degrees, many have advanced degrees (both spouses), and many use computers in their work in some way. Their upper-elementary kids use Google (and therefore Wikipedia) a lot for school research. There are exceptions, but as a general rule, these parents:

  • Have heard of Wikipedia but don't really know what it is and certainly don't know that it hosts explicit images.
  • Have never heard of content control software, or if they have heard of it don't know what it is, how it works, where to get it, or how to install and configure it.
  • Do not know what the Windows Control Panel is, how to access it, or how to use it.
  • I won't even talk about finding and editing javascript files for chrissakes. The very idea is risible.
  • And people are busy. And people are lazy. And people are careless.

You can wish that these facts were not true, but they are true regardless of how hard you wish. Or you can shrug and declare that these facts are so inconvenient that you would just as soon ignore them. Or you can retreat into some kind of moral vacuum and just wave your hands.

(And even in the introduction to the monograph, it states as a self-evident basic principle that guided the very nature of the inquiry that "parents, teachers, and other guardians are best placed to guide them to material that is appropriate for them." So I guess I'll never win this one. But I will stand up and say: NO. We are all of us responsible for the care and protection of the next generation, always, and sitting down at a keyboard does not make that go away.)

Now go to the Wikipedia article on "Bukkake" (NOT SAFE FOR WORK, obviously) and peruse the images. Let me point out that this article is two clicks away from our article "Anime". It is two clicks away from "List of webcomics" and "List of fictional robots and androids" and "LGBT History" and "1979" and "Edo Period" and "Index of Japan-related articles (L)" and "List of manga magazines". It's two categories down from "Japanese Culture" and even two clicks away from a Wikipedia page, "Wikipedia:Websites/Example webcomics". It is two clicks away from our article on "Flirting", for chrissakes. You think children aren't interested in any of these topics?

Let me tell you some more. I don't even know or care what the text of that article is, but I do know that images go straight to the brain. Here are a couple of the many questions that will flood the mind of a child viewing these images: Do boys really like that? (or: Do girls really like that?) In that far-away, dreamed-of-yet-not-dreamed of land where I have boyfriends and even a husband, will I have to do that?

You think kids don't worry about stuff like that? You think that that worry can't harm a kid's psyche? You think people can't be twisted? You think stuff like this doesn't cause conversations like "Doctor, Suzy is suddenly biting her nails all the time and she seems so... distant?"

</rant>

Anyway. Harm. That is another way to differentiate between one kind of "controversial" material and another. Herostratus 05:30, 28 September 2010 (UTC)

I appreciate how you dismiss the danger to other people's immortal souls. Leading people into sin isn't harm in your definition, I guess.
I don't believe that parents can't find or use content control software or the options included with the web browser. If you assume the naked internet is safe for children, then you're a fool, and if you don't know what to do, there are many web pages, books and organizations that can help you.
You don't think seeing naked girls running down the road burned by napalm (one of the most famous images of the Vietnam war) could worry a child? You don't think seeing pictures of chickens in miles of filthy cages could cause conversations like "Doctor, Suzy is suddenly refusing to eat and whenever we ask why she calls us murderers?" You don't think that people won't bring up sources that claim that promoting LGBT as a valid alternative causes massive amounts of harm to their children?
The first memorable trial in Western history was about this; a man was executed to protect the children from dangerous ideas. I think we should remember that, and remove anything on Wikimedia that would lead someone to believe that Zeus is not a real god or that he does not deserve our worship.--Prosfilaes 03:54, 29 September 2010 (UTC)
You are correct. Leading people into sin is not harm per se. Some of your other points are worthy of discussion, but for starters, the images you list have valid encyclopedic value. We can't, and on some level shouldn't, shield children from everything. But we should shield them from material that is liable to disrupt their path to eventually becoming loving and healthy sexual beings, particularly when the encyclopedic value is basically nil. Your point about Zeus I don't get at all, since I specifically exempted or indeed attacked appeals to religious sensibility. But your most interesting statement is this: "I don't believe that parents can't find or use content control software or the options included with the web browser." My personal guess is that the reason you don't believe it is because it would be inconvenient for you to believe it. It's OK. That's human nature. Herostratus 05:19, 29 September 2010 (UTC)
First, note that the above rant against pornography actually exempts "Grandpa's Playboys and pitchers of nakkid wimmen". But that's what the "3000 topless images" from part II are. For Herostratus to come this far is progress.
I should also say that our purpose is not to "spit in the faces" of foreign regimes with censorship policies — we should be content merely to defy them. Especially, some of the free speech issues raised by Americans against China are tactically chosen with a heavy dose of propaganda. If you read the article about the Tiananmen Square protests, you'll see that the higher the casualty figure, the sketchier the source. There are people here who will cheerfully support Falun Gong, but who won't stand up for the rights of polygamists or the mosque in lower Manhattan or the right of people to handle snakes or refuse vaccines for religious reasons. There are people who have cheered on race riots in Tibet in support of a hopeless cause of independence, while failing to recognize that the right of Chinese citizens to move and settle throughout the country is a new change from decades of unjust restriction. And all the while that the West was nominally pressuring China to allow such propaganda, it was simultaneously sending a much stronger message to set up the infrastructure to enforce the West's failing copyright model on a public that doesn't believe in it. Which oddly enough seems to be very useful for the purpose of censorship! We should by all means pursue the truth about every controversy in China and elsewhere — but the truth is not owned by one side or the other. Wnt 17:41, 29 September 2010 (UTC)
But let's look more closely at the claim that it does harm for children to potentially access images of more overtly sexual content here. Bear in mind, first, that any child with access to the Internet can find such content somewhere, despite any number of "filters", if only on a trojan and virus infested web site put up as bait. Then consider that Wikipedia doesn't host sexual content as a porno fiction, but as a factual exploration of what actually occurs in the world. If children want to get a straight answer to the question "do boys really like that?", they could do worse than to look it up here. The squalor of the sex industry stems from taboo and prohibition, just like the squalor of the drug trade. The more information we make available, the better for all. We can see from the exceptions the nature of the problem: because child pornography remains illegal, billions of dollars are made by kidnapping children and forcing them to make sex videos. This wouldn't happen if it were legal. Meanwhile, when the victims of these attacks reach adulthood, they still have no right to authorize distribution of the video, whether for political purposes or to recover a measure of financial compensation. (Law enforcement reserves the right to continue showing the pictures at little conferences to try to convince lawmakers to pass new rules against Internet freedom, but they don't pay any copyright fees to the children because that would be immoral...) The simple fact is that no matter what you think of the pictured activity, censorship is never the right answer for any problem. Wikipedia can't practically transgress American law, but it certainly should embrace truth and freedom to the extent that it is allowed. Wnt 17:41, 29 September 2010 (UTC)
Not leading them into sin; by viewing images of Mohammed, they are sinning. You are accepting damage to their "immortal soul" because you don't think it's "harm". How POV is that? Illustrating a sexual act does have encyclopedic value; like many other cases, it can make something clear in a reader's mind that no words could explain. Geometric relations, like what goes into what where, are best shown in images. The point about Zeus is that this whole thing about banning stuff because it harms children has a long history, and that history is largely centered on heresy. Why is your version right and theirs wrong, and how is that NPOV? As for the blocking software, if parents can't use it, then they're idiots, and their children are going to be exposed to this stuff no matter what Wikimedia does. It is probably true that many parents can't keep their kids out of the street; that doesn't mean that if you're on the highway you should go 20 MPH slower than the ambient traffic, because it's not going to matter.--Prosfilaes 20:43, 29 September 2010 (UTC)

The way it works at it.wikipedia: a Not-NPOV way![edit]

I have some concerns about controversial content at it.wikipedia.

We have a notice template to be put in articles to warn users that the contents of the page 'potrebbero urtare la sensibilità di chi legge' (English translation: 'may harm the reader's feelings').

Currently the inclusion of that template is allowed in some types of articles (for example, it:w:Dildo (in English: en:w:Dildo), it:w:Fritz Haarmann (en:w:Fritz Haarmann), it:w:Charles Manson (en:w:Charles Manson), it:w:Sadismo (en:w:Sadistic personality disorder), it:w:Peter Stubbe (en:w:Peter Stubbe));

this is not the case for articles like it:w:Procedimenti giudiziari a carico di Oscar Wilde (in English: Judicial proceedings against Oscar Wilde), it:w:Inquisizione (en:w:Inquisition), it:w:Lettre de cachet (en:w:Lettre de cachet), et cetera. There, adding the template is quickly reverted.

That is: it's OK to treat something concerning a single man or woman (a criminal, a sexual "pervert" (?)) as 'content that may harm the reader's feelings', but not when abominations are committed by governments or other institutions.

This is blatantly non-NPOV. And this is of course just one example.

A policy about controversial content has to be crafted to avoid non-NPOV use of policies, notices and other tools. --Teletrasporto 08:07, 10 December 2010 (UTC)
