Talk:2010 Wikimedia Study of Controversial Content: Part Two: Difference between revisions

From Meta, a Wikimedia project coordination wiki
Revision as of 04:24, 23 September 2010

"User-Controlled Viewing Options"

The recommendations in the section User-Controlled Viewing Options are suggesting a clear violation of NPOV. --Yair rand 22:22, 21 September 2010 (UTC)

Just as a reminder, not all projects actually have NPOV. See COM:NPOV for example. Steven Walling (talk) 00:26, 22 September 2010 (UTC)
How is user-initiated opting-out of viewing images a violation of NPOV? If I close my eyes when looking at the Mohammad article, am I violating NPOV? Apparently. Kaldari 21:58, 22 September 2010 (UTC)

Comment on "Intent to arouse"

That is a lot of thoughtful words, but only a few will be controversial. Namely, these:

We are suggesting they be deleted because they are out of scope – they serve no “reasonable educational purpose”...their intent is to arouse, not inform...It is our belief that the presence of these out of scope images in Commons is potentially dangerous for the Foundation and Community because they reduce the overall credibility of the projects as responsible educational endeavors, and thus call into question the legitimacy of the many images of sexual content and “controversial” sexual content that must remain on Commons for the projects to fulfill their mission. And, although not our primary motivation in making this recommendation, it must be noted that they are offensive to many people, men and women alike, and represent with their inclusion a very clear bias, and point of view – that of woman as sexual object. They are far from neutral. We would never allow this point of view untrammeled and unreflexive presence on any Wikipedia site as a clear violation of NPOV– we should not allow it on Commons either. (emphasis added)

Some comments:

  • The intent of the photographer or uploader is unknown and perhaps unknowable. In effect, this turn of phrase is a pure projection--to use the clinical term--of the image-viewer's biases about the image-uploader's intentions. It is a category error: there is no 'intent to arouse' without someone's arousal; in other words, this image is controversial because it makes someone horny. (Or in the case of violence, because it makes someone squeamish or horrified.) In truth, there is no other metric apart from our own individual reactions. So just say it: we are censoring these images because many people will be turned on by them or shocked by them. And that is okay or tolerable, but only to the extent necessary to advance the encyclopedia's core mission.
  • Wikipedia is not Google or YouTube or Facebook. It makes no profit. It has no goal except to deliver the sum total of human knowledge to its users. This is not a polite mission. It is a radical mission. I don't want to give cover to 'prurient' trivialities, but there is a sense in which today's "icky" images always mark the boundaries of what society is willing to explore. The frontiers of human knowledge, whether in sex, science, violence, or other taboos, are never comfortable. Yet this is where knowledge is expanded and discovered. There is a clear friction between the twin goals of presenting knowledge comfortably and allowing its growth beyond those boundaries.
  • Per many Wikipedia policies, Wikipedia is neither censored nor a crystal ball. It reflects the world as it is. But that is 'the thing' with controversial images: they exist. They happen. Sex happens. Ejaculation happens. Dildos happen. Beatings happen. Massacres happen. And so on. You make the fair point that these things can be shown without being endlessly replicated--and that people should have the 'option' to exclude these images if they wish. That seems fair. But don't do it under the guise of intent.
  • Do the images which show off women's bodies cross a line because they 'portray women as sexual objects'? I don't know, do they arouse you? Women are sexual objects, if you find them sexual and you objectify them. But again, that is in the eye of the beholder. Photos of women are not objectifying unless that is the lens through which they are viewed. This is a naive view, but it is also true.
  • A broader and more subversive or post-modern critique is that the socially-constructed definitions of what is sexual or controversial should not apply to an encyclopedia at all. For these things happen at the interface between object and subject, content and viewer, and they are not attributes of things in themselves. At the least, they are subjective, and as such cannot be applied across broad categories. Is that hand-tying? Yes, but if you want to be free to make choices which censor, as any decision to eliminate content will do, then you have to say clearly that you are making socially supported choices, biased by social attitudes. It's not a very Wikipedia thing to say, but if it's true, better to say it: "Lots of users don't like seeing these images, so we are going to limit the number of them to what is minimally needed."
  • I realize that justifying 'more boobies' with highfalutin social criticism is a risk--and a middle path might need to be charted, which you have begun to do. But let's choose the words carefully, lest we weave a web that will ensnare us as well.

Ocaasi 09:44, 22 September 2010 (UTC)

Standing ovation. :) --Cyclopia 15:44, 22 September 2010 (UTC)
Well at the very least he has my applause. TheDJ 20:17, 22 September 2010 (UTC)
Some agreement, though where others see thoughtfulness I see only post hoc justification. As for photographers' point of view, well, why do so many people upload photographs of waterfalls? Because they're beautiful. Why so many seashores? Because they're beautiful. Why so many naked women? You figure it out. There's nothing all that sophisticated or surprising going on here, and it doesn't express a point of view beyond that of any photographer who bothers to pick up the camera. Wnt 20:27, 22 September 2010 (UTC)
I don't think he's referring to The Birth of Venus here. He's talking about porn. Kaldari 22:12, 22 September 2010 (UTC)
The intent of the photographer or uploader is unknown and perhaps unknowable: In a good number of sexual media uploads, the filenames, file descriptions, files' provenance, even sometimes the uploaders' user names, leave no doubt at all that the image's ability to arouse was a prime motivating factor in creating or uploading it. And bearing in mind the role Internet media play in 21st-century human sexual arousal, it would be strange indeed if this role were not reflected in the motivations of uploaders. (Many such images are routinely deleted even today, under COM:PORN.) --JN466 22:42, 22 September 2010 (UTC)
A few comments:
  • I am as much concerned with the language as the actual filtering. "Intent to arouse" is a thought-crime. It is a Kafka-esque road that we should not go down. Culling tits by legislating people's motivations is a devil's bargain. It's not just about specific acts when you make law, but about setting precedents for guilt and innocence. "I didn't do it!" ..."But you thought about doing it!" is no world to live in. We shouldn't encourage it here.
  • Of course I'm talking about porn, if you can define it. Surely some people's porn is other people's breakfast, and vice-versa. Besides, as Cyclopia also mentioned, 'porn' is real. It exists too. Arguably, it should not be presented any less vividly and thoroughly than pictures of trains or architecture. So some people get off on it or have to avert their eyes. If you like it, look. If you don't like it, don't. Our job is to do a good job of collecting it.
  • If I had to choose between post-hoc and pre-censored, I think I'd take post-hoc. Once the images stop even showing up, there's nothing to rationalize--you don't even know what you missed.
  • Again, just be clear about it: "We're offended by lots of pictures of naked women doing sexual things and random shots of men masturbating and we're not going to allow it to expand beyond a necessary minimum. It's porn, and we think too much of it is bad for the encyclopedia." If that holds up to critique, then you'll be on much firmer ground.
  • Either way, if you still want to censor bodies and sex acts, I don't think there's much particular need for any grand controversial image policy. Just use existing guidelines about redundancy and replication. If you have one great picture of a vulva, allow in 4 others for variety, but not 60 just for kicks. Call it curating rather than censoring. But please drop the 'intent to arouse' bit. Ocaasi 00:08, 23 September 2010 (UTC)

Definition of controversial

The authors say: "[Images] would receive this designation [controversial] when there is verifiable evidence that they have received this designation by the communities the projects serve. This evidence would include: existing legislative and legal frameworks around the distribution of said images; wide public debates in responsible forums concerning these types of images; articles and editorial opinion expressed about these types of images in notable journals and newspapers, etc. In our view, these tests will ensure that the bar to be admitted to the category of controversial image will be high (much higher than it is for the Wikipedias), and should remain so. In effect, our surmise is that only three categories of images can currently pass this test -- sexual images, violent images, and certain images considered sacred by one spiritual tradition or another."

This is absurd. Almost everything is "controversial" if the only bar for that is that there is some public debate or articles and editorial opinion in newspapers. Thus there is no end to the damage that such a policy would create. I am sure there are a lot of newspapers and public forums where exposure to atheism is deemed dangerous to children: should we consider images of atheists controversial? Given the huge debate on creationism, should we consider evolutionist content "controversial"?

There is no end to the amount of censorship and POV pushing that could arise from that. It is grotesque that so-called experts do not recognize that.

Also this is highly worrying: "and the intent of the image, to a reasonable person, is merely to arouse, not educate". Who is a "reasonable person"? Why can't something arousing be educative as well (for example: to illustrate, effectively, what people find arousing)? Should we take fetishes into account, or do fetishists not fall under "reasonable persons" (and if so, I think the authors are insulting a non-trivial part of the population)? Why should "reasonable persons" decide what is educative and what is not, instead of simply sticking to whether an image/video properly documents a notable thing/behaviour, regardless of what happens in our readers' genitals?

I am thoroughly worried and disgusted. --Cyclopia 15:42, 22 September 2010 (UTC)

Ssst, you are supposed to not ask and not to tell. TheDJ 20:19, 22 September 2010 (UTC)

A predictable error

The central error of these recommendations is that they imagine that Wikipedians can agree on what is sexual or non-sexual, violent or non-violent, controversial or non-controversial. This will not happen. The reason why censorship is imposed by dictatorial entities, whether they be authoritarian governments, authoritarian political parties, or corporations, is that there is no correct way to make these decisions. There must be one lone dictator arbitrarily ordering all the decision making, with no one permitted to call much attention to his inevitable inconsistencies.

Suppose we consider an image like File:Leaning on Barn Doors.png. No doubt there is some potential for arousal; but then again some will say it is art. There are after all less puritanical countries where breasts are simply one of the many aesthetically pleasing features of beautiful women, and not the object of some taboo. It is ironic indeed that you would say that to violate this taboo expresses a POV that a woman is a sexual object, when in fact the exact opposite is true. If you show a photograph of an Afghan woman's face, does that mean you regard her as a sexual object???

But here's the rub: suppose I go ahead and edit a category of the above image to say that no, this is not a sexual image, but simply an artistic one. Are you going to ban me from editing? I mean, I suppose you'd have to — otherwise your little rating system wouldn't be much of a system, now would it? You'd better be ready to have a thousand new admins, to arbitrate whether it is an offense against Wikipedia to tag Tom & Jerry cartoons as violence, or images of people smashing shop windows in a riot as nonviolence, or photos of low-cut jeans as sexual content, or images of cartoon characters humping as non-sexual. You'll need whole reams of new policy about how to characterize images of invertebrate sexual acts, and rule on whether the destruction of a Death Star is an act of terrorism. I suggest you set up a whole new namespace for this stuff, because you might find that most of the edits to Wikipedia are being made there.

You can pretend that this is an NPOV crusade, but that's a lie. There may indeed be many pictures of "you and your friends" on Commons, and no doubt many of these can be deleted. But by announcing a crusade against 3000 bare breasted photos, you're sending a message that there's something wrong with women's breasts but not with men's breasts. Which is not merely a POV, but a wrong one. Also note that your statement that "ethnographic" images will be spared from review conceals an old and well-worn racism, rooted in the idea that because non-white women are not valid sexual partners, that their images should not be viewed as pornographic. How is the image I linked above any less "ethnographic" than an image from darkest Africa, if the races are viewed as equals?

In response to your various points: 1. "The central error of these recommendations is that they imagine that Wikipedians can agree on what is sexual or non-sexual, violent or non-violent, controversial or non-controversial." If your point is to question how Wikimedians make decisions, and to note that there are differences of opinion among Wikimedians about things, I would guess that would place them among hundreds of other similar issues within the community, all of which use the various means that the community has devised over the years to mediate them. If your point is that such definitions are impossible to craft, I think that is not true. There can be, and are, definitions of sexual and violent content throughout the world. The proposed policy on Sexual content for Commons, for example, attempts such a definition. You say that not all Wikimedians can agree on such a definition. You may be right. But would not all Wikimedians agree that a depiction of human sexual intercourse, for instance, would be defined as a sexual image? I think they probably would. The fact that some content in the world resists easy definition does not mean that definitions are impossible. Let's talk within the community about your Leaning on Barn Doors image. Let's discuss how artistic it is (with the understanding that aesthetically pleasing and artistic are not synonyms). Nowhere have we suggested that current procedures around Requests for Deletion should be changed. In any case, our point of departure is the belief that Wikimedians can agree on what is educational – that is the definition we’re trying to establish and protect. If they cannot, then Commons: Scope lacks much meaning.
2. "You can pretend that this is an NPOV crusade, but that's a lie. There may indeed be many pictures of 'you and your friends' on Commons, and no doubt many of these can be deleted. But by announcing a crusade against 3000 bare breasted photos, you're sending a message that there's something wrong with women's breasts but not with men's breasts. Which is not merely a POV, but a wrong one." “Right” and “wrong” have nothing to do with designations of content within Commons. Educational and non-educational are the relevant criteria. It may be true that some people morally object to some of the images we’re suggesting be deleted from Commons. That does not mean that that is the rationale we are using for their deletion. If that were the case, we would more likely be suggesting the deletion of images of cock rings, ejaculating penises, shaved vulva, bondage – and we are not suggesting the deletion of any of these. And the reason is that we believe they have educational purpose – although the things they educate people about are disturbing to some. Morality is not the issue when it comes to application of scope. Educational purpose is. We have nominated certain kinds of images, which we have attempted to define as accurately as possible, as non-educational. Can that designation be challenged? Of course. But educational value is the only true test of inclusion – this, of course, is not our suggestion; it is current policy on Commons. If you want to make the argument that Wikimedians cannot agree on the definition of educational (as you have made above that they cannot agree on sexual, violent, or controversial), then we’re looking at quite a different set of challenges, I would think. Robertmharris 04:24, 23 September 2010 (UTC)


A modest counterproposal

Some basic ideas to salvage this debacle (not guaranteeing it is a sufficient set of demands):

  • All categories used for showing/hiding images in articles should be in userspace. I.e. Category:Wnt/Sexual images. These categories should not be edited by anyone other than the originator without permission, except to place them into other categories (so that someone can put my category inside theirs or vice versa). A solution will be needed to resolve looped categories gracefully, since some people will intentionally form rings so that any editor's list is included in all, and a bot will be needed to remove overcategorization from the original images (when one person's category includes another but both have categorized the image). I'm not saying this is very workable - just more workable than your idea. (A rough sketch of how the loop handling might work follows this list.)
  • The "single switch" to set viewing of such images implies that one single set of criteria has a special status. This could be established by a "featured filter" in which people choose a single category (originating from a single user, which may contain categories from other users in a hierarchy). By choosing a single category to feature, recognizing that the content of that category is up to the user, you can avoid individual debates over whether X is violent.
  • Anyone with a preferences page on Wikipedia should be able to change (and of course remove entirely) the category he wishes to use for this purpose - the choice should not be just filtered vs. unfiltered, but "filtered by Grace" versus "filtered by Ahmed" and so on. If this is not done, then Wikipedia puts itself in the position of making a site "safe" for one culture's taboos, but not for others.
  • There must remain a URL that feeds through to a completely unbowdlerized version of all content, so that third party sites can provide direct links to the desired target. This must work even for obscure things like discussion pages and historical versions of articles - i.e. the site should work exactly the same way for two URLs, except for one all this image-hiding is omitted by default. (While the proposal here doesn't explicitly say that the intent is to hide images by default for unregistered readers, I can't imagine it won't come to that).
  • The plan to remove taboo images of breasts should be abandoned. If not abandoned, it should include all random snapshots of friends and significant others, regardless of the clothing. In any case, a full list of images to delete must be publicized in advance, so that private photo archives are alerted to take these images and reproduce them.
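The loop problem mentioned in the first bullet is ordinary graph-cycle handling. What follows is a minimal sketch only, assuming user filter categories are stored as a simple mapping from a category name to the categories it includes; the names (one borrowed from the example above, plus a hypothetical "Grace") and the function are illustrations, not anything proposed in the study or implemented in MediaWiki.

    def expand_filter(category, includes, _seen=None):
        """Return every category reachable from `category`, skipping loops
        formed by categories that include each other ("rings")."""
        if _seen is None:
            _seen = set()
        if category in _seen:
            # Loop detected: this category was already expanded, so stop here.
            return set()
        _seen.add(category)
        reachable = {category}
        for sub in includes.get(category, []):
            reachable |= expand_filter(sub, includes, _seen)
        return reachable

    # Two users deliberately include each other's category, forming a ring.
    includes = {
        "Wnt/Sexual images": ["Grace/Filtered images"],
        "Grace/Filtered images": ["Wnt/Sexual images", "Grace/Violent images"],
        "Grace/Violent images": [],
    }
    print(sorted(expand_filter("Wnt/Sexual images", includes)))
    # ['Grace/Filtered images', 'Grace/Violent images', 'Wnt/Sexual images']

A bot doing the overcategorization cleanup described above could reuse the same expansion to find images tagged both with a category and with one of the categories it already includes.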
In response to this counter-proposal (which we welcome, as well as any others), I will just note that if we thought images should be hidden by default, we would have said so. In fact, we said the opposite as clearly as we knew how. Anything may come to anything, but to suggest that a proposal to do “x” is the same as a proposal to do the opposite of “x” is not entirely reasonable. Robertmharris 04:24, 23 September 2010 (UTC)


"Active curation"

This proposal suggests "that consideration be given by Commons editors and administrators to adopt policies of active curation within categories of images deemed controversial (sexual, violent, sacred) which would allow for restriction of numbers of images in a category, active commissioning of images deemed needed, but absent from the category (line drawings of sexual positions, eg.)". Now "active curation" here is a clear euphemism for censorship - I mean, can you imagine walking into an art gallery and finding the curator deciding that there are too many paintings of one type, so he's throwing some in the trash?

The actual wording here says "consideration". Well, people have considered these ideas before. And the response has always been, "Hell no" — we're not deleting images just because they're 'controversial', whether or not there are other images that can vaguely be described as similar. We should watch to see whether Part 3 interprets "consideration" as "mandate". Wnt 20:38, 22 September 2010 (UTC)

Wnt 19:26, 22 September 2010 (UTC)

You don’t have to wait for Part 3. Consideration means consideration; nothing more. It is just an idea we are suggesting be applied to the collections. And, as you note, it is an idea that has been brought up before, and we believe for a reason. To use your art gallery example, there might not be any curators throwing artworks in the trash, but I would be surprised if there was a curator in the world who, when approached by a potential donor, would not assess his or her current collection to see whether that collection could benefit from the donor’s offerings, and perhaps turn them down if it would not. To call that “censorship” seems to me to be straining at the definition of the word. And to take a case closer to home, we counted more than 1,000 images of penises in Commons, and unless we missed one, they were all white. A curated Commons might actively try to correct this imbalance, to make the collection more representative. We made the recommendation to increase the likelihood that this right might be given to Commons editors and administrators. Robertmharris 04:24, 23 September 2010 (UTC)

Comments by TheDJ on recommendations

  • Recommendations: Controversial Text: I fully agree on all counts. The one thing I do say is that a Wikijunior is a difficult thing to create, as has been proven in the past. This is largely because the target group is not capable of doing all the legwork, as in many other wikis. That means that teachers and other editors will need to be heavily involved. It's a worthy cause to aim for a Wikijunior, but it will take a lot of focus, focus that will impose a burden on the Foundation. Perhaps a "sister foundation" might be a better idea. We could donate some resources to them (server usage etc.), but there would be a separate focus for managing the project, promotion, cooperation with educators, etc.
  • Recommendations on “Controversial” Images
    • 4 This is the most problematic part of all the recommendations, as already explained very well, I have to say, by Wnt and Ocaasi. It is based on archaic traditionalist ideas and protectionism. It totally falls apart when we start comparing it with images of unveiled Afghan ladies, for instance. It is a Western viewpoint on "decency", and by my own personal stance on explicit content it is not acceptable.
    • 5 Fully agreed
    • 6 So we should limit the amount of pussy, if I understand this correctly, and encourage the creation of line drawings. I think we already do that; the amount of images we let in on sexual content is already trending towards "we want high quality material that is useful for illustrations". In my opinion that works much better than setting fixed numbers.
  • Recommendations on User-Controlled Viewing Options
    • I have to start with the fact that I was expecting these recommendations. I even partly argued for them myself over the past years. Filtering is totally acceptable to me, if the projects at large support that desire and I'm not bothered by them (meaning default off). I believe in making the choice that you don't want to see something. I have significant worries about the amount of time and resources that would have to be diverted towards such processes (both technical and community-wise, man-hours and pure cost). It adds to the bureaucracy and complexity of the website and its managing systems, and the question is whether we can handle that now that editor numbers are slipping. I'm not sure. But like I said, I can accept it as long as I don't have to participate in it.
    • 7 As long as I'm not bothered by it, it doesn't inhibit printing, and it doesn't break Google: no problem.
    • 8 This is the "click to reveal" routine. I note that this has significant accessibility risks (think mobile devices and the blind) which require appropriate attention in development. My other requirement is that individual projects (with the exception of Commons) reserve the right to "opt-out" of such schemes for anonymous users. (I have serious doubts if the Dutch community want their articles on the sexual revolution defaced by "click to show this image" blank spots. As a matter of fact, I think they and the Germans would seriously consider forking the projects in that case.)
    • 9 I'm confused on this one. Are we talking individual images, or just individual choices for "pre-made" (by Commons editors) sets of controversial content?
    • 10 I read this recommendation a few times and unfortunately it is still not clear to me what your intent is with this recommendation. As far as I know, it wouldn't be hard to create lists of images to censor already. I think this is more an issue of skills of the censors :D
    • 11 I cannot support this suggestion. Least astonishment has nothing to do with "surprise" towards the reader that he gets stuff he might not have been prepared for when he googled vagina. It is a purely stylistic element of writing a comprehensible article. It is about dosage, accuracy, to-the-point-ness and order of elements; it has nothing to do with inclusion and protecting feelings.

I share many of the concerns of Wnt. This is gonna be a warzone. There will be more discussion than useful contributions. And if such measures go too far, editors and readers will leave; some already left last time. These are the people who read, build and maintain this encyclopedia. Why do we care about including the fringe groups, who do not participate here, and endangering the participation of our core members? TheDJ 21:40, 22 September 2010 (UTC)

Judging by your comment on 8., I believe you may have misread 7 and 8. As written, they say,
  • "7. That a user-selected regime be established within all WMF projects, available to registered and non-registered users alike, that would place all in-scope sexual and violent images (organized using the current Commons category system) into a collapsible or other form of shuttered gallery with the selection of a single clearly-marked command (“under 12 button” or “NSFW” button).
  • 8. That no image be permanently denied any user by this regime, merely its appearance delayed, to prevent unexpected or unintentional exposure to images.
The way I read that, it means that the regime becomes activated after the user has clicked the opt-out button. --JN466 22:52, 22 September 2010 (UTC)
I'm not entirely sure; I find the section slightly confusing. That's why I stated that I cannot support hiding content by default. TheDJ 00:44, 23 September 2010 (UTC)
Just to be clear, what we are recommending is that, unlike images in sexual and violent categories, which are collapsible by readers (non-registered, general-public users of the projects) by the selection of a simple command visible on all project pages, other images deemed controversial (by the tests we’ve proposed, let’s take Images of Muhammad as an example) can only be collapsed by registering as a user, and making choices from a set of options presented on your user page (what those options should be – category by category, image by image – we haven’t said). However, we believe that the option to collapse an image should not be extended to all images in the projects, but limited to those we have, in effect, pre-selected for potential filtering (because they meet our definition of controversial). Same with potential third-party filters. We believe that we should not tag every image or bit of content in the projects, so that third-party filters can easily delay or delete content we believe should be open. We believe if the bias on the projects is towards freedom, we should not be fashioning tools for people to restrict that freedom, for themselves or anyone else, except in clearly-defined exceptional cases. Robertmharris 00:32, 23 September 2010 (UTC)
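Read that way, the two-tier scheme described above reduces to a simple per-image rule. The following is a minimal sketch of that reading only, not an implementation from the study; the category names, the preference argument, and the function itself are hypothetical illustrations.

    ALWAYS_COLLAPSIBLE = {"sexual", "violent"}  # collapsible via the single visible command

    def should_collapse(image_categories, shutter_enabled=False, user_prefs=None):
        """Decide whether an image is shown shuttered (click to reveal).

        shutter_enabled: the reader (registered or not) has selected the single
            clearly-marked command ("NSFW" button); the default is off.
        user_prefs: extra controversial categories a registered user has chosen
            to collapse on their preferences page (e.g. images of Muhammad).
        """
        cats = set(image_categories)
        if shutter_enabled and cats & ALWAYS_COLLAPSIBLE:
            return True
        if user_prefs and cats & set(user_prefs):
            return True
        return False  # nothing is ever denied, only its appearance delayed

    print(should_collapse({"sexual"}))                                  # False: default is off
    print(should_collapse({"sexual"}, shutter_enabled=True))            # True: reader opted in
    print(should_collapse({"images of Muhammad"},
                          user_prefs={"images of Muhammad"}))           # True: registered choice

Whether the registered-user options would be offered per category or per image is left open in the paragraph above.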
Sounds like a good compromise solution to me. Allow everyone, at their own discretion, to filter out sexual and violent stuff, if they don't want to see it, and allow registered users limited scope to filter out additional controversial content, like images of Muhammad that they don't want to see. --JN466 02:03, 23 September 2010 (UTC)
This sounds comparatively mild, except that I just read about "...policies of active curation within categories of images deemed controversial (sexual, violent, sacred)...", which are clearly not limited to what a single user sees.
Furthermore, there's a huge inconsistency between this and the Part I idea that controversy is objective and initiated by user complaints. What you seem to be saying here is that selected Western hot-button items will be concealed for non-registered users, but issues like Muhammad cartoons can never reach that level of treatment. That doesn't sound particularly stable, but for the time being it expresses a cultural bias. There are countries like India that have a very bad record on sexual censorship but don't think twice about posting photos of the dead from a train crash on the Web for everyone to look through and identify. See, e.g., external link from w:Gyaneshwari Express train derailment to the Indian government's site, which some people wanted to censor out of that article entirely due to gruesome content.
Believe it or not, the people "who read, build and maintain this encyclopedia" include many people who are offended by some images that are legal in the US. Such people are not a "fringe group" battling in a "warzone". If you view things that way I somehow doubt that you are interested in broadening Wikimedia's appeal beyond twenty-year-old white male Americans. Also, you should keep in mind that Wikimedia includes more than just "this encyclopedia", by which I assume you mean en.wiki. We are dealing with many people from many different cultures, backgrounds, age groups, etc. on several hundred different projects. Kaldari 00:20, 23 September 2010 (UTC)
As I said, those are my personal principles; they are not about how we actually run the projects, but about how I think we should run the projects. It's my opinion; I'm professing it, as much as Privatemusings has been professing his opinions for years now. And yes, some of my ideals do trump participation of a wider audience I guess, but so be it. We started the project on ideals; I see no reason to change that. TheDJ 00:34, 23 September 2010 (UTC)

Image deletion is not permanent

Image deletion is not permanent. Images can be undeleted, just like wiki pages. This was not the case when Commons was started, but has been for several years now. -- Duesentrieb 21:58, 22 September 2010 (UTC)

Good point. I do note that Commons deletion in general is seen as rather disruptive because images are automatically removed from many projects. Restoring an image is easy, but restoring the way an image was used is more difficult (and gets harder as time progresses due to changing article content). TheDJ 22:01, 22 September 2010 (UTC)
They can only be undeleted by admin staff; and unlike earlier article revisions in a Wikipedia, they can only be seen by admins as well. --JN466 22:54, 22 September 2010 (UTC)