Talk:Image filter referendum/en/Other


Concerns

Hyphenation

...opt-in personal image hiding feature...

Let me suggest that as soon as possible, we hyphenate image-hiding, so as to clarify meaning, especially in the Wikipedia banner announcement. When I first saw this a few minutes ago, I thought it had to do with a feature to hide your personal images. I think that ...opt-in personal image-hiding feature... does the trick. KConWiki 21:11, 27 August 2011 (UTC)[reply]

Slippery slope danger

This is how all censorship starts... slowly. First it's opt-in, then opt-out, and then it will be mandatory at the request of foreign governments / organizations, since the mechanism is already in place. Very dangerous. If people want to filter, fine, do it on your own PC. Don't give organizations / governments the opportunity to easily implement a blacklist for their people. Koektrommel 07:58, 3 July 2011 (UTC)[reply]

That's a huge extrapolation based on a very preliminary step - and not one that I think is justified. Given that it would already be technically feasible (and easy) for foreign governments to apply such censorship if they wanted (e.g. see the Great Firewall of China or the Internet Watch Foundation), as well as the finances for technical development available to governments / organisations in favour of censorship (which are several orders of magnitude larger than those available to Wikimedia), I seriously doubt that this would make it significantly easier for them to implement... Mike Peel 20:22, 4 July 2011 (UTC)[reply]
I don't think that it is within Wikipedia's goals to provide a tool that makes the job of censors easier. Ipsign 06:57, 16 August 2011 (UTC)[reply]

Esto no es una herramienta que aumente la libertad, sino que la recorta. De aprobarse será la propia wikipedia, que no censura contenidos ni temáticas, la que tristemente proporcione el instrumento de censura a los censuradores. Lamentable. Wikisilki 19:01, 5 July 2011 (UTC)[reply]

Translation: "This is not a tool to increase freedom, but to limit it. If approved, it will be Wikipedia, which does not censor content or themes, which sadly provides a tool for censoring to the censors. Lamentable." —translated by Sj.
Again, I must very forcefully say that it should not be the goal of Wikipedia to champion "libertad", but to educate. To deny education to the people who need it the most (i.e. children in developing nations that often have highly moralizing, oppressive cultures) because you find it distasteful to implement a voluntary, opt-in method for hiding images - when the lack of such a method might preclude large swaths of people from using Wikipedia because their cultures don't allow it - is absolutely vile and idiotic. Unigolyn 08:59, 18 August 2011 (UTC)[reply]

I concur. Implementing this feature would clearly be the first step towards censorship on Wikipedia. The next step along this way will be to block whole pages (for example, by adding the category 'sexuality'), which will allow schools to inject a cookie with 'Display-Settings: sexuality=denied' and block such content for all their students. While that might be the intention, it won't stop there: the next step will be to create a category 'Display-Settings: opposed-by-chinese-government=denied' and mark Tiananmen Square protests of 1989 with it, which will effectively allow the Chinese government to simply inject the appropriate cookie at the firewall and block content which they don't want Chinese people to see. It is also *very important* to understand that limiting the feature only to 'logged-in' users will *not* fix this problem (regardless of implementation, it will be *very easy* to enforce at the firewall level that all users from China are always logged in as the very same user). Ipsign 06:57, 16 August 2011 (UTC)[reply]
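
(To make the cookie-injection concern above concrete, here is a minimal sketch of a transparent proxy that forces a filter preference onto every reader who passes through it. The cookie name and value are invented for illustration - the proposal has not specified how preferences would be stored - but the point stands for any preference carried in a cookie: a middlebox can forge it, and the server cannot tell a forged preference from a genuine one.)

```typescript
// Hypothetical censoring proxy (e.g. at an ISP or national firewall).
// It forwards every HTTP request unchanged, except that it appends a
// forged filter-preference cookie, so the wiki serves a "filtered" page
// as if the reader had asked for one. All names here are invented.
import * as http from "http";

http.createServer((clientReq, clientRes) => {
  const headers = { ...clientReq.headers };
  // Append the forged preference to whatever cookies the user already sends.
  headers.cookie = headers.cookie
    ? `${headers.cookie}; imgFilter=sexuality-denied`
    : "imgFilter=sexuality-denied";

  const upstream = http.request(
    {
      host: clientReq.headers.host, // forward to the site the user asked for
      path: clientReq.url,
      method: clientReq.method,
      headers,
    },
    (upstreamRes) => {
      clientRes.writeHead(upstreamRes.statusCode ?? 502, upstreamRes.headers);
      upstreamRes.pipe(clientRes); // relay the (now filtered) response
    }
  );
  clientReq.pipe(upstream);
}).listen(8080); // all outbound web traffic gets routed through here
```

(Whether this actually makes censorship easier is exactly what the replies below dispute: a middlebox in this position could already block images outright, without any help from a server-side filter.)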

It sets a precedent that I really don't want to see on Wikipedia. --Dia^ 07:54, 16 August 2011 (UTC)[reply]

I don't want that tool; it's the key to introducing censorship. If a certain image cannot be deleted, it will be added to all possible categories to prevent its presentation as far as possible. If someone has problems with uncensored images, he/she should read only their own publications.— The preceding unsigned comment was added by an unspecified user

Thanks, Ipsign and others, for illustrating the slippery slope fallacy. There is no logical reason why this would be extrapolated to tagging subjects blocked by China. In addition, China has already blocked that article and others in the past, so the tagging would be years too late to serve any purpose whatsoever. There are specific reasons given by many people why we should allow users to opt-out of seeing specific types of images. There is no reason why we would then apply this setting to aid censorship by governments. It's illogical FUD, not rational opposition. Sxeptomaniac 22:30, 17 August 2011 (UTC)[reply]
Just because one uses the phrase "slippery slope" doesn't automatically make it a fallacy. Like in all political issues, setting a precedent - that Wikimedia is willing to accommodate offensivists - does make subsequent steps toward the bottom of the slope easier. Why allow personal sensitivities to influence what pics get displayed in Naturism, but not allow "national sensitivities" to do the same? 41.185.167.199 12:44, 20 August 2011 (UTC) (a.k.a Bernd Jendrissek)[reply]
It's a slippery slope fallacy if you can't give good reasons why the supposed chain of events will happen. There are some gray areas, but the very fact that you can use "personal sensitivities" vs. "national sensitivities" works against the fearmongering. Sxeptomaniac 22:42, 22 August 2011 (UTC)[reply]

Can't complete form: "not eligible", and I've tried en, de, nl and give up.

I am utterly AGAINST the proposal! - It reeks of censorship! Unless it's a filter to be built in/activated by adults (on their own computer - and ONLY there) for the protection of minors.

Lots of zeros and one question mark as my reply.

Thus "?" I answer question # 6; for "culturally neutral" is an oxymoron. Culture is diverse by nature & definition. To conjure oxygen into carbon dioxide (by magic) while breathing is about all we humans have in common.

Hence my unease at the VERY suggestive example in question # 5 (condone the "ferocity of brute force" versus proscribe a "state of undress") redolent of the nauseating puritanism which abhors all that could ever so remotely hint at sexuality (or indeed the corporeal).

Apart from moral-mongering religious fanatics (atheists included) there is the problem of definition.

"Moral" (from: Latin mores=customs): where and when? Same for "Nude": is that my freshly shaven face (half an hour ago I rid myself of a three days' beard) or only when I was actually in the shower, unclad, naturally (naturely). "Obscene" is another one. Can I come to the conclusion that I find the description of π "Offensive"? I've already mentioned "Culture" and could prattle likewise for days on end.

In my not so humble opinion there is no slippery slope: there is a precipice and, although it might take a while, we, along with wikipedia, shall splatter. Sintermerte 18:24, 22 August 2011 (UTC)[reply]

Once you're done hyperventilating, look up "censorship" in the dictionary you used to look up "moral" and get back to us. As far as "religious fanaticism", the panicky opposition to this proposal fits the definition far better. Sxeptomaniac 22:42, 22 August 2011 (UTC)[reply]

Abuses by censors: self-censorship seems to open the door to 3rd-party censorship

From other comments on this page, there seems to be severe confusion about the intention of the feature and about its potential (and IMHO very likely) abuses. One common argument for this feature is that it is self-censorship, which (unlike 3rd-party censorship) looks like a good thing. Unfortunately, the available technical solutions have severe implications. It seems that this feature will almost inevitably be based on so-called cookies. That will essentially allow any man-in-the-middle who sits between the end-user and the Wikipedia server (usually the ISP, which can be controlled by a government / private company with a certain agenda / ...) to pretend that it was the end-user who decided not to show controversial images, and the Wikipedia servers won't be able to detect the difference (the only way I know to prevent such an attack is SSL, and even that can be abused in this particular case - the 'man in the middle' can intercept SSL too, and while the user will know that the server certificate is wrong, he won't have any other option, so he'll need to live with it). It means that by enabling users to filter themselves out, we will also be enabling 'in the middle' censors to make filtering much easier than they're doing it now (it will also probably mean less legal protection against censorship: for example, if somebody filters out Wikipedia images in the US right now, they are likely committing copyright infringement - with a 'fair use' defense being quite uncertain - but if it is the Wikipedia servers doing the filtering, the copyright argument evaporates, making censorship much more legal). This is certainly not a good idea IMHO. Ipsign 07:59, 16 August 2011 (UTC)[reply]

In order to distinguish between images, they have to be categorized. And there you will find the censors. --Eingangskontrolle 08:47, 16 August 2011 (UTC)[reply]

Yes, even if this is intended for personal use, it will in fact be used by network managers and ISPs to censor Wikipedia for their users. MakeBelieveMonster 11:50, 16 August 2011 (UTC)[reply]
Okay, so your scenario here is that a malevolent entity, with government-like powers, does broad-based man-in-the-middle attacks in order to... turn on a filtering system, where the user can get around the filter just with one click? There's nothing we can do about MITM attacks (other than recommending SSL, so at least you know you've been compromised) and if there is someone capable of an MITM attack then they are not going to use our click-through-to-see filtering system, they will just block content, period. NeilK 17:52, 16 August 2011 (UTC)[reply]
FYI, the world is full of malevolent entities with government-like powers. They're called governments. (Brrrrummmm-Chhhh) ;)
The fundamental problem isn't the optional filtering switch. It's the background work on establishing a censorship-friendly categorisation system that can be read and reused by outside software to impose third-party filtering; the willingness to include the switch; the difficulty that Wikipedia would then have in resisting legal demands to extend the censorship system once all the infrastructure was in place; and the crippling legal liabilities it might face if it had already broken its own principles on non-censorship, had the technical ability to impose broader censorship, and didn't use it. If WP implements a purely optional censorship system, then the next time some kid blows up their high school, the lawyers will be poring over the kid's internet records to see if the kid ever accessed anything at all on WP about explosives, and if so, they'll be launching a lawsuit on the grounds that WP had all the software in place that would have let it block the content, and had chosen not to use it.
If you show that you have the machinery and willingness to edit, and you choose not to edit, then you tend to lose the "simple carrier"-style assumed absence of liability for the material that you carry. That's why it's critical to stick by the principle that your charter forbids censorship, and why it's so important not to develop these systems. It gives you the defence that you are forbidden to censor by your charter, and couldn't technically do it even if you wanted to. Introducing this system would remove both those defences and introduce a whole new set of potential vulnerabilities and liabilities for WP that I don't think the committee have really thought through. ErkDemon 00:01, 18 August 2011 (UTC)[reply]
There are no plans for any "background work on establishing a censorship-friendly categorisation system". The plan is to use the existing categorisation system, and say (for example) that this user does not choose to see images in this list of categories, including Commons:Category:Abused animals by type (and whatever else seems relevant). If a third party wanted to use these categories to censor images, they could have been doing that years ago. They do not need any changes at all to enable them to do this. WhatamIdoing 00:59, 24 August 2011 (UTC)[reply]
No. What they are saying is that all it takes is one malevolent user to inappropriately tag harmless content with labels that will trigger the censorship system. Honestly, if you are liable to be offended by images on a given subject then you wouldn't really be viewing such an article in the first place. If you do encounter an 'offensive' image it is probably because a troll has uploaded something inappropriate to an article whose subject suits your sensibilities, and I don't think they would be helpful enough to censor-tag their work for you. --2.26.246.148 20:52, 17 August 2011 (UTC)[reply]
"That's naive though. Suppose that I need to learn about Naturalism (art) - so I type in "Naturism" - and find (to my horror), not some crappy porcelain figurine of a shepherd, but instead, right there at the very top of the page, to the right of the lede - a picture of half a dozen naked people. A simple typo or misunderstanding of a term (that the reader is looking up in an encyclopedia precisely because they don't yet understand it) can result in such problems. Consider that there are some companies out there who might fire someone for looking at the picture at the top of Naturism on company time when they are supposed to be researching Rococo figurines! So precisely because I'm against censorship of Wikipedia - and because I demand the right to have full-frontal pictures of nudists at the top of the naturist page - I'd very much like a "No Porn Please! filter." SteveBaker 19:10, 17 August 2011 (UTC)" Not to mention ErkDemon's point just above yours. Sdmitch16 02:50, 19 August 2011 (UTC)[reply]

So we start with censorship of nudity and cruel pictures, and where do we end? Political and religious censorship will follow soon. This is not really self-censorship, because someone will have to add a tag to each picture to categorize it. That is where the censorship already starts. 94.217.111.189 06:30, 21 August 2011 (UTC)[reply]


Copyright concerns

Forgive me if I have got the concepts wrong, but the way I see the licensing is that an ISP may not alter the text without providing the original source (or a link to it). If we detect a middleman tampering with cookies to deny users the ability to see these images on a permanent basis, should we pursue those people for copyright violation? And if not, does that mean the WMF CC license is unenforceable, and pointless? Obviously data manipulation could be happening already, but a cookie-based solution is an open invitation for low-level tampering. Inductiveload 17:44, 16 August 2011 (UTC)[reply]

Technically, such an ISP would not alter what is sent by the WP servers, but would alter what the user requests. No different from secretly redirecting (HTTP or even DNS) requests for wikipedia.org to censopedia.example, say (which is of course also not a nice thing to do). On the other hand, even now I can imagine that the content of w:EICAR test file might not arrive in its original unfiltered form at every user's PC, depending on anti-virus policies along the way - a thing I am reluctant to call copyright infringement.--Hagman 13:56, 21 August 2011 (UTC)[reply]
No, hiding an image is not a copyright infringement. If you do not show material you do not need the copyrights to that material. What would be infringement is showing an image, but making the description page inaccessible, or showing articles but hiding the license and blocking the history. Zanaq 17:26, 23 August 2011 (UTC)[reply]
Thank you for the clarification, that does make sense now. Inductiveload 01:06, 26 August 2011 (UTC)[reply]
Copy+rights do not exist in the United States. Copyrites regulate the copying ritual; -NOT a RIGHT-.

Visual artists should be free to determine what is prejudicial to their honor in their belief according to the artist's culture. I support not allowing third-party sites to redisplay ANY visual art and the Eighth Circuit will examine this starting next month in the United States. Self-censorship is a fundamental human right in my belief stated in my Appellant Brief. CurtisNeeley 21:24, 27 August 2011 (UTC)[reply]

Conservative POV bias

[Mock-up of a filtered image: "Kruzifix" - Image hidden by Wikipedia opt-in personal image filter. This image is categorized as: religious POV (christian); health advice (dangerous for vampires); contravention of "Thou shalt not make thee any graven image" (Exodus 20:4); torture; corpse; barely clothed male person -- the categories you blocked are marked yellow -- click link to show image]

Any system that attempts to distinguish one image from the next will require individual labelling. In turn, any such effort will, I argue, become dominated by a conservative cultural and religious POV.

I would posit, for example, that the vast majority of images labelled as 'objectionable' will be ones relating to human sexuality, and will focus in particular on what social conservatives consider 'aberrant'. It sickens me to think that every depiction of homosexuality, the naked human form or gender non-conformity, whether sexually explicit or not, may have to bear a stigmatising label reflecting a particular prejudice. Such labelling may in turn help trigger external filtering software that already reflects a conservative bias, thus further harming access to Wikipedia.

While I fully admit the global political left is far from having consensus views on what is acceptable free expression, the number of images that might fall afoul of "political correctness" or "hate speech" concepts will scarcely compare to the scope and willingness of social conservatives to push their POV. I offer Conservapedia as evidence of how far they will take their agenda, and this current proposal is simply opening the door and inviting them in. Added to this situation is the irony that many forms of left censorship are on behalf of cultural and religious conservatism stemming from the developing world, the most prominent example being the self-censoring response to 'offensive' images of Mohammed.

Both coming and going, an image tagging effort will be a boon for conservative views and a blow against the disinterested presentation of knowledge. --Bodhislutva 04:09, 21 August 2011 (UTC)[reply]

I agree with your assessment, and I'd like to add that the idea of a catch-all category for "NSFW"/"objectionable content" is thoroughly incompatible with the notion of neutrality (which entails cultural neutrality). The idea, implicitly and explicitly professed by several people on this talk page, to actually host and maintain such a category or categories within Wikimedia is bordering on the obscene. --87.78.45.196 04:44, 21 August 2011 (UTC)[reply]

We could introduce a Dutch POV bias instead? Images relating to Firearms and (most) fireworks blocked, some nudity permitted, sex blocked. (Is that roughly correct?). We could differentiate between different political leanings if you'd like. It'd be interesting to compare with what's objectionable or not in other countries. O:-) --Kim Bruning 14:08, 23 August 2011 (UTC)[reply]

"It'd be interesting to compare with what's objectionable or not in other countries." It wouldn't just be interesting, it would be _fascinating_. If we wanted to wait long enough, we could probably find scientists willing to fund us just to collect that data. It's so fascinating, it might just be worth doing. :) --AlecMeta 23:36, 23 August 2011 (UTC)[reply]

Which images are likely to be subject to filtering?

Is [this image] included in 'nudity', given that it's a block of stone and not a live human, and a classic work of art at that? Is [this one] included in 'sex', given that they're not having sex, but they are kissing while naked? Is [this one] included in 'nudity', given that everything is visible except a very small area of gluteal cleft? Is [this one] included, given that a small area of gluteal cleft is visible? Are [these two] included in 'gruesome medical images'? Some people are pretty squeamish, you know? [This one] makes me wince because I can tell it must have hurt, so will I be permitted to hide it? Does [this one] count as nudity? What about [this one]? What does an image of semen come (no pun intended) under, if it is not a sexual image, but a simple depiction of white fluid in a puddle? If it counts as sexual, do microscopic images of sperm? If not, at what level of zoom are the harmful sex rays attenuated? Beorhtwulf 15:53, 23 August 2011 (UTC)[reply]

I don't see how we could keep such images out of such filter categories. With just a few filter categories, they'll inevitably be interpreted as widely as possible. If someone says "This offends me and should be filtered", we have no fair or objective way to say "no, your offense doesn't count".
If you dislike filters, there is an upside-- almost no one would use these kinds of filters because of how wide they would become. --AlecMeta 23:31, 23 August 2011 (UTC)[reply]
In terms of a "sex and nudity" filter, I seriously doubt that anything in Commons:Category:Nudes in art would be on the list. Photographs of humans with exposed genitals probably would; it's very likely that the hundreds of young-white-male vanity photos at Commons:Category:Human penis would be. We can take the simple and very likely welcome step of filtering Commons:Category:Photographs of sexual intercourse without blocking every image that shows a little bit of skin or has some remote connection to reproduction.
In short, there's no need to assume that the very people who have permitted and even encouraged sexually explicit photographs for years are somehow going to turn into repressive, censorious idiots now. WhatamIdoing 17:57, 24 August 2011 (UTC)[reply]
But what would happen when, inevitably, people do want to add something from Commons:Category:Nudes in art to a filter? Both "Nude" and "Art" are very fuzzy concepts that defy definition, but "This Offends Me" is a simple concept.
I agree with you, our existing community wouldn't morph into censorious crusaders-- our existing community would probably just ignore the filters. But allowing filters might bring whole new groups of editors here-- the kinds of editors who do like censorship and do want to spend all day classifying images into appropriate categories. Tomorrow's editors won't necessarily share the values of yesterday's community, and we have to anticipate that in our thinking.
Image categories haven't been controversial because they haven't been attached to filters. If we created a limited number of filter categories, it seems to my mind they would be eternally controversial, with a trend toward over-filtration. --AlecMeta 04:17, 25 August 2011 (UTC)[reply]
Fundamentally, I trust that the community is, and will always be, able to handle this. We've already figured out what belongs in that category. I don't think that we're going to tend towards over-filtration, and even if we did, the problem is self-correcting: the more false positives that get picked up, the more likely people will be to turn the tool off altogether.
The part that you don't seem to quite grasp is that nobody's proposing "creating a limited number of filter categories". The proposal is to use the existing category tree, not to create new categories. The questions look like, "Shall we put the five-year-old Category:Nudes in art on the list of things that are filtered under a tick box labeled 'sex and porn'?" (Very likely answer: no.) The questions that the community will face do not look like, "Shall we make a brand-new category of 'Things that offend me'?" WhatamIdoing 17:18, 25 August 2011 (UTC)[reply]
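
(The scheme WhatamIdoing describes above is simple to state precisely: a handful of tick boxes, each mapped to a list of existing categories, with an image hidden only when one of its categories is on a ticked list. The sketch below reuses category names mentioned in this thread; the group labels, data structure, and function names are invented for illustration and are not part of the actual proposal.)

```typescript
// A sketch of "tick boxes mapped to existing categories", as described
// above. Group labels and names are invented; the categories are real
// examples from this discussion.
const FILTER_GROUPS: Record<string, string[]> = {
  "sex and porn": [
    "Category:Photographs of sexual intercourse",
    "Category:Human penis",
  ],
  "gruesome images": ["Category:Abused animals by type"],
};

// Hide an image only if one of its categories is in a group the reader
// has ticked. (A fuller version would also walk ancestor categories.)
function shouldHide(imageCategories: string[], tickedGroups: string[]): boolean {
  const blocked = new Set(tickedGroups.flatMap((g) => FILTER_GROUPS[g] ?? []));
  return imageCategories.some((c) => blocked.has(c));
}

// A reader who ticked "sex and porn" but nothing else:
console.log(shouldHide(["Category:Nudes in art"], ["sex and porn"])); // false
console.log(shouldHide(["Category:Human penis"], ["sex and porn"])); // true
```

(Note that the logic itself is trivial; everything contentious in this thread lives in the lists - that is, in which categories end up mapped to which tick box.)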

Other

Is the 'Beyond-Perfect' proposal controversial?

I support the proposed filter as 'good enough' for complex reasons. But I'm also very interested in understanding where others stand on the issues.

What I call the 'perfect' system would be:

  • Infinitely Customizable: each user can specify any list of images or categories.
  • 1-Click-To-View-at-all-times: images are never denied, just temporarily hidden.
  • Only upon user request: only the user can turn on the image-hiding feature.

My 'Beyond-Perfect' system would also have two additional features:

  • Not developed or promoted by WMF: no WMF resources go to support it.
  • Active only on non-WMF servers / clients.

Is there anyone who finds filtering so controversial that they'd 'Vote No' on what I've called the 'Beyond-Perfect' proposal? --AlecMeta 22:11, 19 August 2011 (UTC)[reply]
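
(As a minimal sketch only: here is what the three 'perfect' properties above might look like as client-side logic, assuming an invented convention in which each image carries its category list in a data-categories attribute. None of this is the proposal's actual design.)

```typescript
// Sketch of the three 'perfect' properties: opt-in (nothing hidden by
// default), customizable (any set of categories), and 1-click-to-view
// (the image is replaced by a placeholder, never removed). All names
// here are invented; this is not the proposal's actual code.
function applyFilter(hiddenCategories: Set<string>): void {
  if (hiddenCategories.size === 0) return; // opt-in: default shows everything

  document
    .querySelectorAll<HTMLImageElement>("img[data-categories]")
    .forEach((img) => {
      const cats = (img.dataset.categories ?? "").split("|");
      if (!cats.some((c) => hiddenCategories.has(c))) return;

      const placeholder = document.createElement("button");
      placeholder.textContent = "Image hidden by your filter - click to show";
      placeholder.addEventListener(
        "click",
        () => placeholder.replaceWith(img), // one click restores the image
        { once: true }
      );
      img.replaceWith(placeholder);
    });
}
```

(Samsara's reply below applies directly to this sketch: everything here runs on the client, so a locked-down machine or network that strips JavaScript or cookies can defeat the one-click reveal.)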

The problem is that it's circumventable. You can lock down what a user is logged in as, you can prevent them from changing that cookie, you can deprive them of JavaScript, and you can stop them from accessing specific parts of a website that allow undoing the filter, thus preventing your "1-click" from working. Samsara 22:52, 19 August 2011 (UTC)[reply]
Making it easier for person A to censor person B's reading habits, that's definitely controversial.
I'm curious how controversial this is in the ideal, setting aside slippery slopes. Do we agree a reader 'has the right' to 'censor' their own reading, or is that itself a source of controversy? --AlecMeta 02:40, 20 August 2011 (UTC)[reply]
This is an interesting area. It's probably in the class of things where it is a good thing for the individual but a bad thing for society as a whole. The reason it is bad is that it avoids challenging preconceptions with fact. This is part of the concern, I think, of the "filterbubble" meme (I haven't read the book). So does the reader "have the right"? Yes: it's in the nature of the license that they have the right to create derivative (censored) works. But nonetheless we individually and as a community have views about those derivative works. We are generally pretty relaxed about "Wikipedia for Schools" and Wikibooks, for example. We are not so happy when people "print on demand" an almost random and un-quality-controlled set of articles as a "book", for a ridiculously high price - using our work to take money for an inferior product. We would probably be deeply disturbed if, for example, Holocaust revisionists created a version of Wikipedia that excluded or distorted all mentions of concentration camps. They would of course have the right to do it (in most countries), but we certainly would not wish to be a purposeful enabler of it. Rich Farmbrough 03:01 20 August 2011 (GMT).
I honestly don't see anything relevant to the filter. Nazis could still create their own encyclopedia using Wikipedia materials selectively, without the filter. -- Sameboat (talk) 03:22, 20 August 2011 (UTC)[reply]
"We certainly would not wish to be a purposeful enabler of it." And in fact the report recognises this risk and warns against it in another recommendation in the same section: Recommendation 10, which I encourage all to read. Rich Farmbrough 15:17 20 August 2011 (GMT).
And censors can create their own copy on their own servers, paid for with their own money. But money donated for free content should not be used for this. --Bahnmoeller 16:09, 20 August 2011 (UTC)[reply]
There is much wisdom in the concerns about resource use. I donated knowing this was still in the works. Most of our donors probably had no idea it was coming. In May 2010, when I had donated to 'protect wikipedia' and it was briefly announced we would start deleting offensive images-- in that case, I absolutely felt a sense of 'betrayal'-- that's not what I had donated to; indeed, it was the exact opposite of what I had donated to.
I think the price tag will be very small, but we have to be very careful about resource usage. The language we use to solicit donations is 100% antithetical to the language we used in the image deletions. We need a 'sense' of how "average donors" feel. If someone gave us the system for free, I'm comfy with us using it. But using large chunks of our own funds on this particular project may not sit well. It's okay by me, but I can't speak for all the donors, and most importantly, I have no clue at all how this policy will affect future donations-- positively or negatively.
Donors don't 'own' us, they're not 'entitled' to dictate our future, but realistically, if they wanted to build a filter, they probably would have given their time and money to someone else, and that's very much something to consider. --AlecMeta 14:47, 23 August 2011 (UTC)[reply]

Why did it take you so long

to add a feature that will allow me to hide the pornographic images on Wikipedia; somehow there is no Firefox add-on to do this yet (at least not an effective one). Las tortugas confusas 13:55, 20 August 2011 (UTC)[reply]

Because one person's pornography is often another's art -- and vice versa; there is no commonly agreed-upon definition of "pornography". One example is that Oregon, which is part of the United States, has no legal definition for the word "obscenity" due to the free speech clause in its state constitution. And since there is no simple solution for this issue, it hasn't been tackled -- & many of us are now convinced that any solution would be worse than doing nothing. -- Llywrch 16:22, 20 August 2011 (UTC)[reply]
"Because one person's pornography is often another's art " Which is why people can turn it on or off. Ottava Rima (talk) 02:49, 21 August 2011 (UTC)[reply]
So you support my right to filter out paintings of dead babies representing the Christ child, or of the Crucifixion which are horrifyingly graphic? And are you willing to defend my right against those who don't understand my dislike for these images? -- Llywrch 03:22, 22 August 2011 (UTC)[reply]
I'd just like to know where on wikipedia people are finding porn. I've been actively looking, and can't seem to come across any, so it's hard for me to imagine that someone would just stumble upon some without intention. I would argue the 'problem' this filter is aiming to 'fix' isn't really a 'problem' at all. It is something that this site (and every other content provider on the internet) simply should not be concerning themselves with. Self censorship is up to the individual and it's up to that individual to figure out how to make it work... or simply don't use the content provider. That's your choice. Pothed 17:13, 20 August 2011 (UTC)[reply]
There is stuff that some people might classify as porn in Commons (e.g., pictures of genitalia, images of people from porn websites, probably even images of people engaging in some form of sexual intercourse); it's one of those problems which you only solve if you know the answer first. And let's be frank: if you're serious about finding porn, searching on tumblr is a better bet. (Hell, even a Google search is a better bet than sifting thru all of the possible categories in Commons.) -- Llywrch 22:28, 20 August 2011 (UTC)[reply]
And someone who doesn't want to have any pictures of any nudity on their computer? Too bad for them? If their country bans it or they are at work, then too bad? We need to make sure that as few users as possible are able to access Wikipedia because you want to make a political statement against those you feel aren't as "liberated" as you? That is really selfish and incivil. Ottava Rima (talk) 02:50, 21 August 2011 (UTC)[reply]
And if someone thinks wearing a skimpy bikini constitutes nudity (showing too much skin), won't they have the right to be upset if they find those images aren't filtered out? "Why are you using their standard of nudity and not mine?" Evil saltine 21:25, 21 August 2011 (UTC)[reply]
Why do you want to force someone to look at something they don't want to look at? Ottava Rima (talk) 21:44, 21 August 2011 (UTC)[reply]
Who is "forcing" anybody to use Wikipedia? --87.79.214.168 21:47, 21 August 2011 (UTC)[reply]
I don't. I support optional filtering based on the image categories we already have, not creating a whole new set. Evil saltine 21:52, 21 August 2011 (UTC)[reply]
It has been pointed out that the current set of categories will be used for the new set of categories. We already have the images put into the specific areas. Ottava Rima (talk) 21:54, 21 August 2011 (UTC)[reply]
OK. So activating the "Nudity" filter will hide all images under Category:Nudity (and its subcategories)? That seems to make sense. Evil saltine 22:06, 21 August 2011 (UTC)[reply]
What filter will hide all images of homosexuals, all images of interracial couples or all images containing background persons wearing skimpy swimsuits or "blasphemous" T-shirts (a point raised by Trystan)? Which current categories cover those? —David Levy 22:32, 21 August 2011 (UTC)[reply]
There are already categories for homosexuality, swimsuit, etc. The article page shows that even a Pokemon image could be hidden. Ottava Rima (talk) 22:35, 21 August 2011 (UTC)[reply]
Firstly, are you under the impression that every image of a homosexual person is categorized as such? That isn't close to true. Images directly pertaining to homosexuality (e.g. a photograph of a gay pride parade) fall under such categories, but we don't place "homosexual" tags on images simply because their subjects are gay. The planned system, however, will require us to do so (unless we reject the widespread belief that such images are objectionable).
Secondly, you appear to have overlooked the word "background." A photograph with an incidental person in the background isn't categorized accordingly. The planned system, however, will require us to do that. (See #Negative effects of using categories as warning labels on the usefulness of categories for a more detailed description.) —David Levy 23:09, 21 August 2011 (UTC)[reply]
Are you honestly trying to say that a picture about "homosexuality" is the same as a guy in a picture who may or may not be gay? Wow, that is really weird. Ottava Rima (talk) 23:44, 21 August 2011 (UTC)[reply]
No. That the two differ (and the latter currently isn't categorized) is my point. Many people object to the existence of homosexual people (such as Elton John, who's openly gay) and don't wish to be exposed (or have their children exposed) to the sight of one. This is one of countless "potentially objectionable" image subjects not covered by any category currently in use. Other examples have been cited. —David Levy 00:19, 22 August 2011 (UTC)[reply]
Who, other than you, has "pointed this out"? It simply isn't true (or even feasible). The current image categories are neither designed to label objectionable content (much of which is incidental, and therefore not categorized) nor presentable in such a context (due to their sheer quantity, among other issues). —David Levy 22:32, 21 August 2011 (UTC)[reply]
After the porn dispute on Commons, the categories were reworked specifically to put the various types of pornography and other content together so that excess images would be removed. That has been a major issue over there for the past year or so. Where have you been? Ottava Rima (talk) 22:35, 21 August 2011 (UTC)[reply]
You're missing the point. No one is suggesting that such categories won't play a role in the planned filter system. —David Levy 23:09, 21 August 2011 (UTC)[reply]
(Responding to Ottava Rima @ 02:50, 21 August 2011) If someone doesn't want any pictures of nudity on her/his computer after reading Wikipedia, then that person should be careful about which articles to read in the first place. For example, I've worked on a number of articles about 5th, 6th & 7th century European history & didn't encounter one picture of a nude person. Hell, I've read or worked on a couple thousand articles relating to Ethiopia & not once did I encounter a picture of a nude person. Not even the pictures of topless native women which made National Geographic so popular with grade school boys back in my youth -- before the Internet brought pictures of boobies a few mouse-clicks away. -- Llywrch 04:45, 22 August 2011 (UTC)[reply]
What? How does that make sense? So people aren't able to get pictures about, say, the herpes virus without being subjected to genitalia covered with it? People aren't able to look up some Greek mythological figure without seeing some representation of the individual having graphic sex? There are plenty of historical or "cultural" articles that have pictures containing nudity that are just not safe for work or school. Ottava Rima (talk) 16:39, 22 August 2011 (UTC)[reply]
This is the butcher shop, the factory floor, where things are made. Deciding what you want to cut out is up to you - and when I say "up to you", I don't mean that you should expect us to perfectly categorize every image that might offend you in some impartial way, because that's just not doable. Wnt 19:05, 23 August 2011 (UTC)[reply]

What we have here

is a failure to communicate and yet another complete failure of Wikimedia/Wikipedia governance.

And I'm not even talking about the "referendum" itself, but about a failure in a lot of everyday communication on Wikipedia. Instead of throwing a massively intrusive measure like an image filter at the supposed problem, we should at least try to look at the real problem, which plays out on Wikipedia talk pages all the time. Sometimes it works out, sometimes it doesn't.

Consensus is consensus. If you don't like an image in an article, take your concern to the talk page. If others provide overriding reasonings for why a particular image holds explanatory value for the article, that's that. Get over it. But deal with it locally, and deal with it yourself.

Take the example of arachnophobia. On the English Wikipedia, editors at one point agreed to remove actual spider images from the article, not because "people are offended/irritated" but because including actual spider images defies the purpose of the article, rendering it useless to arachnophobics, who, needless to say, are the most likely to look it up on Wikipedia in the first place. On the English Wikipedia, this overriding rationale to replace the photos with non-phobia-triggering drawings prevailed in large part because enough people showed good judgment and didn't keep senselessly pointing to "WP:NOTCENSORED".

On the German Wikipedia by contrast, the situation is different. On the German arachnophobia article, those images are still there, because self-appointed anti-censorship watchdogs keep restoring those images and ignore or shout down anyone raising the point at the talk page. The same reasoning (that those photos defy the purpose of the article) does apply just as much on the German article, but it keeps getting ignored and shouted down. And that is the real problem. When true, discussion-based consensus doesn't prevail against a group of "established" "editors".

The answer to that real problem is of course not an image filter. The answer is better governance. Editors, and admins in particular should look at article histories, go to the talk pages and read the presented reasonings from all sides. Most of the time, a clear judgment is possible as to whether the inclusion of a particular image or type of image does or does not ultimately make sense in the respective article. That then is the consensus, whether or not it takes an RfC or RfAr to get people to accept it.

If we only had strong enough governance so that Wikipedia could be at least somewhat more discussion- and consensus-based and not the pure shoutocracy it has turned into, there would be no need for silly things like an image filter.

Some might point out that many people who want images filtered are only readers. But the answer we give to those people cannot seriously be to essentially say, as a project and community, "yeah, we are also not sure about those images, some want them and some don't, and we as an encyclopedic project are not able to make up our collective mind". What a declaration of bankruptcy! It's basically unconditional surrender to the exact kind of problem we implicitly claim we are here to solve when we call ourselves an encyclopedic project. --87.78.45.196 00:55, 21 August 2011 (UTC)[reply]

Your example is stupid - in both Wikipedias the rules are the same. In the EN Wikipedia a certain group of editors got their way; in the DE Wikipedia the majority was different. But this new "feature" would ignore the discussion about single articles and the appropriate image to illustrate each article, and will allow certain users in the back rooms of Commons to destroy hundreds of articles by censoring images. --Bahnmoeller 15:33, 21 August 2011 (UTC)[reply]

"stupid". Oh well, I'm no stranger to putting a conversation ender near the beginning of my own posts, so I guess I can't complain. Anyway, it may not be the most suitable example, but it's the one I could immediately think of where a well-defined group of people (arachnophobics) strongly favor the omission of a certain type of images (photos of spiders) -- and where it makes sense to omit those images based on some people's objection.
In the EN wikipedia a certain group of editors has got their way -- No, in the English Arachnophobia article, reason prevailed. In the German article, reason was single-mindedly ignored by a bunch of editors who are a tad paranoid about "censorship".
But this new "feature" would ignore the discussion about single articles -- Yes, that is exactly my point, and one of the reasons I think the image filter is a uniquely bad idea. --87.79.214.168 20:40, 21 August 2011 (UTC)[reply]
It's funny, I take a different approach. I think the spider images shouldn't have been taken out of EN, and one of the benefits of this filter will be that we can make purely editorial decisions without having to stress over the emotions of our readers. Put "the best" spider image at the top, but let readers who are upset by it choose to hide it, so that they instead see a shuttered image. This will remind them that they are missing out on information because of their preference, but it won't be so brutal as to shove a spider in their face against their will either. --AlecMeta 18:20, 22 August 2011 (UTC)[reply]
No, you are wrong. Spider images serve no encyclopedic purpose on that particular article. The en.wp editors were correct in removing the image. The German Wikipedia is generally more authoritative and normative, and therefore the image was kept there much more to prove a point regarding "censorship" than because of any actual editorial considerations. --195.14.220.250 18:28, 22 August 2011 (UTC)[reply]
Well, I don't mean to single out and judge that specific case; I wasn't part of the discussion, and I don't mean to say it was 'wrong'. And we shouldn't be troubled that two different groups of people reached different conclusions about a content decision-- that's the way it always works. I accepted the premise that we removed encyclopedic pictures because they were causing negative emotions-- if I didn't accept that premise, it wouldn't be a good example/metaphor. If it was just a boring, run-of-the-mill decision, then just consider a different example. :) --AlecMeta 14:27, 23 August 2011 (UTC)[reply]
Looking at the two, I think that the German version of the article is much more informative in general. The photo removed from the English version is not the same as the photo used in the German version - the English version just had a random spider, which might or might not be particularly disturbing; the German version shows an arachnophobic person directly confronting a massive spider by holding it in his hand as therapy. Now if confrontation of fear of a spider is good therapy, perhaps confrontation of a fear of an image of a spider is also good therapy.
But what's most important to remember is that Wikipedia articles are not for consumers only. I see the wrong attitude all the time - people think coverage of a company should only include things of use to people who want to buy their products, or coverage of medical issues should only include the perspective of the patient. Not at all. We should be providing useful information for the investor in the company, the competitor to the company, and all those who lazily daydream about whether in a year or five they might become so. And our articles about medical conditions should give information that is informative to scientists, doctors, med students, pre-meds, and kids taking biology class and thinking about their major. That means yes, we show a picture demonstrating the reality that with confrontation therapy, you can somehow talk an arachnophobe into handling a giant spider I wouldn't touch because I'd be worried it could somehow get its little hairs embedded in the corneas of my eyes or just plain bite me. We cover the data, all the data, for all the people. We are not training consumers dependent on an expert elite, we are training the people of the world to be the expert elite. Wnt 18:45, 23 August 2011 (UTC)[reply]

Are we addressing the right problem?

Much of this debate appears to be emotional, or about values that clearly vary from person to person. Could someone at the Wikimedia Foundation (or in some independent and trustworthy organization) provide basic, 'objective', neutral information about the following simple questions?

- Who is actually asking or pushing for this 'Personal Image Filter' in the first place, and why is this issue coming up now, since Wikipedia and sister projects have been working well for years without it?

- As of this writing, there are 3,715,536 articles in the English version of Wikipedia. What fraction of these pages (or of other Wikimedia projects) could possibly be considered controversial (for obvious categories, such as explicit sexual references, extreme violence, etc.), even roughly speaking?

- What clear, quantitative, incontrovertible evidence is there that governments (or 'authorities') are actually controlling or restricting their populations' access to Wikipedia and associated projects?

- Why are the current practices of 'Neutral Point of View' and 'Least Astonishment' not sufficient for Contributors and Editors to include only materials that are appropriate to the context of their article in an encyclopedia, book, definition, etc? In other words, why is this a 'Viewer' issue and not simply a 'Contributor' issue, or possibly a quality control issue?

Since there are millions of public sites and pages dedicated to potentially offensive subjects, I would not expect curious people (or children, or random visitors) to search Wikipedia for these materials, nor for them to find much offensive content (compared to the bulk of the web outside Wikimedia). To the extent our goal is to generate open access reference and educational materials, are we not losing a lot of time and energy in trying to address a side issue?

Michel M Verstraete 22:35, 21 August 2011 (UTC).[reply]

Great questions, thanks Michel. And thanks to AlecMeta for giving very appropriate answers for them. SJ talk | translate   02:51, 6 September 2011 (UTC)[reply]
Michel M Verstraete raises lots of great questions. I'm not with the foundation in any way, but let me try to answer those I can:
"Who is actually asking or pushing for this 'Personal Image Filter' in the first place" -- The only people I know who really really want it are Muslim editors of the English-language article on Muhammad. We can't be NPOV without images of him, but there's something very sick and wrong about making old people who barely speak English learn about computers or multiculturalism before they can even happily read the article on their favorite subject. I wish they already had a browser that they could happily use, I wish they had learned multiculturalism before they came here. But the image issue takes what should be a positive experience and turns it into a negative one. People on talk can sometimes be very nice to the people who are upset, but occasionally the very worst of our commmunity replies, creating what could be 'intensely memorable, intensely negative experience". I've talked a lot with that group, and it's important to me that we find a way to 'have our cake and eat it too'-- we know it upsets them, we know how to fix it, and we know how to do it while still "modeling our values" too. So, even though I was one of the biggest opponents of the last bout of censorsip, I think at this point you could reasonably say I'm one of the people 'pushing for this', although I myself would never use it, I do want our low-english-literacy Muslim readers to be able to have access to it. And of course, I want it for people in their same situation who I just haven't personally interacted with, even though it was my experience with that particular group of readers that convinced me we should do something minimal to help them avoid offense.
why is this issue coming up now, since Wikipedia and sister projects have been working well for years without it? -- I have a GREAT answer to this. The only problem is that it's an answer I made up that doesn't reflect the board's point of view. But my answer is: We need it now because we're moving beyond Wikipedia now. This filter wasn't essential to Wikipedias because Wikipedias operate exclusively on NPOV. But the encyclopedia is only the very beginning of a library, and usually it's one of the least controversial volumes. Our future volumes may require us to be far more offensive and controversial than we had to be just to beat Britannica. In the future, we're not just trying to beat Britannica, we're trying to beat all books. --AlecMeta 17:15, 22 August 2011 (UTC)[reply]
The only people I know who really really want it are Muslim editors of the English-language article on Muhammad -- Ah? I was wondering about that. I thought the main problem for Muslims was not primarily looking at depictions of Muhammad, but much more fundamentally the fact that there are depictions of Muhammad. --195.14.220.250 17:19, 22 August 2011 (UTC)[reply]
Well, there's plenty of anger about the mere existence of the images, but our hands are totally tied there. Some people want deletion, but we can't give those people what they want, and we can't even seriously consider it, in my estimation. If someone's only problem is that we're giving people content that they want, then they, in fact, have no legitimate problem with us-- sharing information is what we do and what we are.
But a much larger group, and a lot of the raw emotions, come not from abstractly knowing about the images, but from personally encountering them unexpectedly.
When people come and say "this image upsets virtually all of the people around me, so I don't want to show it on my computer screen unless I actually click on it, so I won't upset the people around me"-- we can, in fact, help our readers with that much smaller problem. And we can do it while staying 100% true to ourselves.
I was raised by TV-- I have a very difficult time imagining what it's like that total strangers could have so much power over your emotions just by showing you a single image that wasn't intended to offend. But the fact of the matter is we're writing for Earth-- all of Earth, and believe it or not, we do, in fact, have unbelievable power over some of our global readers. We can upset or enrage them just by writing a good article. Or we can not upset them, simply by using this feature. And we don't actually want to intentionally upset people over and over and over if something this simple would really solve their problem.
There will be lots of people who aren't happy with this system because they'll know the images 'exist'. But some people will be satisfied. Even if they're just a tiny minority, that's okay- it's kind of a tiny feature. --AlecMeta 17:32, 22 August 2011 (UTC)[reply]
Maybe I don't quite understand this type of self-censorship. To me, these people are rather like little children who "hide" by closing their eyes. And more importantly, it's weird to help people do that to themselves. It implies that sticking your head in the sand is healthy and good. Also reminds me a bit of the Formerly Obese Man from the Onion Movie.
I for one believe that maybe the core idea of encyclopedias is that, while everyone is entitled to their own opinion, nobody is entitled to their own reality. When will we introduce an option to view the Conservapedia version of an article instead of the Wikipedia version? --195.14.220.250 17:47, 22 August 2011 (UTC)[reply]
Maybe I don't quite understand this type of self-censorship. To me, these people are rather like little children who "hide" by closing their eyes. And more importantly, it's weird to help people do that to themselves. It implies that sticking your head in the sand is healthy and good.
I hear you. I do not understand it either. To complicate matters, I don't know that young Wikimedians from Muslim nations fully understand it either. We have the same thing in all cultures-- I'm American, and my grandfather used to get quietly furious if anyone dared to enter his home wearing a hat-- I have no earthly clue how he got the idea that this was offensive, and I have no idea why he could never understand that young people meant no offense. But ya know-- he got upset every time it happened, and eventually, even though I didn't understand, I started asking my cousins to remove their hats before visiting him, just to not upset him. If someone from another culture asked me to try to 'explain' it, I would be completely at a loss-- I don't understand it in the slightest. But if you convince me something is upsetting you and I can easily fix it-- shouldn't I?
While everyone is entitled to their own opinion, nobody is entitled to their own reality
You say in a phrase what took me a paragraph to ask above. Is nobody entitled to their own reality, or is everybody entitled to their own reality? To put it another way, are we, as a movement, ready to expand our scope beyond "reality-based information" to "opinion-based information"? The "Information Filter Project" is the last place I'd have chosen to start experimenting with "opinion-based information"-- but it's still educational, it's still "in scope", people do want it, and we are the only ones in a position to give it to them. --AlecMeta 15:37, 23 August 2011 (UTC)[reply]
Why are the current practices of 'Neutral Point of View' and 'Least Astonishment' not sufficient - again, this is just my answer. Where we're going, there won't always be a single NPOV. Most books in a library aren't written neutrally. Sometimes art is intentionally astonishing. "A Clockwork Orange" is a hard-to-watch film because it is so upsetting, and yet it educates about the horror of violence in a way that few other films do. Tomorrow's "A Clockwork Orange" may be released under Creative Commons and it may be hosted on Commons-- we want to buy more intellectual freedom for our contributors, and a filter will help buy it. --AlecMeta 17:15, 22 August 2011 (UTC)[reply]
It won't. That's a fact that the American Library Association accepted a long time ago. [1] --Niabot 17:51, 22 August 2011 (UTC)[reply]
Well, see, let's think about that for a second. The ALA dramatically opposes a library endorsing a rating system-- the library tries to be neutral. So consider w:Index Librorum Prohibitorum, a list of books that were prohibited by the Catholic Church; the last edition was published in 1948. FORCING library patrons to use it would be horrendous. But it would also be bad for a university library to refuse to let its patrons consult the w:Index Librorum Prohibitorum or access it.
We can have a whole section of our library that is "Lists of prohibited books", so long as we never have one special default list. Indeed, knowing what's objectionable is just another kind of knowledge, as long as it's not imposed-- with that knowledge, we could even have our own "banned images week" to promote high-quality images that have been objected to. --AlecMeta 14:38, 23 August 2011 (UTC)[reply]
Once it exists, we shouldn't censor it; not even the Index Librorum Prohibitorum. At the same time, I do not think we should generate a new and updated Index Librorum Prohibitorum, as it is not in line with our own sense of morality to do so. --Kim Bruning 20:18, 23 August 2011 (UTC)[reply]
But that information set already does exist-- in the collective minds of all our readers. They want to 'upload' that data to us. It's the data of who hates who and who hates what-- it's ugly data, data that is very, VERY different from Wikipedia data. It's the data of primates and how images affect their emotions. The data could empower evil third parties, but it could just as easily empower benevolent third parties. It's hard to get people to admit prejudices-- this dataset will show the entire planet's biases and prejudices, live and in living color for all to see. If we let people collect that data here, the world will learn something very valuable from it-- IF we do it right. --AlecMeta 20:42, 23 August 2011 (UTC)[reply]
I think a library would happily collect the Index Librorum Prohibitorum, but I doubt they would require the books it names to carry labels so identifying them, or integrate the Index as a filter into their search catalogue.--Trystan 23:54, 23 August 2011 (UTC)[reply]
Trystan is right. Additionally, a public library will not list either the MPAA's film ratings or the Entertainment Software Rating Board's ratings for games in their catalog system, and they will not stop children from checking out "inappropriate" materials, whether those be books like Joy of Sex, R-rated movies, or M-rated video games. That would be imposing (or endorsing or enforcing) someone's idea of what's acceptable for a particular type of reader on all of their readers.
But they will cheerfully list Hustler magazine under "Pornography -- Periodicals" and Stephen King's writing under "Horror fiction", and they will not expurgate the film and game ratings from the materials themselves, even though individuals choose to use those labels and ratings to avoid material that they do not want to see. WhatamIdoing 18:58, 24 August 2011 (UTC)[reply]
This is an excellent point, Trystan et al. The only thing I can say is that a library is a physical place, usually with shelved content. Our content can be viewed everywhere, often in very public places, and maybe that creates a special need for our patrons that traditional libraries don't have. But there's no escaping the conclusion you all reach-- this is a step away from our 'purest ideology' of Wikimedia's purpose. Sometimes ideology must give way to flexibility and strategy, and this may be one of those times. --AlecMeta 21:26, 24 August 2011 (UTC)[reply]
The problem developed because Wikipedia has an absurd amount of pornography, and that drew a lot of complaints, especially given our high number of child users (and it being illegal to allow them to see such images). The surveys showed that there were over 1000 white penises but only 1 black penis, negating any evidence that there was truly an "encyclopedic" reason for them. However, a vocal minority of users caused an uproar when the porn started to be trimmed down. It was suggested that we could create a filter system so that some people could have the porn and others could ignore it. It was a compromise. Ottava Rima (talk) 23:48, 21 August 2011 (UTC)[reply]
Can you give any sources for your claims that:
  • "... Wikipedia has an absurd amount of pornography and that drew a lot of complaints, ..."
  • "... there were over 1000 white penises but only 1 black penis, ..."
  • "... a handful of vocal but a minority of users caused an uproar when the porn started to be trimmed down."
  • "It was a compromise."
Otherwise I will have to ignore your opinion entirely, since it seems way overextended and one-sided. --Niabot 00:49, 22 August 2011 (UTC)[reply]
The statistics were all in 3 reports. I assumed that you bothered to read them before making such responses, being uncivil, and vandalising here. I guess I assumed too much. Ottava Rima (talk) 01:25, 22 August 2011 (UTC)[reply]
Give me a link to those reports and cite the relevant parts that weren't written by a good friend of Sue or by his daughter. PS: I'm curious why you got banned indefinitely from EN and now proudly claim that others are uncivil or vandals. --Niabot 01:35, 22 August 2011 (UTC)[reply]
"Give me a link to that reports and cite the related part of it" No. It is on the article portion of the page. You have proved that you are here inappropriately because you failed to actually look and see the basis of the filter and what the filter is supposed to do. Merely jumping in here and making all sorts of ignorant comments is not appropriate behavior. And Niabot, there are multiple websites right now watching this discussion and making it very clear that you have a major track record of inappropriate behavior. Unlike you, I wrote articles that were so respected that a Foundation Board member and an Arbitrator proxied them onto Wikipedia during my ban, and the articles were on some of the most important topics [2], [3], and [4]. You've merely put together a handful of stub like articles on obscure anime or uploaded lots of hentai that is dubiously within CC-BY standards. Your block log [5] is quite impressive. It seems rather clear that you want to force as many people to look at as much porn as possible for whatever reason. Ottava Rima (talk) 16:46, 22 August 2011 (UTC)[reply]
(1) Does file:Human vulva with visible vaginal opening.jpg qualify as "porn"?
(2) Who is "forcing" anybody to use Wikipedia? --195.14.220.250 16:53, 22 August 2011 (UTC)[reply]
Nice insults. I didn't expect anything better from you. A "handful" is now comparable to dozens. Hentai is dubiously within CC-BY standards? I laughed again, a little. I was effectively blocked two times on the German Wikipedia; all other blocks were removed soon afterwards because they were false. Not a single one was related to a sexual topic. To me it seems very clear that you blatantly insult others with false claims and that you would do anything to harm freedom or the project. That's my opinion and my last word. I don't see the need for any further discussion based upon lies. --Niabot 17:46, 22 August 2011 (UTC)[reply]
1. There were no insults, so don't pretend otherwise. 2. The hentai is from a dubious source and randomly pulled from the internet. 3. And not being blocked for a sexual topic doesn't mean you don't deserve to be. We are a project with children, and your interacting with them while having explicit images on your user page at Commons is a serious behavioral problem. Ottava Rima (talk) 02:44, 23 August 2011 (UTC)[reply]
What about some research? Look here: Category:Intact human penis. How many are black?
Whether the amount of porn on Commons is "absurd" I can't decide. But the number of white men who like to show off their dicks clearly is :-) Adornix 15:29, 22 August 2011 (UTC)[reply]
I can recall deletion requests against mostly black penises. But for 1000 to 1 we would need at least 1000 pictures inside this category. ;-) --Niabot 16:29, 22 August 2011 (UTC)[reply]
the number of white men who like to show off their dicks clearly is [absurd] -- That is correct. But why are we trying to superficially address this problem with an image filter rather than showing these people the door and deleting the educationally meritless images? --195.14.220.250 17:05, 22 August 2011 (UTC)[reply]
    • About some so-called facts:
  • "... Wikipedia has an absurd amount of pornography and that drew a lot of complaints, ..."
"absurd" is a matter of one's own standard. For that matter, so is "pornography" All in all, as compared to what's in the online world in general, I've seen very little here--but, then, I don't go looking for it. And the complaints represent what proportion of the readership? I suppose between one in ten thousand and one in a hundred thousand, Unless, like the group preparing the report, you go out and try to evoke them.
  • "... there were over 1000 white penises but only 1 black penis, ..."
If so, it is a problem. The solution is to add more racial diversity--just as usual with cultural bias.
  • "... a handful of vocal but a minority of users caused an uproar when the porn started to be trimmed down."
"handful" is also a matter of one's own standard. To me, they include a very high proportion of the people here whose work I respect. Considering that the original trimming was the heavy -handed work of a single editor, I'd expect no less than an uproar. As I recall, most of the images deleted by that editor at that time were restored.
  • "It was a compromise."
That at least is apparently true. But there are some things we do not compromise about. The most important is NPOV, and the fact that the encyclopedia is free of the selection and classification of any band of expert or presumed-representative editors is the basis of our NPOV. The WMF does not own the content. It doesn't even own the software. All it owns is a small amount of hardware and an extremely valuable trademark. Of all sorts of intellectual property, trademarks are the sort of thing most vulnerable to compromise and dilution. Compromise NPOV, and you lower the value of the trademark, because that's the principal thing it stands for. DGG 04:42, 23 August 2011 (UTC)[reply]
Neutral point of view does not require violating the law by providing those under 18 in the United States with pornographic imagery. The WMF is legally responsible for the content regardless of "ownership" because it would qualify as a distributor. Pornography is also legally defined, and the standard has been upheld by the Supreme Court. You know all of this, so I am surprised that you would say the things you said above. Ottava Rima (talk) 05:39, 23 August 2011 (UTC)[reply]
As a side question, whose work are you claiming that you respect? I have spent time looking into who posted, and the ones who oppose the referendum either don't edit en.wikipedia or haven't produced much content. I find it odd how you can respect them, when many of them just upload graphic sexual images and do little beyond that. It is also odd that you are saying that the problem of "1000 white penises" means we should have more non-white penises, when it was obvious that the real problem was that it had nothing to do with being encyclopedic, but with a small, homogeneous group uploading images simply to put them on their user pages in an exhibitionist manner. The worst part is that we allow these people to interact with those under 18. If the WMF were being responsible, every single one of them would be indeffed under a child protection standard. Ottava Rima (talk) 05:42, 23 August 2011 (UTC)[reply]
This is apparently not well known, but: NPOV is a policy voluntarily adopted by some, but not all, of the WMF projects. It is not a mandatory policy imposed from on high. In particular, Commons does not have an NPOV policy. Instead, they have a page explicitly saying that the policies of the English Wikipedia do not apply to the whole world, and that they don't follow the NPOV approach. WhatamIdoing 18:11, 24 August 2011 (UTC)[reply]

Interim vote totals[edit]

Just for the sake of information, there have now been over 20,000 votes registered in this plebiscite, which makes it the highest degree of participation in any Wikimedia poll/vote/referendum ever. There is still a week to go. Risker 03:10, 23 August 2011 (UTC)[reply]

But the question is: is the software sophisticated enough to determine how many of those are done from the same IP? And with the qualification only being "10 edits", how many of those are from accounts with only 10 edits, or under, say, 100? Ottava Rima (talk) 03:37, 23 August 2011 (UTC)[reply]
In answer to your first question, yes, vote administrators have access to IP data. As to the second point, that is not a key criterion that will be reported by the vote administration process, although I am sure someone with API access can do the calculations if they are motivated enough to run through all 20K+ votes. I'd suggest they wait until the polls close, though; we have no idea how many more votes are yet to come. Risker 03:47, 23 August 2011 (UTC)[reply]
Well, I have a suspicion that the loose restrictions are a reason why there have been over 20,000 votes so far. I would be willing to bet that at least 50% of the votes were cast by those with fewer than 100 edits. I hope someone who knows scripts will be able to parse through the info, especially if the WMF is going to be relying on it later. Ottava Rima (talk) 04:13, 23 August 2011 (UTC)[reply]
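(An aside on feasibility: once a list of voter names is available, such a tally is easy to script against the public MediaWiki API. Below is a minimal sketch in Python; the file name voters.txt and the 100-edit threshold are hypothetical, the actual voter roll is visible only to vote administrators, and this counts edits on Meta only, not on a voter's home wiki.)

# Sketch: bucket voter accounts by edit count via the MediaWiki API.
# voters.txt is hypothetical -- the real voter roll is not public.
import json
import urllib.parse
import urllib.request

API = "https://meta.wikimedia.org/w/api.php"

def edit_counts(usernames):
    counts = {}
    for i in range(0, len(usernames), 50):  # the API accepts up to 50 names per request
        query = urllib.parse.urlencode({
            "action": "query",
            "list": "users",
            "ususers": "|".join(usernames[i:i + 50]),
            "usprop": "editcount",
            "format": "json",
        })
        with urllib.request.urlopen(f"{API}?{query}") as resp:
            data = json.load(resp)
        for user in data["query"]["users"]:
            counts[user["name"]] = user.get("editcount", 0)  # missing users count as 0
    return counts

with open("voters.txt") as f:
    voters = [line.strip() for line in f if line.strip()]

counts = edit_counts(voters)
under_100 = sum(1 for c in counts.values() if c < 100)
print(f"{under_100} of {len(counts)} voters have fewer than 100 edits on Meta")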
Actually, the huge leap in the number of votes is most likely related to editors with email enabled actually reading up on the issue and expressing their opinion. The objective here is to encourage many opinions, hence the low threshold for eligibility. This tool is designed with *readers* in mind, so the edit count is less relevant for this plebiscite than would be the case for election of Trustees, for example; many people read one of the Wikimedia projects regularly without making many contributions, for a multitude of reasons. Risker 04:22, 23 August 2011 (UTC)[reply]
Polls only work when they attain a level of objectivity and negate the ability to game them. Giving those like sock masters the power to decide this matter isn't what I would see as a good thing. By raising the standard, we negate the inappropriate influence of such individuals by forcing them to perform far more work at pretending to be legitimate than they would most likely be interested in performing. I'd rather not see those like Grawp be the primary deciders of what the editors of Wikipedia feel. Ottava Rima (talk) 04:31, 23 August 2011 (UTC)[reply]
Yeah, whatever the outcome, you'll claim victory. We get it. Also, what the editors of Wikipedia feel is that you should be banned there, which you are. Just saying. --78.35.232.131 04:34, 23 August 2011 (UTC)[reply]
I find it interesting that you assume that the sock masters would all be voting against the proposal. That is quite telling, especially with your logged out editing above. Ottava Rima (talk) 05:43, 23 August 2011 (UTC)[reply]
LOL, that is what you are clearly assuming, not me or anyone else. Your projection is quite telling. Unfortunately for you, the one person it tells nothing is you. --213.196.209.190 13:08, 23 August 2011 (UTC)[reply]
You just switched to yet another IP, which supports my statement that a small handful of people are using sock puppets and changing IPs to make themselves seem far more numerous than they actually are. And I find it interesting that you say "projection", as if I would be editing logged out or socking, which is laughable. Ottava Rima (talk) 14:07, 23 August 2011 (UTC)[reply]
Not happening, and not that important if it is. IP users are first-class citizens here; let them be, or try to change their minds-- don't overly fixate on the signatures. Each comment comes from a human mind-- discuss the issues raised or discuss the concerns in the abstract, but singling out specific comments and using their login status to dismiss their arguments is... counterproductive. --AlecMeta 15:26, 23 August 2011 (UTC)[reply]
Generally the sock puppet cases I've heard of are about a few accounts, or at most a couple of dozen. If "sock masters" can sway a poll with 20,000 votes and specific eligibility rules, we have a pretty big problem! Of course any such allegation should be investigated - if there's any evidence for it. Wnt 18:24, 23 August 2011 (UTC)[reply]
There are actually tens of thousands of known sock puppets. There are programs that people have used to make such things. Go look at the global lock logs and see how many names are oversighted to appreciate the magnitude. Ottava Rima (talk) 19:42, 23 August 2011 (UTC)[reply]
Actually, IP users don't matter, especially here. They shouldn't even have access to this talk page. And saying that a person is valid because they are human is ridiculous and doesn't serve any purpose. Ottava Rima (talk) 19:42, 23 August 2011 (UTC)[reply]
saying that a person is valid because they are human is ridiculous and doesn't serve any purpose. -- Seek professional help. --213.196.212.168 20:44, 23 August 2011 (UTC)[reply]
saying that a person is valid because they are human is ridiculous and doesn't serve any purpose. If you say my remarks served no purpose, your very words prove your statement correct. For my part, I want to hear as many different sincere opinions as I can get, regardless of author. --AlecMeta 21:35, 23 August 2011 (UTC)[reply]
there have now been over 20,000 votes registered in this plebiscite, which makes it the highest degree of participation in any Wikimedia poll/vote/referendum ever.
This is very, very exciting, and it's going to make it a lot, lot easier to understand this issue. THIS is why we had a referendum-- this is the largest feedback we've ever had about any issue, ever. The questions are confusing, but that's okay. Referendums work at getting massive feedback from readers and editors across cultures. They are a new tool the board has just started using, and despite all the handwringing that the numbers will be misused, I think it's a GREAT step in the right direction! Regardless of what the outcome is, just getting this many people to take part in a governance decision is great. --AlecMeta 15:15, 23 August 2011 (UTC)[reply]
I could gladly agree if I didn't have some bad feelings about this first example.
  1. No one asked for a "no" or "yes". Instead the questioning is in the form: "Is it important for you to get punished today? OK, we have to admit it is not important for you. But it is important for us. Please be prepared to receive your punishment."
  2. Because of the first point, the name of the vote was completely wrong. Just a little mistake? At least the media call it a "referendum", without a second thought, as usual.
  3. All the "fair" argumentation at the voting page is based on the "Harris Report". Did no one asked the question: What qualifies him and his family (daughter) to make such an decision? He doesn't seam to have any expertise regarding the questions of the report. The only qualification i could find is the following: Sue and Harris worked for the same company. That's it.
  4. Why did the WMF choose some "random" man and his family to write this report? Why didn't they ask the experts from the American Library Association? We already know that the two would have very contradictory opinions regarding this topic.
What's left at the end makes me sick. Either the WMF is really that stupid, or the WMF is willingly working against the initial goals of the project. I would consider us lucky if they are just that stupid. --Niabot 19:39, 23 August 2011 (UTC)[reply]
For the record, Niabot, I'm really getting a lot out of this dialogue. A month ago I thought I was THE most anti-censorship person around, and it's very reassuring to find that my basic values are more mainstream than I realized.
The lack of a clearer "Hell No" option in the survey was a mistake. I think if the board had realized how successful this survey-discussion process would be, they probably would have done things differently-- the lack of a "hell no" option makes the survey less powerful than it could have been. But, they can't tinker with the wording once things are underway-- too many languages.
I agree that the general thinking is that we need "something", and that thinking colored the whole referendum process. But that's okay-- 20,000 people now know about this issue and they have expressed an opinion on it. Regardless of numbers, if they feel "Hell No" about it, they'll get that message to us.
During the last board elections, it seemed like nobody was even interested in the global movement. This process is MUCH more successful, albeit imperfect.
Lastly, remembering again that I am absolutely nobody in authority, I think we are indeed 'expanding beyond' our initial goals and our initial vision. We won. We're the most useful reference source on the planet. We won beyond our wildest imagination. And going forward, we have to move beyond our humble initial vision while retaining the core values that make us work. (And I absolutely believe that intellectual freedom is one of those values, so this discussion is good to see. This SHOULD be a hard thing to do right, this SHOULD be controversial, everyone SHOULD have lots of questions and skepticism.)
"Why didn't they ask the ALA experts?" What makes you think they didn't? Off the top of my head, two of our board members spend their days working with/for libraries. I know our board's advisers include some very good ALA experts. I'm not an ALA "expert", but my heart is, if that makes sense. All our opinions helped shape the Harris report. We want all the ALA experts we can get, and I believe we're actively recruiting such individuals whenever we get the opportunity. --AlecMeta 20:03, 23 August 2011 (UTC)[reply]
I didn't read anything saying that the ALA was actually involved (that would be quite an argument). In fact, it would run straight against their basic principles, the Library Bill of Rights, to include a non-neutral labeling system.[6] So I'm sure they would never have given their support to creating such a filter, which labels content in arbitrary, non-neutral categories.
I can only see a report written by a "non-expert" (regarding this topic) and his family. Was that truly the only valid source, the only report? If so, then it's a real joke. Idiocy, if you will. --Niabot 01:07, 24 August 2011 (UTC)[reply]
Confirming Alec's guess above: the ALA was consulted, as were other library associations. The notion of intellectual freedom was referred to often during the development of the report. See for instance Talk:2010 Wikimedia Study of Controversial Content/Archive 2#The Librarians's Perspective. None of the people consulted during the report were asked to 'give support' to the result. I agree with the sentiment that grouping knowledge into arbitrary non-neutral categories is not compatible with general library practice. Recommendation 7 was the most controversial of the study's results for this reason. Arguments were made that since the categories were descriptive and already existed, and would not be used to restrict anyone's access (only to make it easy for users to choose to delay their own access to information), this is rather different from the specific concerns raised in the Library Bill of Rights, the interview above, and similar documents. But the problem remains that choosing to highlight a set of categories as worthy of special concern is in itself proscriptive-- which is why the idea of arbitrary categories didn't make it into the Board resolution. SJ talk | translate   03:34, 6 September 2011 (UTC)[reply]
I read the answers to the questions which are closely related to the current issue. None of the questions actually hits the nail on the head, but they give a good indication. For example:
4. Are there various levels of display and accessibility of materials in most libraries?
I suppose in the sense that in public libraries and school libraries materials are arranged according to age groups: hence the picture books in the children’s areas, the teen age books (Young Adult – YA) in the teen (or YA) areas, and so on. But this is (or should be) more about reading level rather than sensitivities. (Underlined by myself)
5. And are these decisions about accessibility made on the basis of potential cultural sensitivities or sensitivities around sexuality (I'm talking in general here, not a policy about kids). In other words, are there certain books that are restricted in some way because of their subject matter?
Ideally, no – this should not be happening; ...
6. Do libraries have a policy about demarcating volumes within the library itself -- in other words, some sort of system that identifies certain volumes as potentially objectionable, or safe?
No – this practice is definitely frowned on and should not be happening. This is counter to our principles, philosophies and practices.
Since some news articles have already covered the filter under the name of youth protection (in German, "Jugendschutz"), we quickly see a huge gap between "sorting material for the reader" for better accessibility and "labeling content by sensitivities", even though the proposed system would not be able "to protect the youth".
This continues throughout all the following questions and comes down to a very simple point: the rules for labeling content. In other words, the ALA did not agree with the filter at all. So I may ask: how did it come about that these statements were ignored entirely throughout the further process?
A very basic question is: is categorizing images/content into categories meant to appease sensitivities in any way compatible with NPOV? -- Niabot 22:17, 6 September 2011 (UTC)[reply]

What info are we missing? Potential ways to move discussion forward[edit]

The voting part is a huge success. I hear a lot of skeptical voices here on talk, though, and while I can try to assuage most of their concerns, I'm not knowledgeable or authoritative enough to address all the points. To brainstorm:

  • What are our best minds picking between in their current thinking? (understanding of course they're always open-minded to new info and new consensus)
  • As realistically as possible, can we 'label' the options under serious consideration and provide as much info as possible about them?
    • In particular, provide some very vague characterization of the expected resource expenditure for each option. This doesn't have to be a dollar or time figure, just some sense of scale. For example, I think a bare-minimum filter's resource needs would likely be "extremely minimal" relative to our budget, and just giving a sense of scale in some semi-official capacity might help.
  • People like knowing who stands where and why. It humanizes the board and breaks up the idea of a mythical monolithic "they" who have nefariously planned out the future in advance. When people don't see the disagreement, they're scared that there is no disagreement and the democracy is a sham. Let everyone-- board members or advisers, anyone even semi-'official' who wants to-- just write a summary of their thinking on the subject, without necessarily committing to that thinking or promoting it; just sorta 'show' the thinking. Even 'undecided' or 'in the middle' or 'not focused on this particular sub-issue' would help people learn they're interacting with good-faith humans, not puppets or cult members. :)
    • As always, some people have done this on talk, and they do it well. I just want to encourage others to do it more, especially others who can represent other viewpoints or different thinking on the subject. If we ever evolved to the point where every single board member released a statement, I think that'd be great for banishing the 'evil conspiracy board' myth-- we just don't want anyone to feel 'pressured' into sharing their thinking. Private thoughts are okay too. :)
    • As silly as it is, just seeing that the thinking doesn't involve anyone "out to rid the world of evil images" might be helpful. I know that's comically obvious to longtime Meta people, but project people don't know this, and they still worry that religious or culturally conservative Americans are co-opting leadership.
  • Consider designating some 'advocates' who will explicitly argue for a given option. Board members probably wouldn't be good choices for the role, but former board members, advisers, or just very active inner-circle Wikimedians would all do fine. This would be another very experimental thing, so think it through-- but again, highlighting a very specific public debate, personified by a team of advocates, a debate where the public can be a spectator or become a debater-- that might help to focus discussion. (Or it might make things worse... think it over.)

I'm agnostic on whether these are needed or appropriate for this particular issue and this particular discussion-- but if not this issue, there will be more in the future-- labeling and explaining prototype options while encouraging alternatives, writing simple honest statements reflecting the current thinking of open minds, and also showing diversity of thinking in leadership-- those should all help to focus future discussions on the feedback we most need. --AlecMeta 16:39, 23 August 2011 (UTC)[reply]

I think it would be a very good idea to create some alternative implementation proposals (perhaps on a new subpage?). With this much discussion, it can be hard to mentally capture and evaluate the issues being thrown around. If we start a new subpage with various options under each heading, it could help. For example, we could have headings for:
  1. Repeal the board resolution. Could list arguments for doing so and reasons why it is unlikely to happen.
  2. Implement a personal image filter that does not use warning labels, but instead blacklist/whitelist functionality.
  3. Acknowledging that many view warning labels as an infringement on intellectual freedom because they prejudice the user against certain classes of images, develop a minimal, objectively defined list for the most extreme cases where that infringement is justified, and where it does not apply unequally, in theory or practice, to depictions of any identifiable group of people.
  4. Conduct a global survey and identify the top 5-10 classes of objectionable images, attempt to codify what is objectionable about them, and create warning labels based on those criteria.
  5. Develop an interface that allows users to filter an unrestricted number of community-managed warning labels.
  6. Allow users to filter using a selection of 5-10 predefined existing categories.
And so on. The idea would be to coalesce some of the concepts into more concrete "What would this look like?" options. The alternatives would have some utility for moving forward (in combination with the survey results), but would primarily give users a sense of what is being proposed.--Trystan 14:00, 24 August 2011 (UTC)[reply]
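(Purely illustrative: to make option 2 above more concrete, here is a minimal sketch of what per-user blacklist/whitelist logic might look like. The class name PersonalImageFilter and its fields are invented for this sketch; nothing here is specified in the proposal.)

# Sketch of option 2: a personal image filter driven entirely by the
# user's own blacklist/whitelist, with no project-maintained warning
# labels. All names are illustrative, not part of any real design.

class PersonalImageFilter:
    def __init__(self, blacklist=None, whitelist=None, enabled=False):
        self.blacklist = set(blacklist or [])  # categories the user chose to hide
        self.whitelist = set(whitelist or [])  # per-image "show anyway" overrides
        self.enabled = enabled                 # strictly opt-in: off by default

    def should_hide(self, image_title, image_categories):
        if not self.enabled:
            return False
        if image_title in self.whitelist:      # the user clicked "show this image"
            return False
        return bool(self.blacklist & set(image_categories))

# A reader who opted in and hides one existing Commons category:
f = PersonalImageFilter(blacklist={"Photographs of sexual intercourse"}, enabled=True)
print(f.should_hide("File:Example.jpg", ["Photographs of sexual intercourse"]))  # True
f.whitelist.add("File:Example.jpg")            # one-click reveal, remembered
print(f.should_hide("File:Example.jpg", ["Photographs of sexual intercourse"]))  # False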
I think you need to go read Warning 10 in the report. Using "community-managed warning labels" (like RFD tags) is not only far more work, but it would enable third-party censorship in ways that saying "Sexually explicit stuff: click here to initially hide images in Commons:Category:Photographs of sexual intercourse, the hundreds of vanity shots in Commons:Category:Human penis, all the images in Commons:Category:Sexual penetrative use of dildos, and so forth" would not.
Also, per-user blacklist functionality already exists—but only after the user has been exposed to the image, and only for logged-in users, which is not what our readers want. WhatamIdoing 19:07, 24 August 2011 (UTC)[reply]
Well, um, I'm a Wikipedia reader, and that's actually what I want (the "eek! click it away!" option). When did we start differentiating between readers and editors anyway? Isn't that going to be one of the root causes of why fewer people actually edit? ;-) --Kim Bruning 19:51, 24 August 2011 (UTC)[reply]
Then you've already got what you want (so long as you're logged in). What you want is not what a significant proportion of our readers want. WhatamIdoing 17:20, 25 August 2011 (UTC)[reply]
I too am worried about putting the interests of the "reader" over those of the "editor". That'd be bad. We don't have to please non-donor/non-editor populations. That said-- between global users and mobile users, there probably is a very genuine and legitimate need for rudimentary shuttering of "shockingly upsetting" images. People sitting in a private office on a computer with a full keyboard are going to become the exception, not the rule. --AlecMeta 21:37, 24 August 2011 (UTC)[reply]
Today's readers are tomorrow's editors and tomorrow's donors. We do have to please these people, if we want Wikipedia to have a future. WhatamIdoing 17:20, 25 August 2011 (UTC)[reply]

Dubious archiving process[edit]

Some of the content originally posted on this page has been archived, some not. As necessary as it is to make this page readable, it should be explained why some older content has been kept here and some newer content has been archived. Choosing what to archive isn't neutral. Some posts were really useless and archiving them was a good idea, but still, archiving an ongoing discussion *might* be seen as a way to hide/censor some suggestions. Cherry 11:47, 24 August 2011 (UTC)[reply]

Hi. The content is being archived strictly by date of last edit (which should be up to, but not beyond, the 20th at this point). While somebody seems to have duplicated some of the content from the archives for some reason (while still leaving it there-- very puzzling :/), threads that have not been edited since that date have not been intentionally left behind, and no edits past that date have been intentionally archived. (I discovered the restoration of some older content when archiving today; I believe it has all been rearchived... some redundantly. :/) --Mdennis (WMF) 15:51, 24 August 2011 (UTC)[reply]

I feel that splitting and archiving the page is part of the master plan. I miss a section about the Public. --Bahnmoeller 13:46, 24 August 2011 (UTC)[reply]

{{sofixit}} :-) ? --Kim Bruning 19:48, 24 August 2011 (UTC)[reply]

Arguments from the German poll[edit]

Here are the arguments taken from the German poll: de:Wikipedia:Meinungsbilder/Einführung persönlicher Bildfilter, translated below. -- WSC ® 18:40, 25 August 2011 (UTC)[reply]

Arguments against the introduction of the filters[edit]

  • Wikipedia was not founded to hide information, but to make it accessible. Hiding files can remove important information presented in a Wikipedia article. This could restrict every kind of education and the recognition of connections. Examples: articles about artists, works of art, and medical topics could lose essential parts of their information, with or without the reader intending it. The goal of presenting a topic neutrally and in its entirety would be endangered.
  • Categorizing content by acceptability contradicts the principle of de:Wikipedia:Neutraler Standpunkt (neutral point of view). The de:American Library Association (ALA) argues along similar lines: it strictly rejects labelling library content according to non-neutral criteria, and even regards it as a "tool of censorship" when the labels attempt to give readers recommendations or to warn them about content.[1] The ALA has set down corresponding guidelines in its Library Bill of Rights.[2]
  • Catering to the interests or preferences of individual readers or groups is not the task of an encyclopedia (de:Wikipedia:Grundprinzipien). Readers are themselves responsible for their preferences regarding image display (e.g. by configuring the software on their own device accordingly, or by using their own filter software).
  • Opponents of the intended filter categories and file filters see their use as censorship of content, which runs counter to the aims of an encyclopedia.[7] In particular, they stress the danger that filtering will be expanded, or become obligatory (login required to deactivate the filter), once the corresponding infrastructure has been created.
  • Depictions of violence or explicit sexuality are also found in school books and other textbooks. Apparently their didactic value is not called into question there.
  • Schools could instruct their pupils to activate the filters when using the German-language Wikipedia, among others, and to make no use of the option to reveal images. For pupils who follow this instruction and have no other Internet access, this would amount to censorship. The same is conceivable for other organizations that provide Internet access, such as libraries, political parties, and religious communities.
  • The filter is neither a content block nor youth-protection software. It is active only on Wikimedia projects and is supposed to be easy for any reader to switch off. Since the filter does not block access to content, it is unlikely to be a reason for operators of de:Contentfilter (content-filtering software) to scale back their measures (domain blocks, blocking of articles) or to take Wikipedia off their indexes. In effect, content would be filtered twice.
  • Introducing image filters does not end the discussion about which images a particular reader can reasonably be expected to see. Sorting content into particular exclusion categories can likewise run up against differing interests and ideas. It is pointless to sort content into filter categories over and over, since there is no prospect of developing clear guidelines on which files belong in the various exclusion categories.
  • Notions of what counts as offensive or unwanted can differ from user to user, between cultural backgrounds, and between language versions. Using globally valid filter categories therefore makes no sense: the goal of creating filters that do justice to all readers and all cultures is technically unachievable. A reader who wants to decide for himself what he gets to see and what not would first have to look at the images, which ends in a contradiction.
  • The task of setting up filter categories and keeping them current falls to the users, who must spend time and effort on it. These resources could instead go into other technical and editorial improvements. (Reusing existing Commons categories as filter categories is possible only to a limited extent, if at all. The category Violence and its subcategories, for example, contain not only depictions of violence but also images of memorials, demonstrations, portraits, critical caricatures, and so on.)
  • It is also questionable to what extent Wikipedia should operate the filters at all: the content was, after all, created by its own volunteers according to the de:Wikipedia:Grundprinzipien (founding principles) and recognized as worth keeping in a collective process. Offering a way to hide one's own content therefore seems paradoxical.
  • A boomerang effect could set in: editors would then place potentially offensive material in articles with fewer inhibitions, on the motto "there are filters for that".
  • The professional qualifications and methodology of Robert Harris and his daughter Dory Carr-Harris, the two authors of the Harris report on which the WMF Board's resolution substantially rests, are questionable. Robert Harris is a radio presenter and journalist who worked for 30 years at the de:Canadian Broadcasting Corporation (CBC), where among other things he produced several series on classical music. He has also written some introductory books on classical music. At the CBC, Harris worked for 17 years with the current Wikimedia executive director Sue Gardner, and in 2010 he was engaged by the Wikimedia Foundation as a consultant. It is unclear what qualifies Harris, beyond his journalistic experience, to assess controversial Wikimedia content. Harris compares Wikipedia with the CBC,[3] but a scholarly, general-education encyclopedia has different goals and methods than a journalistic institution. The report ignores the critical discourse on the evaluative labelling of media, as conducted for example by the American Library Association.
  • The Foundation's argument (de:Principle of Least Surprise), which assumes that people prefer to experience few surprises, is borrowed from computer science (the ergonomics of computer programs). Both in psychology and in communication studies, the prevailing view is the opposite. In the press, for example, photos are used both to illustrate and to arouse interest (cf. e.g. [8]).
  • The filters violate encyclopedic secularity. The Harris report recommends special treatment of so-called "images of the sacred",[4] but explicitly proposes filters only for sexual and violent images. The Board's resolution assumes that users may perceive not only sexual and violent but also religious content as offensive, and additionally points to the diversity of user groups by age, origin, and values.[5] The Foundation's rough draft now explicitly plans a filter for the category "images of the prophet Mohammed".[6] Filtering by religious preference, however, contradicts the neutrality and the universal educational mission of an encyclopedia.

Arguments for the introduction of the filters[edit]

  • Readers who find, for example, depictions of violence or sexuality offensive, feel hurt by them, or do not want to be surprised by them, can hide files categorized accordingly.
  • At work or in a public library it can be disadvantageous for a Wikipedia user to have images on screen that are perceived as inappropriate or offensive. The filters would be a tool for avoiding such situations.
  • A larger readership could be reached and additional authors won, because some readers and potential contributors would no longer avoid Wikipedia, or particular articles, on account of depictions they find offensive.
  • Efforts to remove potentially offensive content entirely (for example, by deleting image files) would have the wind taken out of their sails.
  • Introducing the filters could reduce publicly voiced reservations about Wikipedia that rest on its display of allegedly questionable content (→ discussion).
  • This is not censorship, since the filters are explicitly intended to be personal filters (see the problem description) that are activated only at the user's request. Each user's free choice is to be ensured through several functions:
    • The user is informed about the option of filtering content he finds unpleasant.
    • The user decides for himself whether to activate filtering (de:Opt-in).
    • The user can deactivate filtering again at any time, or reveal hidden images individually.
    • According to current planning, this also applies to users who are not logged in [9]. Alternatively, depending on the outcome of the referendum ("It is important that the feature be available to logged-in as well as logged-out users"), it is also possible that there will be no filter at all for logged-out users.
  • It is unclear whether forgoing the filter function will even be technically possible. If it is "built in", a filter ban would cut the German-language Wikipedia off from further development of the de:MediaWiki software, or it would have to maintain a parallel version of the software itself.
  • The effectiveness of the proposed "ban" on the filters is doubtful, since it is technically easy to circumvent: the main filter categories containing the most images will live on Wikimedia Commons. Both external "censors" and add-on software, such as a de:Browser plugin, could access them and replicate the filters exactly. A ban on filter categories in the German-language Wikipedia could, if desired, be circumvented by a filter-category system operated by third parties. A "filter ban" could provoke exactly such third-party solutions, which would then no longer be controllable and could include real censorship mechanisms.
  • Logged-in users can already hide individual content via their CSS settings, so not all users see the same thing anyway.
  • The Wikimedia Foundation also justifies the introduction of the filters with the "principle of least astonishment" that applies, for example, on the English-language Wikipedia.[7] This means that the content of a page is presented to readers in a way that respects their expectations.[8]
  • The Harris report recommends allowing users to place images (depicting sexuality and violence) "in collapsible galleries so that, for example, children do not see these images accidentally or unexpectedly".[9]

Refs[edit]

  1. Template:Internetquelle
  2. Template:Internetquelle
  3. "The CBC is an interesting place. Like Wikipedia, it is a powerful (in its world) and respected information-providing institution, dedicated to public service and the provision of unbiased (the analog equivalent of NPOV) news and information to the Canadian public. However, like your projects, the power of the institution, and its public-service character, make it the focus of intense and perfectly legitimate discussions over content, balance, mandate, and the need to serve different publics simultaneously." Robert Harris, meta:2010_Wikimedia_Study_of_Controversial_Content/Archive
  4. Images of the "sacred", Harris Report, 2010
  5. "Some kinds of content, particularly that of a sexual, violent or religious nature, may be offensive to some viewers; […] We recognize that we serve a global and diverse (in age, background and values) audience, and we support access to information for all", Resolution, 29 May 2011
  6. "pictures of the prophet (sic) Mohammed", Personal image filter, Overview of this system, MediaWiki
  7. Wikipedia:Writing better articles - Principle of least astonishment on the English-language Wikipedia
  8. "We support the principle of least astonishment: content on Wikimedia projects should be presented to readers in such a way as to respect their expectations of what any page or feature might contain", Resolution, 29 May 2011
  9. "The major recommendation we have made to deal with children and their parents is our recommendation to allow users (at their discretion, and only for their personal use) to place some images (of sexuality and violence) in collapsible galleries so that children (for example) might not come across these images unintentionally or unexpectedly. As we noted in our section on basic principles, we did so because we believed it would show some basic respect and service to one group of our users (those worried about exposure to these images) without compromising the different needs and desires of another (those desiring, even insisting, the projects stay open)." Children, Harris Report, 2010

Translation and Discussion[edit]

Google Translation of Poll and its Talk
Never trust Babelfish! Even if it's made by Google. -- WSC ® 08:54, 26 August 2011 (UTC)[reply]
Note that the real discussion takes place on the talk page of the German poll, and largely reflects (as far as I read) the tenor and conclusions of the discussion here. Rich Farmbrough 23:30 29 August 2011 (GMT).

Discussion page unusable[edit]

This arbitrary change to the discussion page makes it unusable. Instead of seeing new things pop up at the bottom, where I knew how far I had read, they are now all over the page. I also can't remember any discussion that the talk page should be changed to this format, nor any discussion of how the topics would be divided among the arbitrarily chosen headers. This makes the discussion more difficult to follow. --94.134.219.11 17:08, 23 August 2011 (UTC)[reply]

I can't find which topics I had already read and which I hadn't read. Change it back.

Please put the talk page back to the way it was. --Shabidoo 19:04, 23 August 2011 (UTC)[reply]

Censorship is for other people![edit]

Did you ever hear anyone say, "That work had better be banned because I might read it and it might be very damaging to me"?
Joseph Henry Jackson

The complaint of most people who don't like it that Wikipedia has pictures of vulvas, or Muhammad, or whatever, is not that they don't want to see them, but that they don't want other people to see them. As a consequence, this measure will satisfy few people. It will make little difference to them if they can hide the horrible, offensive sight of the human body from their own eyes, but poor vulnerable children will still be exposed to corrupting genital mind-rays. It will not satisfy the religious if their particular object of veneration, forbidden from sacrilegious depiction, is invisible on their own computer, but readily visible on a million computer screens around the world. What this will do, however, is provide a facility that can very easily be changed from opt-in to opt-out at some future time. And given that the "think of the children" brigade will never be happy until other people are unable to view certain images, we will have made it much easier for them to lobby for the imposition of their prudish or superstitious ways on the rest of us, who want Wikipedia to present the world as it is, unfiltered by the content police. Beorhtwulf 21:03, 23 August 2011 (UTC)[reply]

Couldn't be said better. --Shabidoo 23:05, 23 August 2011 (UTC)[reply]
This is the strongest argument against investing time in making any sort of one-click solution easy: that few people will be directly satisfied by having this feature (satisfaction is hard to measure...), that it will not support the desires of parents or teachers or community leaders who wish to impose actual filters on their families/schools/communities-- and yet that it will make it easier to switch on "opt-out" filtering in the future ('easier' is hard to measure, but this is surely true to some degree). However, even if one accepts this argument, there are still some features (such as the ability to hide all images, or to shutter a single image that offends) that would not run this risk and could quickly be realized. SJ talk | translate   03:12, 6 September 2011 (UTC)[reply]
Much truth. We can try to help our readers themselves, but we must never help the true censors. The true pro-censorship crowd gets nothing out of this filter, absolutely nothing. --AlecMeta 23:51, 23 August 2011 (UTC)[reply]
Absolutely nothing except a comprehensive and semi-reliable database of tags maintained by a party other than themselves, that they can use to implement an opt-out or even no-opt content filter. In other words, the wiki* community will be doing the true censors' work for them. 41.185.167.199 03:40, 24 August 2011 (UTC) a.k.a. Bernd Jendrissek[reply]
You don't seem to have understood the proposal. There are no tags being used in this proposal. The report specifically warns against the use of tags. The proposed filter is based on categories, which already exist and which therefore could already be used by any would-be censors. If a censor wanted to restrict access to the images in Commons:Category:Penis, they could have started doing that in 2008, when the category was created. They could do that today. They could do that tomorrow. They could do that no matter what the WMF does or does not do with this filter. WhatamIdoing 19:27, 24 August 2011 (UTC)[reply]
Category, tag: the same thing. What's being proposed is some way to flag content. The precise technical implementation of said flags, cats, and tags, and what we call them, is irrelevant. Zanaq 18:41, 25 August 2011 (UTC)[reply]
No, there really is an important technical difference here. Tags make third-party censorship very easy. Saying "Tick here if you don't want to look at images in Commons:Category:Photographs of sexual intercourse" does not have any effect at all on third-party censorship. The images are already in that category; would-be censors can already use that information to suppress images. Using tags (or specially created categories, like "Category:Images that offend people because they show naked bodies") would give censors new and specifically designed methods of identifying images for censorship. This proposal gives censors nothing that we didn't give them years ago, when we decided to organize images on Commons. WhatamIdoing 21:04, 25 August 2011 (UTC)[reply]
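(A concrete aside on this point: Commons category membership is already public, machine-readable data that anyone, including a would-be censor, can query today, with or without a reader-side filter. A minimal sketch against the real Commons API; the file title is just a placeholder.)

# Sketch: fetch the categories an image already sits in on Commons.
# Nothing about the proposed filter is needed for this to work.
import json
import urllib.parse
import urllib.request

API = "https://commons.wikimedia.org/w/api.php"

def categories_of(file_title):
    query = urllib.parse.urlencode({
        "action": "query",
        "prop": "categories",
        "titles": file_title,
        "cllimit": "max",
        "format": "json",
    })
    with urllib.request.urlopen(f"{API}?{query}") as resp:
        data = json.load(resp)
    page = next(iter(data["query"]["pages"].values()))
    return [c["title"] for c in page.get("categories", [])]

print(categories_of("File:Example.jpg"))  # placeholder title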
I completely agree. The true censors will need to look at the images and decide what to censor for others. The WMF does not guard against them; the WMF proudly invests in new censorship tools for them. Overall a disgusting idea. --Niabot 02:21, 24 August 2011 (UTC)[reply]
Current cats are fine, if that's what we're actually going with, as long as the *users* determine which cats they do or do not want to see. There is still a danger here, though: if certain categories are popularly selected by users, they can de facto become tagging categories, even if that was never the intent. --Kim Bruning 19:55, 24 August 2011 (UTC)[reply]
Current image categories could 'seed' the initial filter categories, but ongoing, the filter categories would be fundamentally different from image categories as we now know them. There will be different kinds of arguments, different emotions, different standards of inclusion, and probably whole different populations of editors. --AlecMeta 21:18, 24 August 2011 (UTC)[reply]
Would you PLEASE stop saying "will". This only confirms that the decision has already been made and that this is not a referendum but some formal process to affirm what you already plan to implement. --Shabidoo 21:45, 24 August 2011 (UTC)[reply]
That is excellent feedback. My usage of "will" is entirely hypothetical and I'm absolutely nobody important, speaking with absolutely zero authority or direct involvement with governance-- but yeah it does send the wrong impression. I'll switch to "would" or "might" or something. :) --AlecMeta 04:08, 25 August 2011 (UTC)[reply]
Not just a whole different group of editors would be appropriate, but an entirely separate organization from the WMF, one frankly and openly devoted to censoring information rather than disseminating it. People could use their filters in various ways, if they chose--this is free software, and anyone can modify it. All the arguments supporting this proposal are arguments for why censorship is needed in particular situations. Let's assume this is true. Those who advocate censorship and think they can do it and get it right and universally appropriate and non-obtrusive should go right ahead and do it somewhere. Somewhere else. It's perfectly all right to work on other places as well as WMF projects--I've worked on two other wikis myself, and one is a wiki with closed editing. Right for its purpose, which is advocacy. Not right for WMF, any more than this proposal is. DGG 19:05, 25 August 2011 (UTC)[reply]
That has to be one of the most misleading statements given here. It is a simple fact that millions of people can't read the site because there is no protection, and you are saying that people will somehow be "censored" because they have to make one extra click to see an image? Wow. That is just disturbing, and I have lost all respect for you. That is beyond logic and is one of the most slanted POVs that could possibly exist. As a librarian, do you give children porn, or do you abide by those "censors" and not? Ottava Rima (talk) 21:05, 25 August 2011 (UTC)[reply]
As I understand it, the largest group of people who cannot read Wikipedia are the ones in the PRC and countries with a similar approach to freedom of expression, and I hope you're not suggesting we try to meet their standards. The main other group will be schools that do not permit it, and I don't think most of those would permit it no matter what we did with illustrations. After all, we talk about things they may not want their students to know about. DGG 04:50, 26 August 2011 (UTC)[reply]
Outdent - "approach to freedom of expression" So basically if they don't have your view they don't have "freedom of expression"? Odd, because countries like Germany and the Netherlands restricts various things like certain kinds of political cartoons as "hate speech" yet have lesser restrictions on pornography. Regardless of your false claims, would you give a child under 18 pornography as a librarian? Because that is what this boils down to - protecting children. Do you think children have the "right" to access pornography or not? Ottava Rima (talk) 14:13, 26 August 2011 (UTC)[reply]
Libraries very explicitly disclaim any responsibility for what patrons take out. Some, but not all, libraries restrict children's cards to the children's collection, but I am aware of no public library that monitors and filters what is being checked out. Just the opposite, in fact.[10] It is up to parents to determine what is appropriate for their children, not librarians.--Trystan 15:07, 26 August 2011 (UTC)[reply]
Disclaiming responsibility and knowingly providing access are two very different things. It is not up to the parents to determine what is appropriate for children under US law - many parents have been arrested and jailed for knowingly providing those under 18 with access to pornographic material. Librarians know that they legally cannot provide access to pornographic materials to those under 18, and they require proof of identification even for computer use. DGG knows this. Ottava Rima (talk) 17:27, 26 August 2011 (UTC)[reply]
Wikipedia is one of the most incredible experiments ever achieved. It is the result of the combined writing, edits, and compromise of thousands of contributors within a completely democratic framework which expects nothing more than transparency and openness, and now there is a "referendum" which has been organised from the beginning to get the answer the organizers expect... which will give people the tools to have a dumbed-down, tempered, if not censored, version of this incredible experiment. No... it's not a misleading statement at all, Ottava Rima.--Shabidoo 00:04, 26 August 2011 (UTC)[reply]
Wikipedia is not a Democracy. You do not have a right to edit it. Before you preach about what we are, please get the basics right. Furthermore, you are not DGG so don't attempt to speak for him. Ottava Rima (talk) 02:30, 26 August 2011 (UTC)[reply]
Then why is there a referendum if there is no democracy? Why do we vote for the board if there is no democracy? And even if there is no democracy then there is still transparency and openness....which you forgot to mention in your response. Why did you leave that out? Because there is no transparency in this referendum and you know it. I do not have to speak for DGG I am speaking for myself. And I probably reflect the dozens and dozens of users who are exasperated by the management of this farce "referendum". --Shabidoo 02:47, 26 August 2011 (UTC)[reply]
You just said that this referendum proved that the WMF was violating the standards of Democracy, now you are trying to say it is proof of Democracy? Wow. I really think you need to sort out your own confused ideas before trying to get others to agree with you. And you aren't speaking for yourself because I wasn't talking to you before. P.S. asking for opinions isn't a Democracy. Ottava Rima (talk) 02:49, 26 August 2011 (UTC)[reply]
I didn't say any of the things you claim I said. This referendum is a farce. It is not even close to open and transparent...just looking at the FAQ of this referendum is evidence enough. I did not link this referendum to democracy once in anything I said. I said that the wikipedia experiment is. Nor do I need to be lectured about what is or is not democratic by you, someone whose account on wikipedia has been blocked dozens of times. What a farce. --Shabidoo 02:55, 26 August 2011 (UTC)[reply]
Outdent - 1. "I didn't say any of the things you claim I said." 2. "a completely democratic framework which expects nothing more than transparency and openness and now there is a "referendum" which has been organised from the beginning to get the answer the organizers expect" 3. "Then why is there a referendum if there is no democracy?" You implied that a referendum must be Democratic. Wikipedia is not Democratic and has never been a Democracy. Under w:WP:DEMOCRACY: "Wikipedia is not an experiment in democracy or any other political system." Your complaints are meritless and show a disrespect for the system as a whole in addition to contradicting themselves. Ottava Rima (talk) 03:13, 26 August 2011 (UTC)[reply]
Dear Ottava Rima, where exactly on Wikipedia have you seen pornography? Pictures of vaginas or penises aren't pornographic. Just look at the article on [11], please.
Pornography is already banned on Wikipedia, and for good reason. Pictures of naked humans aren't immediately pornographic, and none of the pictures on WP can be pornographic since they are all here for educational purposes, clearly not "for the purposes of sexual arousal and erotic satisfaction". Hence none of the nude pictures on Wikipedia can harm any child in the world. Parents who don't let their children explore human sexuality in a safe way - like here on WP - are the ones that harm their children. This should be obvious. --178.201.100.215 20:11, 26 August 2011 (UTC)[reply]
There are pictures of people having sex. If you want to say that is not pornography, then you are using a very different definition of the term than what is within the law. And merely claiming something is educational does not make it so, and the law doesn't really care if it is "educational" pornography or not. If you want to say that exploring sexuality is supposed to be a purpose of Wikipedia, then I hope you are banned for encouraging the predation of minors. Having pornographic images in an environment with children is one of the primary ways pedophiles groom children, and the WMF has taken a stance against allowing pedophiles and grooming to take place here. Ottava Rima (talk) 20:29, 26 August 2011 (UTC)[reply]
"Within the law"? Maybe American law, but not UK law, for instance. And also not in many other countries that have fluent english speakers. So, please elaborate on that and link to some law texts, otherwise your argument is completely void to start with.
All I am saying, by the way, is that it would be really weird for us Wikipedians to not trust our own articles here. So if the definition on the en Wikipedia doesn't capture your definition of pornography, then please work on the article together with the other authors and find a better definition. Then come back and go on discussing. Before that, let's use the official WP definition, shall we?
And with all due respect, for the vast majority of human history, children have naturally seen adults having sex. There was no other way, out in nature, without houses and beds. To say that merely seeing "people having sex" can be harmful to children is therefore a pretty big claim on your side, so please go ahead and provide evidence.
Last thing: I hope you take the whole accusation of "encouraging the predation of minors" back, otherwise I will report you for this. Yes, I think Wikipedia can be educational for children. And yes, I think Wikipedia should be educational on all topics. So naturally it follows that Wikipedia may provide sexual education to children, for instance by showing them non-pornographic images of people having sex. Just look at the German Wikipedia's "sexual intercourse" article and tell me which of these (drawn) images exactly could harm a child: [12].
Grooming can't happen to a properly sexually educated child. I'm talking about sexual education at age 6 here, like it is common in countries such as the Netherlands or Germany. Anything else will set children up for something unknown, and they will follow it to their doom. If a child knows what it wants and what it DOESN'T want, if it has the possibility to explore sexuality in a safe environment, then a child predator will not seem interesting to it. --178.201.100.215 21:20, 26 August 2011 (UTC)[reply]
We follow US law. Furthermore, UK law does not allow giving pornography to children or allowing them to access it, so your claims are highly inappropriate. And if you think they can't harm a child, then you disagree with a lot of psychiatrists and are arguing a stance that those like NAMBLA argue, which sickens me. An encyclopedia is not a place for children to "explore sexuality" and it is really disturbing that you would try to make it such. Ottava Rima (talk) 21:25, 26 August 2011 (UTC)[reply]
Not a single citation of one of these psychiatrists, not a single word of evidence, not a single word about my argument - instead more accusations by putting me next to those NAMBLA sickos. You clearly know how to disqualify yourself from a discussion. Would you be so kind and at least tell me if you REALLY think that children in the stone age were harmed by their parents having sex in their vicinity? Don't forget that we humans are just animals, and I hope you don't think that puppies are psychologically harmed by seeing their mother getting mated? Just, why would we humans be different? In which way would it have been an evolutionary benefit for humans to develop psychological problems when something very, very important to reproduction (hint: sex!) happens in their vicinity? This obviously doesn't make *any* kind of sense (Or are you one of those creationist nutjobs?). The only reason that something like this may harm children's psychology is if the society they are raised in makes a big deal out of it. --178.201.100.215 22:05, 26 August 2011 (UTC)[reply]
Quite strange: you claim I need to prove that it causes harm. Odd, because the laws forbidding it are already in place in Germany and every other nation. It would be up to you to prove why we should break the laws, which is impossible. But your argument boils down to something that is rather inappropriate - you are trying to say that we should show children pornography. Ottava Rima (talk) 01:34, 27 August 2011 (UTC)[reply]
Either you're too ignorant or too dumb to understand that this is precisely *not* what I am saying. Pornography provides a deranged image of human sexuality, e.g. a very patriarchal image in which women are just objects to please male sexual needs. THAT is pornography to me, and THAT is what I also believe can be harmful to children. My argument, on the other hand, is whether seeing people have sex is harmful to children. I don't know any laws that forbid having sex in the vicinity of a minor, e.g. in your marital bed while your three-month-old child is sleeping next to it. The reason porn is forbidden is because pornography doesn't show people having sex as they would in real life *at all*. Do you finally get it, or is your mind still numbed by the sheer prudishness of your personality? --178.201.100.215 09:36, 27 August 2011 (UTC)[reply]
The law disagrees with your POV, and it is odd how you think that women would not look at pornography. Ottava Rima (talk) 12:52, 27 August 2011 (UTC)[reply]
I would dispute that a man looking at gay porn is objectifying women. Though I do agree that there is a lot of unappealing, purely anatomical porn out there (vast archives of "cumshots", for example) I don't believe in general that one sex is objectified more than the other - actors of both sexes are hired the same way, after all. Though one would like to see a better level of quality, bad art is not a crime. Wnt 16:05, 27 August 2011 (UTC)[reply]
I neither said that women don't look at pornography, nor that women are the only sex that is objectified (that's why I wrote "e.g."). But apparently you are too stupid or ignorant to understand this. Since you simply ignored the real content of my comment, because you don't have ANYTHING to say against my arguments, I conclude that you, in fact, don't have any kind of intelligent response. Well done! Frankly, I believe you are the exact reason why children get abused in the world: ignorant, idiotic puritans that never, ever would open their minds to anything that doesn't fit their own view of the world. --178.201.100.215 12:13, 28 August 2011 (UTC)[reply]
You are one of the most illogical people ever. You claim that children are abused because of "puritans", but it is those who make claims like yours who have been revealed as pedophiles on Wikipedia. Such people distract from their real problems and wish to have an environment where children are able to access pornography to help groom them. That is why every country bans children from having access to porn, even Germany. Your incivility is a hallmark of all such people with your view - without any logic or reason behind you, you bad-mouth and attack in hopes of bullying people into letting their guard down so you have access to children. That sickens me. Ottava Rima (talk) 15:19, 28 August 2011 (UTC)[reply]
German law makes it clear that those under 18 aren't allowed to see pornography, whether it be "educational" or not. That is -your- country. Also, "Germany has a very broad definition of child pornography, which includes images of all real or fictional people who either are under the age of 18 or who appear to be below 18 to an "average viewer". " Would mean that most of the lolicon images at Commons would have to be deleted per German law. It isn't just a "United States" thing as many German users are trying to claim. Ottava Rima (talk) 21:30, 26 August 2011 (UTC)[reply]
What you don't take into account is the fact that "pornography" is rarely defined in any law book. So what you think is pornography doesn't apply here. Please understand that I'm not voting for having pornography on WP, but e.g. drawn pictures such as on the German WP of people having sex. These are *not* pornography by German law and you will not find any judgement by a German court that says otherwise. On the contrary: in a well-known judgement of the OLG Düsseldorf, pornography is defined as "grobe Darstellungen des Sexuellen, die in einer den Sexualtrieb aufstachelnden Weise den Menschen zum bloßen, auswechselbaren Objekt geschlechtlicher Begierde degradieren. Diese Darstellungen bleiben ohne Sinnzusammenhang mit anderen Lebensäußerungen und nehmen spurenhafte gedankliche Inhalte lediglich zum Vorwand für provozierende Sexualität." -> translation: "rough depictions of sexuality, which degrade humans to a mere, exchangeable object of sexual desires in a way that spurs the sex drive. These depictions stay without a connection to other expressions of life; other content is just an excuse for provocative sexuality."
So all you have shown is that you have no clue about German law. Good job! :) --178.201.100.215 22:05, 26 August 2011 (UTC)[reply]
Pornography is actually very well defined in case law. Depictions of the sexual act have always fallen under pornography. The focus on the genitalia has also. Germany has well defined case law on the matter and is really strict. Furthermore, German law says "or fictional". That includes cartoons. Ottava Rima (talk) 01:34, 27 August 2011 (UTC)[reply]
Actually, I think you will find it quite different in different jurisdictions at different periods. Would you care to give an exact universal definition? Or even an exact one for Germany? Exactly how much has to be shown of the sexual act to be pornographic? Does it apply to all sexual acts, or only heterosexual copulation? Exactly how detailed must the focus on the genitalia be? Is it the same in both sexes? Is it the same for all media? And if you give the German legal text, are you quite sure it's the same anywhere else in the world?
At least in the US in the early 20th century, pornography was defined as that which is sexually exciting, or whose only purpose is to be sexually exciting. This was apparently assumed to be a bad thing. Myself, I regard arousing sexual excitement as a fundamentally good thing (with the usual exceptions that apply to anything however good in itself), and I think so does much of the world. But it's also a basically individual matter just what is sexually exciting. I for example do not find most of what is commercial pornography in the US to be the least exciting. You may be different, and you have a perfect right to be, but on which of us should it be based? DGG 04:40, 27 August 2011 (UTC)[reply]
Pornography is defined by law now. It doesn't matter how you define it, it is quite clear what is pornography and what isn't. So answer the question: As a librarian, would you give those under 18 pornographic material or not? Ottava Rima (talk) 12:52, 27 August 2011 (UTC)[reply]
Any sexual depictions are in fact legal in Germany when they are for scientific/educational purposes. I would argue that WP qualifies for that. It matters a whole lot to German courts whether you show an image depicting people having sex in order to illustrate an educational article, or whether you show it to someone - possibly a minor! - with the intent of increasing their sex drive. You just go on saying that Germany bans this kind of stuff, yet these are all just unsupported claims backed by nothing but your ignorance of actual German case law. Find me a case that supports your thesis; I already quoted and translated a part of German case law that backs up *my* claim. --178.201.100.215 09:49, 27 August 2011 (UTC)[reply]
I personally would like to have some warning before I find myself downloading an image that is illegal in the jurisdiction that I live in (in my case Britain). I have read both sides of the argument on whether pornography causes harm, and that has left me mainly as uncertain as before. But the arena to settle this is through elected parliaments. Given that Wikimedia projects are accessible from many jurisdictions, there should be the option for users to avoid breaking the laws of where they live when they don't intend to. Dejvid 11:35, 27 August 2011 (UTC)[reply]
To my knowledge, merely accessing these images is not illegal in most jurisdictions. The image itself also isn't illegal; only sending it to other people is. Therefore, you don't have to be afraid of anything, since there is no law preventing you from downloading those images to your computer. The fact that jurisdictions haven't taken action against WP so far is a good indicator that what we are doing is completely fine. The FBI knows about the claims that WP hosts child porn after Larry Sanger's smear campaign, and still nothing has happened to WP. If we really are to let the jurisdictions decide, then it's best for us to wait until someone from law enforcement actually complains.
Last but not least, I fail to see in which article pornographic images suddenly jump out at you although the subject of the article is not sexual to start with. Care to give me some insight? "Virgin Killer" apparently doesn't count; that case was discussed more than enough. --178.201.100.215 11:51, 27 August 2011 (UTC)[reply]
"To my knowledge, merely accessing these images is not illegal in most jurisdictions" Quite untrue. Even German law says that accessing is illegal. Dejvid is 100% correct and your use of an IP here shows that you are pushing something that you know is quite wrong and refuse to connect it to your account in doing so. Ottava Rima (talk) 12:52, 27 August 2011 (UTC)[reply]
Accidental accessing of child pornography is not illegal by German law. It's only forbidden to intentionally acquire such images (Beschaffung), and only if the material isn't of artistic nature. Any graphical depiction of child pornography that doesn't focus on realism (based upon real child pornography) is also not forbidden. See: Mutzenbacher-Entscheidung. Ottava Rima! I must mention it again: you use one wrong argument after the other. Hopefully you aren't doing it intentionally and lying to us all the time. --Niabot 15:39, 27 August 2011 (UTC)[reply]
"Accidental accessing of child pornography is not illegal by German law. " There is no proof of that. There is no way to say "I accidentally saw it". "Accident" is not a defense in Germany. They go off your IP and the material. It is that simple. Ottava Rima (talk) 21:13, 27 August 2011 (UTC)[reply]
That's a huge difference from the US system! In Germany the prosecutor has to prove that you intentionally acquired this kind of content. If it really was an accident, then he has no way to prove otherwise; "benefit of the doubt" always applies in such cases. You should really start getting some basic knowledge about the world outside of the US. --Niabot 22:39, 27 August 2011 (UTC)[reply]
"the prosecutor has to proof that you did it intentionally to acquire this kind of content" Nope. You can't just make outrageous claims like this. Ottava Rima (talk) 23:41, 27 August 2011 (UTC)[reply]
That's simply how a Rechtsstaat works. No one is guilty as long as it isn't proven that he is guilty. --Niabot 07:01, 28 August 2011 (UTC)[reply]
Sorry, but merely having child porn on one's computer is proof enough of guilt. That has always been true regardless of what you say. Otherwise, everyone could claim anything was "accidental" or that they didn't actually do it, which is laughable and preposterous. Germany has had a lot of major busts of child pornography rings lately, which shows that they are one of the least tolerant nations regarding child pornography in the world. Ottava Rima (talk) 15:19, 28 August 2011 (UTC)[reply]
Even accidentally accessing child porn in the US can result in prosecution. There are zero exceptions to that law, not even for academic research or software companies creating anti-porn filters. As written, US admins at Commons who receive a complaint about child porn and access the page for the purpose of verifying that the file contains what the complaint alleges it does and deleting it could be prosecuted. It's stupid, and the DA who filed the charges would probably be crucified politically, but that's the law.
"It was an accident" is a valid defense, but you have to convince a jury to believe you. An inexperienced US school teacher, for example, was convicted when her classroom computer started showing (adult) porn pictures. She couldn't figure out how to make it stop (porn malware's hard that way), had been told not to turn off the computer under any circumstances, and didn't have enough wits to cover up the screen. It really was an accident—nobody triggers malware on purpose—but the jury didn't believe her. (After several years of legal wrangling, the case eventually resolved with the teacher pleading guilty to a much lesser charge and surrendering her teaching license. So now everyone agrees that it really was an accident—but the victim of this accident has been punished with a federal criminal record, a fine, and the loss of her career.) WhatamIdoing 17:10, 27 August 2011 (UTC)[reply]
A sad and upsetting story. But how is this related to the filters anymore? --Niabot 17:20, 27 August 2011 (UTC)[reply]
There is no way to prove that you "accidentally" looked at child porn. It is a bad defense and unbelievable. If someone wants to make up a story that they didn't even know how to turn off a monitor or that they accidentally showed porn to kids, then they should go to jail for stupidity in thinking people would buy that. But if they did "accidentally" show it, then your argument is actually in favor of the filter which would help prevent such a thing when accessing Wikipedia for her class. Ottava Rima (talk) 21:13, 27 August 2011 (UTC)[reply]
Luckily we don't have juries in German courts, and it must be proven that you did not accidentally come across such material. There must be at least enough evidence to prove that you did it intentionally. There is no jury to bribe. --Niabot 22:44, 27 August 2011 (UTC)[reply]
Why the hell are you talking about child porn now? I surely wasn't. And there is no child porn on the commons, not even close. I was talking about pornographic images (even by my more narrow definition) of adults. No child porn, no bestiality, nothing fetish-related, just normal hardcore porn of people fucking each other. Please show me the point in the US, German and UK lawbooks where it says that accessing adult porn is illegal. For Germany I can guarantee you that accessing adult porn is not illegal for any person. If you don't provide conclusive evidence like this, every reader of this discussion will know that I am right indeed. Seriously, I bet you don't even KNOW your own pornography law... --178.201.100.215 12:13, 28 August 2011 (UTC)[reply]
Under German law, "lolicon", the stuff that Niabot has uploaded, is illegal for merely having the "semblance" of "youthful" features. That shows that his arguments against the US as puritan is completely without merit. And what are you on about "accessing adult porn"? It was pointed out that it was illegal for those under 18 to access porn in Germany and just about everywhere else. Please log into your actual account instead of hiding behind your IP. Ottava Rima (talk) 15:19, 28 August 2011 (UTC)[reply]
"It was pointed out that it was illegal for those under 18 to access porn in Germany and just about everywhere else". Wrong. This was claimed by you and you only, and you provided no evidence to hold up your claim. Even on Wikipedia, it is clearly stated that *providing access* of porn to children is illegal. No word about accessing porn being illegal, neither for children nor for adults. Have you ever heard of a child being arrested because the child accessed porn? Maybe in Iran. Not in any western state. Because there is obviously no sense to penalize a child after it was allegedly harmed by porn already. Even you should understand that anything else would be quite absurd. If you don't... well again, provide me with a law text that says *accessing* porn is forbidden anywhere. You know what? You won't be able to. Because such laws exist only inside your twisted imagination of how you would like the world to be. --93.129.35.41 18:53, 28 August 2011 (UTC)[reply]

Most readers of this conversation should already know it. But for those who did not read every line, I can only suggest ignoring the comments by Ottava Rima. He has shown (in this thread alone) at least a dozen times that he has no clue about non-US law. He has also shown a very blinkered, narrow interpretation of the definition of pornography. Even under US law, sentences like "Depictions of the sexual act have always fallen under pornography." aren't true at all. Maybe he never looked at US art movements from the last 50-60 years. ;-) --Niabot 22:58, 27 August 2011 (UTC)[reply]

All of that unpleasant flame throwing aside (myself included): while German porn law may be interesting, remember that the whole point of almost all of the postings here is that this referendum was organised in a very unfair way, with awkward questions and not enough discussion before and during its initiation. Most voters only read the material written by the organisers, who are very sympathetic towards the image filter, and thus we end up with data about users that is skewed relative to how people would actually respond given a more balanced presentation of the pros and cons of the filter and clearer questions. --Shabidoo 22:14, 27 August 2011 (UTC)[reply]

Merely claiming it was unfair doesn't make it so. You have a fringe POV, and thinking that you would have more support if you were dominating the presentation is fantasy. Ottava Rima (talk) 23:41, 27 August 2011 (UTC)[reply]
No one should dominate anything here. That's the very reason why this "referendum" is unfair. --Shabidoo 02:30, 28 August 2011 (UTC)[reply]
The referendum was to get people's level of response. It was not for you to treat it as a bully pulpit to demand that the WMF violate US law, German law, UK law, and just about the laws of most of the world because you want children to view pornography. Ottava Rima (talk) 15:19, 28 August 2011 (UTC)[reply]

Just make a few copies of Wikipedia with different levels of censorship. You can have a completely uncensored Wikipedia, a Wikipedia for Christian conservatives, one for conservative Muslims (similar to the Christian one, except that pictures of women are only allowed if they wear a headscarf), and an ultraconservative Muslim one where women may only appear in a burqa. Count Iblis 18:07, 28 August 2011 (UTC)[reply]

The German Wikipedia is not going to have image filters, it seems. The majority is against it. So yes, this might indeed mean a fork of Wikipedia if the WMF tries to tell us Germans what to do. People on the German WP are discussing this possibility. I hope the WMF sees reason and stops this madness before it's too late. --93.129.35.41 18:53, 28 August 2011 (UTC)[reply]

Just wanted to let everyone know that Ottava Rima just got blocked indefinitely for his bad behaviour in this discussion, calling other people pedophiles, etc. Just in case you wonder why there is no further response - this is why. --93.129.35.41 20:46, 28 August 2011 (UTC)[reply]

It's all about how prudish and puritanical many people in the U.S. are, and it is because of them that we have this proposal. Television commercials like this one for Fa won't be broadcast in the U.S. under any circumstances. But they are broadcast in many European countries, even during the daytime. Contrary to U.S. politicians and religious fundamentalists, ordinary people in Europe are not convinced that people go blind if they see one nipple. Yeah, indeed, I think seeing nipples is quite a normal thing – most babies see and suck such body parts very early in their lives and it doesn't seem to have any bad influence on their further development. It is politicians and priests who make human sexuality a bad thing. And it's Wikimedia aiding them. Sorry, that's reminiscent of the Taliban, but in reality it's worse. Worse because the people on the board of trustees of the Foundation know better. Or at least they should. I have growing doubts whether the people on the WMF board of trustees are the right people to promote the free encyclopaedia and the other free sister projects. I doubt they ever understood the slogan Wikipedia. The Free Encyclopedia. It should be free. With a big F. --Matthiasb 18:54, 29 August 2011 (UTC)[reply]

Matthiasb, it is clear that you have not read all of the materials. The WMF receives frequent complaints from readers in Asia, Africa, South America, and the Middle East about the en:cultural imperialism of the USA, and these complaints are often specifically about the inclusion of what they believe are inappropriate sexualized images. American culture might seem prudish to someone from your corner of Europe, but it seems positively libertine to the majority of Internet users in the world.
Our users in the Netherlands and Germany are going to have to face facts: what you think is a normal, acceptable amount of nudity might occasionally be considered a bit much by the average American or Brit, but it is considered very, very inappropriate by nearly every other culture in the world.
Your "enemy" here is not the "prudish" or "puritanical" American. The people who are truly unhappy about your cultural standards are Asians, Africans, Latinos, Arabs, and Persians. They make up more than 80% of the world's population, and they have been vocal and persistent in their requests for the ability to stop what they see as Americans spamming sexualized images everywhere. WhatamIdoing 20:45, 29 August 2011 (UTC)[reply]
Please don't try and speak for British users thanks. I'd rather not be lumped in with the puritan brigade, and am very much of the same opinion as my European neighbours Matthiasb and Niabot, with their healthy and rational attitudes towards sexuality and nudity (two categories which do not always overlap, contrary to some people's beliefs). Wikipedia is supposed to be here to spread knowledge and enlighten people, not cater to their medieval religious prejudices and ill-informed ideas about pictures of nipples making their children's eyes bleed. Unfortunately, reality is frequently offensive to the wilfully ignorant. Creationists would prefer that Wikipedia didn't present the facts of evolution, and similarly, prudish ignoramuses would rather no one was able to see a picture of a penis. We should not concede an inch to either party. We should be NPOV in all cases except that we stand against censorship and for free access to knowledge by everyone, whether or not their parents, governments or religious leaders are happy with the idea. This filter is a serious backward step for Wikipedia, and the 'referendum' on it is a farce. Trilobite 05:10, 1 September 2011 (UTC)[reply]
Agreed, the US people are not the main people that want this censorship. And indeed, not censoring Wikipedia could be called cultural imperialism. On the other hand, it is equally cultural imperialism if some people who have a patriarchal, sex-negative attitude force us to tag our content, at the risk of enabling a lot of censorship possibilities. Hiding the truth about the human body is fundamentally wrong for an encyclopaedia. We should show the truth about everything. You could use the same argument of culture-sensitivity for non-image content. So we shouldn't even get started on this way. --178.201.100.215 11:14, 30 August 2011 (UTC)[reply]
  1. Censoring Wikipedia on behalf of any group, whether US or not, is cultural imperialism.
  2. My enemies are censors and censorship. Since the "prudish" and "puritanical" Americans – heck, almost the whole world laughed about Mr Gonzales' censoring of Justitia – are helping censorship, yes, they are the spear-tip in advocating censorship, they are my enemy. If America wants to go back to the dark Middle Ages, please proceed as you wish, the next elections are next year, but please let the rest of the world stay in the 21st century. --Matthiasb 21:35, 1 September 2011 (UTC)[reply]
"If America wants" this: You're missing the point. This filter is not being created because Americans want it. It is being created because Asians, Africans, Latinos, and Middle Eastern people want this. If you would please check a map, you will discover that Asia, Africa, Latin America, and the Middle East are not part of the USA. WhatamIdoing 23:49, 1 September 2011 (UTC)[reply]
Have you ever seen any raw data? The only thing I saw was a report by Mr. and Miss Harris, apparently not from the Middle East. I don't believe that this filter is not being created because Americans want it. I don't believe it is being created because Asians, Africans, Latinos, and Middle Eastern people want this. The w:Arab spring shows that people in the Arab world are fed up with censorship. I am sure that this filter is going to be created because neoconservative circles in the US want this. And it's up to the WMF and the board to prove that this is not the case. --Matthiasb 13:35, 3 September 2011 (UTC)[reply]
To set some things straight: while researching the proposed (and rejected) policy for sexual content on Commons, I found some very surprising things.
  • Yes, it is possible for people to inadvertently view child pornography - in fact, there was a huge flap where w:Traci Lords made some sexual videotapes that were widely distributed in the U.S. and elsewhere as mainstream porn, after which people eventually found out she was 16 at the time. Oops. We can't actually rule out that Wikipedia might likewise pick up an image inadvertently, especially since we have no real option but to trust amateur photographers making educational photographs. In addition there are arguments over definition - there was in fact some argument over w:Virgin Killer, where people are still debating whether the album cover is child pornography or not.
  • Yes, Wikipedia does legally provide some things which a person applying the w:Dost test nowadays would probably call child pornography, namely the "Körper des Kindes" (Bodies of Children) photographs on Commons. Though not even the ACLU will argue that child pornography is protected by the w:Miller test, it should be apparent that in fact this is the case here, because the photographer, who was convicted as a pedophile and spent a few months in jail under the lax sexual mores of the Victorian Age, is apparently well regarded historically as a pioneer of the art, and so the photos have historic and artistic significance. While that may sound controversial, the fact is, w:Larry Sanger called the FBI, which was widely reported in the press as investigating Wikipedia at his instigation. And Wikipedia certainly earned no special treatment, e.g. by publicly pooh-poohing the FBI's call for restrictions on using the FBI logo in articles at the time. Nonetheless, no prosecution, not so much as a warning, ever came out of it, and the photos remain on Commons to this day.
  • I don't know if that means that the FBI believes those photos are protected by the Miller decision or if they're just afraid the courts will finally draw the line against censorship altogether, but they're doing the right thing. The fact is, all that is accomplished by the frantic efforts to ban such images, including expansive bans on computer simulated and artistic images, is the state subsidy of a multibillion dollar market in the kidnapping and videotaping of real children. And the public's ability to remain blissfully ignorant of as many as hundreds of thousands of children held in slave conditions as prostitutes in the U.S. alone. While enforcement agents with weird Orwellian agendas prosecute people for "child porn" for participating in file-sharing networks without knowledge of the files traded, and throw people down the stairs for offering open access to Wi-Fi hot spots, and do next to nothing for kids even when they are arrested as prostitutes!
  • Wikipedia is not about people hiding their heads in the sand like the legendary ostrich, but opening their minds to the facts - including facing brutal realities head-on and finding ways to fix them. Wnt 00:30, 2 September 2011 (UTC)[reply]

Should we have discussions in languages other than en?[edit]

Right now, it looks like all the language discussion pages redirect to en. Being hit with a screenful of English, many editors may simply not even try to discuss.

Could we open up discussion pages in other languages and integrate them into the per-language "discuss" advertisement, along with a reminder for en speakers to participate here as well? --AlecMeta 02:15, 24 August 2011 (UTC)[reply]

The discussion pages can be accessed by first going to the pages of the referendum in those languages, and then going to the talk pages. It's very interesting that most of these pages are empty. This includes the ones for some of the languages whose cultural areas have been presented here as among the primary beneficiaries--there seems, for example, to be no discussion in Arabic. My tentative conclusion is that the board has been exercising what can only be called cultural paternalism (along with the more literal paternalism of not wanting young people to see sexual images). Perhaps that accounts for their remarkable assertion that there is an intercultural objective international standard for questionable images. DGG 04:05, 24 August 2011 (UTC)[reply]
If anybody is automatically redirected to the English discussion page, with its already present content, then nobody will assume that discussions in their own language were intended. Since this seems not to be the right place for many users, they decide to discuss inside their own language wikis. A huge problem, since nobody can follow those discussions and feed them back to Meta. I could have written onto the German discussion page on Meta. But it seemed to me that only the English discussion page was meant to be the place for discussions. --Niabot 11:42, 24 August 2011 (UTC)[reply]
All the sheep herd to the same place? ;-) --Kim Bruning 20:02, 24 August 2011 (UTC)[reply]
I suspect we had no idea that we would get this much discussion. Ar, for example, seems to just dump people here if they click on the Arabic word for talk: [13]. How do we say "If you don't speak English, you should talk [[Talk:Image filter referendum/$langcode|here]]" in our most popular languages? --AlecMeta 20:20, 24 August 2011 (UTC)[reply]

Non-English Discussions[edit]

This section requires expansion.

Lag[edit]

The last posts to this page are shown to be from the 21st of August; the history shows that there's more, but it's not shown on the discussion page itself. --94.134.218.22 10:59, 24 August 2011 (UTC)[reply]

Check the subpages, or force the server to refresh its cache on the central page. (transclusion is strange like that). Lemme fix... --Kim Bruning 19:56, 24 August 2011 (UTC)[reply]
Done!--Kim Bruning 20:01, 24 August 2011 (UTC)[reply]
Not done! It's now only up to August 23rd. --94.134.192.160 06:55, 25 August 2011 (UTC)[reply]
And the History page isn't working right either. One can't follow the discussion on the page because it lags, and one can't follow it via the History because that doesn't record everything. Looks like someone is trying very hard to make it impossible to follow the discussion at all. --94.134.215.77 08:32, 26 August 2011 (UTC)[reply]
Works For Me, I see your comment of 26 August just fine. For history: use the history of the subpages. This is all pretty straightforward transclusion; I wouldn't call it a conspiracy on those grounds! ;-)
Also: see the button I put at the top of Talk:Image filter referendum. Hit that button once in a while to ensure everything is in sync. --Kim Bruning 11:57, 26 August 2011 (UTC)[reply]
That button doesn't work at all, and yes, making it difficult to participate in the discussion - including having to follow the histories of umpteen subpages linked somewhere buried on the page - looks like an attempt to obfuscate participation, instead of one page where all new discussion shares the same history and new things are added at the bottom, not sorted into several random, arbitrary categories. --94.134.213.137 09:18, 29 August 2011 (UTC)[reply]
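As an aside on the purge mechanics: the central page is assembled by transclusion, so it only updates when its parser cache is invalidated, which is all the purge button does. A minimal sketch of triggering the same refresh programmatically, assuming the standard MediaWiki action API endpoint on Meta and using Python with the requests library purely for illustration (the page title here is just an example):

import requests

# Ask MediaWiki to discard the cached copy of the transcluding page
# and rebuild it, so newly added subpage comments become visible.
API = "https://meta.wikimedia.org/w/api.php"  # assumed standard endpoint

resp = requests.post(API, data={
    "action": "purge",                         # invalidate the parser cache
    "titles": "Talk:Image filter referendum",  # example page assembled by transclusion
    "format": "json",
})
resp.raise_for_status()
print(resp.json())  # reports which pages were purged

Hitting the purge button in the browser is equivalent; a scripted call like this is only worth the trouble if the cache needs refreshing regularly.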

HELP: Translation sv/Swedish ready BUT CANNOT BE USED - URL needed for Swedish questions[edit]

I am going to set the last Swedish page to "done" now, and then the Swedish voting would be ready, but I do not know the URL for the Swedish voting server. It should be something like the German http://de.wikipedia.org/wiki/Spezial:Sichere_Abstimmung [14] and the French http://fr.wikipedia.org/wiki/Spécial:Vote_sécurisé [15]

--Ronja 12:19, 24 August 2011 (UTC)[reply]

I'll make sure that Andrew sees this. Philippe (WMF) 19:58, 24 August 2011 (UTC)[reply]
Thanks! --Ronja 22:14, 24 August 2011 (UTC)[reply]
Presumably the appropriate redirect from "Special:SecurePoll/vote/230" will be in place? (Not that there's much time left... ) Rich Farmbrough 23:34 29 August 2011 (GMT).

Something to watch[edit]

Philip Pullman: No one has the right to not be shocked... What will the filter do better? --Eingangskontrolle 19:24, 14 September 2011 (UTC)[reply]

Don't correlate 2 unrelated issues to forge a false theory. The filter is proposed because the Foundation respects some users' desire to show sensitive pictures in the name of encyclopedic value. When the Foundation also tries to respect other users who are offended by sensitive pictures, the ones enjoying the freedom to show the pictures have no right to condemn the countermeasure. We're not even asking for a text filter. But would you not be shocked if I glorified Hitler and the Holocaust? -- Sameboat (talk) 16:46, 15 September 2011 (UTC)[reply]
Seriously! I wouldn't be shocked at all. You have the right to think what you want. If it's not against the law, you have the right to publish it. Your argument would imply not surfing the WWW at all, because I might stumble upon a page with exactly this content, which occasionally happens. --Niabot 17:21, 15 September 2011 (UTC)[reply]
In order to protect yourself against unexpected images, everyone can switch off all images. We don't need to respect users who do not want information, because they have chosen the wrong URL when typing wikipedia.org; this site is about knowledge and education. And I understand your statement about Hitler: the logical next step has to be a text filter, in order to avoid written statements which might disturb some users. --Eingangskontrolle 08:14, 16 September 2011 (UTC)[reply]
So it is their fault for opening the de.wp main page when the consequence is far beyond their personal tolerance or acceptance? Wikimedia is not for your self-righteous values. -- Sameboat (talk) 09:08, 16 September 2011 (UTC)[reply]
It interests me that these people live in a country where cities carefully enforce, as a civil right, strict limitations on when the neighbors mow their lawns, because it's very important that no one be subjected to the dull roar of a lawn mower during the little children's naptime, or (last I checked, which was a few years ago) at any time at all on Sunday.
However, the equivalent of allowing lawn mowing at any time and offering earplugs to the neighbors horrifies them, because while it's very important that the neighbors don't disturb them, it's also very important that they can shock these same neighbors with naked pictures at any time, even if the neighbors don't want to see the pictures. WhatamIdoing 17:46, 16 September 2011 (UTC)[reply]
It's amazing how the very idea of this filter has created a giant polemic and brought out extreme reactions and generalisations. On one hand, a user is screaming "DRACONIAN CENSORSHIP" and on the other hand another side is screaming "PROTECT OUR CHILDREN FROM PEDOPHILES WHO SHOCK US WITH CHILD PORN IMAGES", while a small group of people recognise that the filter would be a feature that challenges the current philosophy behind the wikipedia project of "consensus" and "openness". It will be a long and complicated debate. --Shabidoo 13:06, 19 September 2011 (UTC)[reply]
I barely see anyone here actually screaming to protect the children with the image filter (which is otherwise a censorship tool enforced by the parents). We don't use children as our meatshield when defending our standpoint. Neither does the WMF. -- Sameboat (talk) 09:08, 20 September 2011 (UTC)[reply]