Talk:Image filter referendum/en/Usage


Usage of the filter

Proposal: Filter is enabled by default

I am proposing that, by default, the image filter be enabled (all potentially offensive images are blocked). There are several reasons why this is preferable:

  • It is in keeping with the "principle of least surprise/least astonishment". Users should not have to take action to avoid seeing images which may offend them.
  • Some of Wikipedia's critics have long complained about offensive images. Unless those images are blocked by default, the criticism will continue. Having potentially offensive images blocked by default will silence that criticism.
  • Schools and some libraries are likely to want potentially offensive images blocked by default.
  • Many institutions with shared computers do not save cookies for security reasons, so any setting would be gone the next time the browser is started. This poses a problem for institutions which require potentially offensive images to be blocked, unless the filter is on by default.
  • Wikipedia vandals often place offensive images in templates or articles. While users viewing en:Fisting may not be surprised to see a photograph of a man being anally fisted, they would not expect to see it on an unrelated article, which is what happens with this type of vandalism.
  • The misuse of images (as opposed to the deliberate abuse of images by vandals), coupled with the potential for users, especially younger users, to be unaware of alternate meanings of terms, means that users may be surprised by images that they did not expect to see. Blocking offensive images by default mitigates this possibility.

Therefore, I propose that implementation of image filters be done in such a way as to enable them by default. Unregistered users could turn the filter off with the dialog shown in the mock-ups. Registered users would be able to change it via a preference that they set once. Note that I am not proposing any change to which images are filtered or how they are classified - those remain issues for the WMF to work out. When I refer to "potentially offensive images" I simply mean images which would be filtered if a user had chosen to use the filter (there is no difference in what is filtered between the opt-in and opt-out implementations). Delicious carbuncle 20:52, 16 August 2011 (UTC)[reply]

  • Support - as proposer. Delicious carbuncle 20:52, 16 August 2011 (UTC)[reply]
  • Support - flickr is probably the largest source of free porn that is hoovered up and stored here, and by using simple opt-in filters they still manage to keep the place relatively porn-free for those that don't want to encounter it in specific circumstances. John lilburne 21:20, 16 August 2011 (UTC)[reply]
  • Oppose - My understanding of the original "opt-in" filtering proposal was that, under that proposal, we would tag images as belonging to various (relatively value-neutral) categories (e.g. nudity, spiders, firearms, explosives, trees), provide each viewer with a filter that doesn't block anything by default, and allow the viewer to choose which categories to block. I'm not entirely happy with that suggestion, but not strongly opposed as of yet. In contrast, Delicious carbuncle's proposal is to have certain "potentially offensive images" blocked by default, while giving viewers the option of un-blocking those images. Unlike the opt-in proposal, this one would require us to make a definite value judgment (i.e. regarding which kinds of images are "potentially offensive" to a reasonable person); there seems to be no way of drawing the line here, unless we're going to have the filter block by default all images that any identifiable demographic finds offensive. Sorry, but I don't think it's feasible to expect people from all over the world to come to an agreement on that issue. --Phatius McBluff 21:33, 16 August 2011 (UTC)[reply]
    • Comment - However, if we do end up having some categories filtered by default, then we had better allow people the option of adding additional categories to their filters. That's only fair, given the extreme subjectiveness of what counts as "potentially offensive". --Phatius McBluff 21:33, 16 August 2011 (UTC)[reply]
      • To be clear, what I am proposing is that any image which would be filtered under the original opt-in scheme would simply be filtered by default under my opt-out variation. Whatever system and schema are used to classify images for the original proposal would also be used in my variation. By "potentially offensive" I am referring to any image tagged as belonging to any of the categories shown in the mock-ups. Nothing is changed except that the filter is on by default. Delicious carbuncle 22:32, 16 August 2011 (UTC)[reply]
  • Strong Oppose for all reasons given by me in the sections above this poll. --Niabot 21:48, 16 August 2011 (UTC)[reply]
  • I can't think of a worse way of using Wikimedia resources than to help libraries keep information from their users. Also, it is impossible to implement in a culturally neutral way... Kusma 22:31, 16 August 2011 (UTC)[reply]
    • Again, there is no difference in implementing classification under my proposed variation, simply that filtering is on by default. If your issue is with the idea of a filter or with the difficulties of classifying images, please do not comment here. Delicious carbuncle 22:35, 16 August 2011 (UTC)[reply]
"All potentially offensive images" would include all images of people for a start, and I suspect all mechanically reproduced images (photographs etc.), and quite possibly all images. Rich Farmbrough 22:47 16 August 2011 (GMT).
OK, I stand corrected. There are two culturally neutral ways to filter images: all or none. Kusma 22:48, 16 August 2011 (UTC)[reply]
Once again, I am not proposing any changes to which images are filtered or how they are classified, only that the filters are on by default. If you have concerns about how images will be classified or who will do it, take them up with the WMF. If you have an opinion about whether the filters (which will be implemented) are on or off by default, feel free to comment here. Otherwise, your participation in this discussion is entirely unhelpful. Delicious carbuncle 23:03, 16 August 2011 (UTC)[reply]
I strongly suggest that they be disabled by default, or they will never be implemented. There is one big, open and time-consuming question: Which image is offensive to whom? "Personally I hate flowers and numbers, please block them all in the default view, because I often change my workplace..." --Niabot 23:36, 16 August 2011 (UTC)[reply]
Let me then rephrase. Oppose because, in the event that filters are implemented, they should, and undoubtedly would, include a filter for the human form ("it is feared by many Muslims that the depiction of the human form is idolatry"), for all human and animal forms - certain hadiths ban all pictures of people or animals - and for all representations of real things ("You shall not make for yourself a graven image, or any likeness of anything that is in heaven above, or that is in the earth beneath, or that is in the water under the earth"). Rich Farmbrough 23:31 16 August 2011 (GMT).
What a load of crap historically, and currently. Damn I've not seen such ignorance outside of BNP and EDL lies. John lilburne 06:55, 17 August 2011 (UTC)[reply]
I guess WP:CIVIL doesn't apply on Meta? You may wish to go and purge this "crap and ignorance" from en:Wikipedia - and possibly stop off for a chat with the Taliban and advise them of the errors of their ways in banning photography. Or perhaps you could credit people with the knowledge that Persian illustrations such as "Why is that Sufi in the Haman" contain figures which were permissible where and when they were created, but would not be by other groups of Muslims at other times in other places. Since the prohibition is derived from hadiths rather than the Qur'an, it is certain Sunni sects that have observed it, while other groups have not. To generalize from a couple of specific localised examples to a faith of a billion people over 14 centuries is an absurd leap. Rich Farmbrough 23:54 17 August 2011 (GMT).
You wouldn't be trying to censor words here now, would you? Back on point: every Muslim country has TV with images of people and animals, all have newspapers containing images of people and animals, all have billboards with local politicians, street demonstrations have people holding aloft images of religious leaders, or dead martyrs. Walk around in a Muslim tourist spot and the women in niqab and burka are taking photographs of their family. "Ban all pictures of people or animals" - crap. John lilburne 00:12, 18 August 2011 (UTC)[reply]
Just because every country (currently) does or has something, it does not follow that there are no communities that dislike, condemn or prohibit it. Rich Farmbrough 12:06 18 August 2011 (GMT).
There is no significant number of people that are offended by general images of people and animals. If such people exist then they must have a very unhappy time wherever they live, as there is hardly anywhere where there are no images of people or animals, and if they are truly offended by such images they won't be browsing the internet. So excuse me if I don't dance off into la-la land with you. John lilburne 14:06, 18 August 2011 (UTC)[reply]
Interesting that you feel the need to be rude in every comment you make. However it might still be worth your while to read, for example, en:Islamic aniconism, to consider that there are other ways of delivering the projects than the Internet (some of which WMF representatives have specifically discussed in relation to similar minority religious groups), to further consider that it is possible to use the Internet with all images suppressed, and one positive feature is that this might allow access to a reduced subset of WP where nothing was available before. Alternatively you can continue to deny the existence of such groups, or claim that since they are small they don't matter, and that I am a loony to consider that "all potentially offensive images are blocked" is a very broad, indeed overly broad, proposal - but you'd be wrong. Rich Farmbrough 00:13 19 August 2011 (GMT).
  • Oppose: If you implement the filter, make it off by default; otherwise it would further marginalize images seen as controversial by those in power, something to which I'm opposed on anarchist grounds. Cogiati 01:37, 17 August 2011 (UTC)[reply]
How is that an Anarchist perspective! As an Anarchist myself I don't think ANYONE should be dictating what I MUST see any more than someone should dictate what I MUST NOT see. The solution that allows the most freedom of choice is to allow me to decide whether to view the image or not. John lilburne 11:46, 17 August 2011 (UTC)[reply]
It is your choice to filter some categories or not. But it is not your choice which image belongs to which category. Some might still offend you, and some might stay hidden even if you had no objection to viewing them. --Niabot 18:35, 18 August 2011 (UTC)[reply]
  • Oppose – Forcing users to opt out is far more intrusive than granting users the ability to opt-in. --Michaeldsuarez 02:56, 17 August 2011 (UTC)[reply]
  • Oppose. Delicious carbuncle: You seem to be under the impression that the idea is to compile a finite list of "potentially offensive" subject areas and offer the option of filtering them. This is incorrect. An image "offensive" or "controversial" to one culture/individual is not necessarily viewed as such by another. "All potentially offensive images" = "all images". The "categories shown in the mock-ups" are merely examples.
    For the record, I strongly oppose the plan on philosophical grounds. Your suggested variant, in addition to being even more philosophically objectionable, isn't even viable from a technical standpoint. —David Levy 03:45, 17 August 2011 (UTC)[reply]
    You appear not to have read what I wrote. A filtering system is being implemented. I am simply proposing that the filters are on by default. Any issues with deciding what to filter exist in the current implementation. Delicious carbuncle 10:59, 17 August 2011 (UTC)[reply]
    You appear not to have read what I wrote (or what Phatius McBluff wrote). The proposed system would not include a single set of filters applicable to all cultures (an impossibility) or to one particular culture (a non-neutral, discriminatory practice). It would, at a bare minimum, encompass numerous categories widely considered controversial/offensive in some cultures and perfectly innocuous in others.
    "The filters are on by default" would mean that everything categorized within the system would be filtered for everyone by default. Some religious adherents object to images of women without veils or women in general, so all images of women would be blocked by default. That's merely a single example.
    For the plan to even conceivably succeed, "deciding what to filter" must be left to the end-user. (Even then, there are significant issues regarding how to determine the options provided.) Your suggestion relies upon the existence of a universal filter set, which is neither planned nor remotely feasible. —David Levy 18:54, 17 August 2011 (UTC)[reply]
  • Oppose. Just because you're playing on a slippery slope over shark-infested waters doesn't mean you have to dive off it the first minute. Look, I'm not completely ignorant of the religion of the surveillance state; the concept is that the human soul is a plastic card issued by a state office, and all of the rights of man emanate from it. For the unidentified reader, who doesn't run Javascript or cookies, to be able to access specialized information (or eventually, any information), let alone post, is a Wikipedia blasphemy against this faith no less extreme than the Muhammad pictures are against Islam. And so first you feel the need to make the image hiding opt out by default; then limit it to the registered user; and eventually only such registered users as are certified by the state, because your god commands it. After that it would be those specialized articles about chemistry, and so on. But Wikipedia has been defying this god since its inception, so I have high hopes it will continue to do so. Wnt 05:45, 17 August 2011 (UTC)[reply]
  • Support, as a parent in particular. Currently I have all of the WMF's websites blocked on my daughter's browser, which is a shame because she loves learning and wikipedia would be a good resource for her if it weren't for problematic images that are generally only a few clicks away. --SB_Johnny talk 12:30, 17 August 2011 (UTC)[reply]
I'm disturbed to read your vote, because if you can block WMF websites on your child's computer you must be running some sort of "nannyware", but I would expect any remotely professional censorship company to be able to block out the sort of images being discussed here while allowing through the bulk of Wikipedia material. What do they do in schools which are required to run such software? Wnt 14:41, 17 August 2011 (UTC)[reply]
Why would you be "disturbed" by a parent using nanny software? Or (say) a preK-8 school, for that matter? --SB_Johnny talk 16:39, 17 August 2011 (UTC)[reply]
There is no software that will reliably detect inappropriate images (whatever the criteria for inappropriate are). Thus it is simpler and less error-prone to block the entire site. John lilburne 15:03, 17 August 2011 (UTC)[reply]
Yup. I seriously doubt the sites will come off the red flag lists until at least some effort is made to change the defaults. OTOH, from other comments on this page about "cultural neutrality" and "radical anti-censorship", the categorization and tagging efforts might fail anyway. --SB_Johnny talk 16:39, 17 August 2011 (UTC)[reply]
  • Strong oppose Oops, I think we slipped. Our boots are losing traction. Internoob (Wikt. | Talk | Cont.) 17:33, 17 August 2011 (UTC)[reply]
  • Oppose, for reasons outlined in previous comments on this talk page. If I oppose opt-in filtering, it's likely of little surprise that I oppose opt-out filtering. If we must have image filtering, it should never be opt-out.
  • Support. No point in doing it any other way. Why would a reader who is offended by an image create an account just to hide it in the future? They'll simply leave. So much for disseminating all of human knowledge (like the WMF actually does that anyway). – Adrignola talk 00:06, 18 August 2011 (UTC)[reply]
    Do you advocate that all images (or at least all images known to offend large numbers of people) be suppressed by default? If not, where should we draw the line? At images that seem "offensive" to you? To me? To whom? —David Levy 01:16, 18 August 2011 (UTC)[reply]
    Flickr, Picasa, Facebook, ipernity and YouTube have vast amounts of content that is considered acceptable for their international audience; flickr has a huge amount of sexually explicit content that is filtered too. Some of the most prolific uploaders of sexually explicit content to Commons are sourcing the content from flickr. Flickr's rules of acceptability are crowd-sourced. Why do you think that WMF sites would be unable to distinguish acceptable from unacceptable in relation to the same audience? John lilburne 06:43, 18 August 2011 (UTC)[reply]
    Flickr is a commercial endeavor whose decisions are based on profitability, not an obligation to maintain neutrality (a core element of the nonprofit Wikimedia Foundation's mission).
    Flickr (or any similar service) can simply cater to the revenue-driving majorities (with geographic segregation, if need be) and ignore minorities whose beliefs fall outside the "mainstream" for a given country. The Wikimedia Foundation mustn't do that. —David Levy 07:33, 18 August 2011 (UTC)[reply]
    And yet flickr has far more controversial images than Commons, is the place where Commons sources much of its material, and where the minority ero community and majority non-ero communities coexist side-by-side. Where ero-content producers often express support for a filtering system where they can browse the site with their children, or work colleagues and not encounter age inappropriate or NSFW images. John lilburne 17:29, 18 August 2011 (UTC)[reply]
    You're defining "controversial images" based upon the standards prevalent in your culture. In other cultures, for example, images of women without veils or women in general are considered highly objectionable. A website like Flickr can make a business decision to ignore minority cultures (on a country-by-country basis, if need be), thereby maximizing revenues. ("Only x% of people in that country object to this type of image, so it needn't be filtered there.") Such an approach is incompatible with the Wikimedia Foundation's fundamental principles.
    Incidentally, Flickr hosts hundreds of times as many media files as Commons, primarily because most fall outside the latter's scope. —David Levy 18:19, 18 August 2011 (UTC)[reply]
    Your position is inconsistent. If it is true that unfiltered images of women without veils on WMF sites are objectionable, then WMF sites are no less objectionable with unfiltered images of a woman's vagina. Flickr, Google, and others are in the business of getting the maximum number of viewers on their sites. They do so by having a click-through before revealing images that may be objectionable. Logged-in users that have expressed a preference to see certain types of image are not bothered once their preference has been set. John lilburne 19:16, 18 August 2011 (UTC)[reply]
    No, my position is not inconsistent. My position is that the Wikimedia Foundation should filter no images by default, irrespective of who they do or don't offend. (I also oppose the introduction of opt-in filtering, but that's a somewhat separate issue.)
    It's true that someone offended by an image of an unveiled woman also would object to an image of a woman's vulva, and your argument seems to be that there therefore is no harm in filtering the latter. You're missing my point, which is that it isn't the Wikimedia Foundation's place to decide where to draw the line.
    In some cultures, a photograph of a fully nude woman is regarded as perfectly acceptable. In other cultures, any photograph of a woman (even if clothed in accordance with religious custom) is considered problematic. You want the Wikimedia Foundation to filter its content in a manner precisely conforming with your culture's position in the spectrum, thereby abandoning its neutrality by deeming the former cultures excessively lax and the latter cultures excessively strict. (And as noted above, this is merely a single example.)
    Flickr, Google and others are in the business of making money. For them, value judgements catering to majorities at the expense of minorities improve the bottom line. For the Wikimedia Foundation, they would violate a core principle. —David Levy 23:28, 18 August 2011 (UTC)[reply]
  • Oppose. This would block what search engines see, and that would prevent those who do not want filtering from using search engines that analyse image content to look for images containing certain items. It's key to the general proposal that the filtering should not affect those who do not want filtering, and on-by-default filtering would do exactly that. Jamesday 03:25, 18 August 2011 (UTC)[reply]
  • I find it kinda difficult to take this proposal at face value. If someone wants to pre-emptively block all images, they can e.g. use AdBlock Plus to filter all common image file formats, or they can use this userscript (works like a charm for me; a rough sketch of what such a script does appears after this list). For general image blocking, no separate filter system is required, and therefore implementing a universal default image block provides no benefits but a lot of drawbacks for the majority of people who want to see all or most images by default. --213.168.119.238 17:43, 18 August 2011 (UTC)[reply]
  • Mmmmm, oppose. People ought to be able to look at pictures first to see what they don't want to look at, and then be able to block them. Rickyrab 18:34, 19 August 2011 (UTC)[reply]
  • Oppose – No thanks, most people would like to customize the image filter for themselves. We shouldn't be forcing users to have to opt-out of the image filter; they should make the final choice to "opt-in" to such filtering. Parents can "opt-in" their children, librarians can "opt-in" library computers, etc. Give the readers (and/or the editors) the final decision whether or not to use the image filter; don't enable it by default. mc10 (t/c) 22:53, 20 August 2011 (UTC)[reply]
  • Oppose It is bad enough that the "optional" feature, as described, violates the NPOV pillar. More than bad enough. But hiding by default, as this proposal suggests, *all* so-called offensive categories - from nudity to any image of the human body to any particular "sacred" element of any religion to images of violence to images of shrimp? I find this suggestion enormously troubling. --Joe Decker 02:38, 22 August 2011 (UTC)[reply]
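For readers wondering what the client-side blocking mentioned by the IP commenter above looks like in practice, the following is a minimal, hypothetical sketch of a user script (for Greasemonkey-style browser extensions) that hides every image on a page. It is not the particular script linked in that comment; the metadata block and the blanket img selector are assumptions made purely for illustration.

  // ==UserScript==
  // @name   Hide all images (illustrative sketch only)
  // @match  *://*/*
  // ==/UserScript==
  // Hypothetical example: once the page has loaded, hide every <img> element.
  window.addEventListener('load', function () {
    var images = document.querySelectorAll('img');
    for (var i = 0; i < images.length; i++) {
      images[i].style.display = 'none'; // removes the image from view entirely
    }
  });

A comparable effect is presumably what the AdBlock Plus suggestion has in mind: custom filter rules matching common image extensions such as .jpg, .png and .gif. Neither approach requires any server-side filter system.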

Proposal to at least put this in as one of the voting options

Currently the poll/referendum/whatev doesn't even ask the question (thus "censoring" the poll) ;-). Since it's already mentioned that people can change their votes anyway, is there any reason it can't be added as an option on the poll/referendum/whatev? --SB_Johnny talk 21:04, 17 August 2011 (UTC)[reply]

Considering the rather overwhelming opposition to the opt-out proposal (above), I wouldn't think it necessary to put the question in the poll. 92.18.208.159 16:46, 21 August 2011 (UTC)[reply]

Circumvention of opt-out by institutional administrators is possible, locking students and employees into a censored world

Or maybe it should be open-ended, so that whatever has a category can be arbitrarily black-listed and blocked - purple people can block images of striped people, Jennifer Aniston can block images of Brad, and I don't ever have to face the horror of gherkins again. Sure, people will want blinkers, but Wikipedia has traditionally had a policy not to provide them - from day 1, we've been all about enlightenment without political barriers. And Christian school admins can set a permanent non-removable cookie on all kiosk machines to block sex-ed and evolution in their classrooms (and also block the image filter config pages, so the "hiding be reversible: readers should be supported if they decide to change their minds" option can definitely be circumvented! And don't say JavaScript, because they can disable that, too). I don't think people should have a "sort-of-Wikipedia". It dilutes our mission. And just because Google caved in to China's demands at one point (and was rightly castigated for it), doesn't mean we have to. Samsara 22:02, 19 August 2011 (UTC)[reply]

It seems like allowing people to hide things from themselves vs. allowing people to hide things from others are two different issues, yet this proposal doesn't make a clear distinction between them. The list of voting questions includes "that hiding be reversible: readers should be supported if they decide to change their minds." Well of course, why on earth would you have a preference that once you activate it, can't be shut off? What it's really trying to say is "should a parent/school administrator be able to set it up so that a child can't unhide images?" or "should one person be able to hide images from someone else, out of that person's control?" Evil saltine 23:03, 19 August 2011 (UTC)[reply]
I would be really concerned if I thought what we were building would seriously allow one human being to stop another human being from accessing information as they choose to. But nothing we build is going to be useful to that effort. Repressive regimes and repressive authorities already have plenty of software on their own. This is really just about the basic need for readers to, e.g., 'not see a gross autopsy picture if they really don't want to'. --AlecMeta 00:10, 20 August 2011 (UTC)[reply]
To follow up-- if somebody can control my browser's behavior, can't they already prevent me from seeing any or all WM content anyway? Would our building this actually help China censor its citizens in a tangible way, or can they do that already without our help? --AlecMeta 02:46, 20 August 2011 (UTC)[reply]
Yes, it would help, because it does the categorization and promulgates it for them. And it also, apart from the technical side of things, creates a chilling effect, because someone seen to have the "Chinese Orthodox" filter turned off might be subject to various forms of retribution. Moreover it might be possible to log filter settings. Rich Farmbrough 03:16 20 August 2011 (GMT).
Every organisation and education institution uses a content-filtering-enabled proxy server anyway, and nearly all organisations just block sites they don't support the views of or that are off topic. This feature won't make that situation any worse and might make it somewhat better. Keeping in mind we are just talking about images here, and I couldn't give a rat's arse if I can't see images of dicks or Mohammad - like, honestly, keep it in perspective. If you think you aren't already censored WAY MORE than what this is doing, you're deluded. Promethean 07:11, 20 August 2011 (UTC)[reply]
None of what you brought forward makes any sense, Promethean. It's been pointed out that if we build categories designed for filtering, this will help third parties in the restrictions they may wish to build for users whose machines they control. Secondly, just because I'm exposed to the potential of traffic accidents on my way to work doesn't mean I have to go swimming with saltwater crocodiles. Any level of censorship is bad, and just because we already have some, doesn't mean we should introduce even more. Apply that argument to nuclear weapons, and I think you'll see how massively wrong you've been in your arguments. Samsara 03:38, 23 August 2011 (UTC)[reply]

Legal issues and WMF responsibility

Liability

[Image: Babar.svg - caption: "Elephant in the throne room, yesterday"]

The risk here is that we create liability. Currently you get whatever images are on the page; you take a risk that "any reasonable person" would apprehend - that on a page you might see either A. an apposite but offensive picture, B. an erroneous and offensive picture, or C. a vandalistic and offensive picture. Once we have a filter in place, the user with religious views, a sensitive stomach, PTSD, or an expensive lawyer on retainer may come across images that they believe (rightly or wrongly) the filter should hide. We have in effect given an undertaking to filter, which we are not able to deliver on, for a number of reasons possibly including:

[Image caption: Is this a horror image?]
  1. Interpretations of the filter
  2. Error
  3. Backlog
  4. Pure vandalism
  5. Anti-censorship tag-warriors
  6. Perception of the image

And that doesn't touch on the consequences if we hide something we should not.

Rich Farmbrough 23:04 16 August 2011 (GMT).

And yet flickr have filters and aren't sued by expensive lawyers on retainer when porn images leak past the filters. Why is that? John lilburne 11:56, 17 August 2011 (UTC)[reply]
My understanding is that flickr terminate user accounts when users fail to apply the filters to their own images correctly, which also removes all their images. How would you implement that here? Samsara 22:13, 19 August 2011 (UTC)[reply]
Have you heard of the concept of a disclaimer? 68.126.60.76 13:48, 21 August 2011 (UTC)[reply]
Indeed, and the projects have always resisted adding more disclaimers. Firstly, issuing disclaimers does not necessarily absolve us from responsibility; secondly, it can increase responsibility in other areas; thirdly, it establishes a position we may not wish to be identified with. But certainly, if the filter(s) were implemented, disclaimers galore would be required. Rich Farmbrough 18:37 21 August 2011 (GMT).
Yes, a disclaimer would absolve us from responsibility. There is no responsibility in this situation. If there were, Wikipedia would already censor all images with nudity or violence. Do you think people regularly sue Google over every instance where SafeSearch fails to block a pornographic image? You can't search anything without accidentally finding porn. I can't even find a disclaimer from Google, other than a note in their help page that "No filter is perfect, and sometimes adult content shows up even with SafeSearch on".
There is no need for "disclaimers galore"; this is utter nonsense. We probably wouldn't need more than one sentence. "Wikipedia is not censored" is hardly less of a disclaimer than "Wikipedia may be censored at your discretion, but the censorship might not always be perfect". 68.126.60.76 11:09, 22 August 2011 (UTC)[reply]
Rich is right: merely posting a disclaimer does not absolve you from existing responsibilities. If there is no responsibility in this situation, then a disclaimer is completely unnecessary. On the other hand, if there is a responsibility, then a disclaimer is not a guaranteed get-out-of-jail-free card. To give an extreme example, posting a disclaimer that says "BTW, we hereby disclaim any responsibility for the spambot malware that we're deliberately infecting your computer with" will not prevent you from going to jail for breaking the laws about malware and spamming. WhatamIdoing 17:28, 25 August 2011 (UTC)[reply]
Good point. Regardless, I stand by my assertion that if Google can offer an imperfect image content filter without being sued, so can Wikipedia - notwithstanding that Google clearly has more money than Wikipedia and makes a better target for a frivolous case. 68.126.63.156 02:36, 30 August 2011 (UTC)[reply]

If the filter is off by default, the Foundation would probably not lose the "safe harbor" legal protections afforded to ISPs that do not censor their content. It might be possible, without risking those protections, to ask new users and IPs about the filter when a controversial image is about to be shown. In Islamic countries where we've been blocked, we might even have the defaults set first, although since I'm not an ISP lawyer I have no idea what that would do to the safe harbor provisions. 76.254.20.205 16:38, 24 August 2011 (UTC)[reply]