Talk:Image filter referendum/Archives/2011-08-24

From Meta, a Wikimedia project coordination wiki
Warning! Please do not post any new comments on this page. This is a discussion archive first created on 24 August 2011, although the comments contained were likely posted before and after this date. See current discussion or the archives index.

Controversy in selection of categories, classification

I put a free-text response on the voting page, and I am also sharing my implementation concerns here, since I don't know where else to address them (other than on the ballot). I understand why this feature was proposed, though I don't approve of it; from the discussion it seems that this feature is unavoidable. I apologize for the length, but as I hold Wikipedia and free (to reuse) culture very dear, I took some time to write what I had to say.

Most of the sites I visit that accept user-generated creative contributions - en: Flickr, en: LiveJournal, en: DeviantArt, en: FanFiction.Net, for example - rely on the uploading user to properly classify their content; some have penalties if the contributing users don't do this correctly. This is a single setting, either a yes/no or an age scale, which indicates that the content may be unsuitable for younger audiences or contains graphic content.

The use of categories to label content presents challenges to impartial and accurate labeling of existing content. Aside from the difficulty of manually reviewing millions of images, this process asks humans (with their opinions and biases) to make subjective judgments across multiple categories. It also invites individuals and groups to request categories for their own purposes, creating a bureaucratic nightmare.

These problems, fortunately, can be mitigated by judicious application of clear and sensible written policies outlining exactly how any image-hiding feature is to be used. These policies should not be based merely on the "average person" (to quote the en:Miller test) or pure intent, but should take the context of the image into account as well. Any resulting policies should reflect Wikimedia community consensus, with adequate community input.

We should not forget Wikimedia's educational mission. I've seen enough drama to know how groups use the Internet as a venue to jockey for social and political influence - just as they do in the real world - through controversy-seeking, special accommodations, special rules, and efforts to assert and maintain power over societal norms, policies and beliefs. We don't need to set up another soapbox.

My response to the ballot, heavily edited for space, emphasis added:

First, I would like to express my concern that implementing this feature will turn Wiki editors and/or Foundation staff into the arbiters of decency, with controversy over (selection of) categories. I understand the purpose for the feature: just as search engines have ("SafeSearch"). I encourage the adoption of policies that guide classification (...) into narrowly-defined, conservatively-applied definitions of content that is generally applicable to a wide range of audiences.



The filter, if implemented, should be crafted with the intent of providing an enjoyable experience for Wiki audiences, in keeping with Wikimedia's educational endeavor, and not in response to controversy. (...) if the feature generates "image flagging wars", it has failed to serve its original intended purpose.

Regarding the question that "the feature be culturally neutral (...)." It is important that the Wikimedia Foundation not censor to the lowest common denominator in an attempt to appease every individual or group that may be offended. For example, epileptics may ask for a feature to hide flashing GIFs. Some readers cannot stand the sight of blood, even in an educational context. Television viewers might want to hide spoilers. (Certain religious) adherents object to (...) certain images and text. Adding numerous categories not only makes the filter difficult to understand and use, but lends itself to exploitation by interests that wish to abuse it as a venue for expressing opinions on what image subjects should be deemed controversial (...) which could include "immodest" uncovered women, illustrations of "God", time travel (illegal in China), even photographs of poop.

(...) perhaps we should look to the "viewer discretion" notices on television programming as an example of how to approach it. (...) With regards to implementation, I suggest that the description (in Image Filter settings) be phrased as "Hide" and "Show", instead of "Allowed" and "Blocked", to reflect that the image hiding feature is not a substitute for parental controls.

Books in the public library never came with flaps over pictures to hide them from readers that might be offended (...) If it's encyclopedic, why hide it? If it's not educational, why is it in Wikipedia?

Soapbox

  1. The proposal is another level of administration that will require more policing and generate more disputes over categorisation.
  2. If utilised it is effectively censorship. Wikipedia is supposed to be a reference work, ipso facto the proposal would be morally identical to censoring the dictionary; quite literally in the case of Wiktionary.
  3. This would be far too easy to extend to the censorship of pages or words in addition to images. Are we going to end up with a list of articles that don't get shown to people who have the "violence" box ticked?
  4. If this is meant for parents / guardians of children to filter what they can view then it's pointless. Children do not need to go to Wikipedia to obtain sexual imagery.
  5. The use of the word report in the fourth question is inflammatory and were the proposal to go ahead should not be utilised elsewhere. Do you want people reporting the dictionary?
  6. Each censored picture would be reliant on others to categorise it; what happens when a picture of a bridge is categorised as "violent"? Does that bridge then get removed from view until someone realises that this has occurred?
  7. Violence, and sex, happen. Often. By allowing people to remove subjects from view we are aiding a disassociation from occurrences that everyone should be aware of.

Vote! Kae1is 12:48, 20 August 2011 (UTC)

A waste of server resources paid for with donated funds?

Wikipedia has only been able to scale as far as it has by placing servers behind Squid or Varnish caches so that a page is not re-rendered (a processor-intensive step) every time it is requested for reading, in normal view, by anyone other than a logged-in user. The last rendered version is stored on a separate front-end "reverse proxy" (I believe these are Squid on WMF projects) and simply retrieved in its already-prepared form from the hard-disc cache on that server - a relatively quick and inexpensive operation.

Generate a multiplicity of bowdlerised versions of the same page, and the version with that "peace be unto him" guy's photo censored can't be served in response to an anon-IP request for the same page but with those pesky vaginae with their sharp teeth and tedious monologues excised in much the same way images of the human breast must be excised to avoid traumatising innocent babies. That means a greater number of these requests, instead of cache hits, become cache misses which require the MediaWiki origin servers to re-render the entire page in some censored, butchered form. This isn't just a violation of established (WP:CENSOR) policy (much like the one-sided e-mails are something beyond just a WP:CANVASS violation) but something which will actually increase server load, requiring a greater number of Apaches be deployed to endlessly render and re-render multiple butchered versions of the same page. If those who were pushing for this were paying from their own pockets to run the site, that would be one thing, but WP servers are funded through donations and WP editors are regularly solicited for contributions. This being the case, more servers just to deal with an artificially-created page-render load caused by an ill-advised censorship scheme is not a worthy use of donated money and not something to which I would wish to contribute. --66.102.83.10 22:37, 20 August 2011 (UTC)
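The cache-fragmentation argument above can be illustrated with a toy sketch (all names hypothetical; real WMF caches key on URL and a few request headers, not on filter preferences): if reader filter preferences become part of the cache key, every independently toggleable category doubles the number of rendered page variants the cache must hold, so hit rates fall and origin servers re-render far more often.

```python
from itertools import combinations

def cache_key(url, hidden_categories):
    """Toy cache key: the cache must hold one rendered variant of a page
    per distinct combination of hidden filter categories."""
    return (url, tuple(sorted(hidden_categories)))

def variant_count(num_categories):
    """With N independently toggleable filter categories, every subset of
    categories yields a distinct cache key, so the cache may need up to
    2**N rendered copies of the same page instead of one."""
    cats = ["cat%d" % i for i in range(num_categories)]
    keys = {cache_key("/wiki/Example", combo)
            for r in range(num_categories + 1)
            for combo in combinations(cats, r)}
    return len(keys)
```

With the 5-10 categories proposed on the ballot, that is 32 to 1024 potential variants of every page, which is the commenter's point about cache misses.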

Effort better spent on improving articles?

I don't know why we have such flurries of activism for implementing new features when the basic quality of our articles is still so low, AND we're losing contributors. Yet the changes being proposed recently don't address the problems that exist in the community; we're merely pandering to the readers, and locking more articles from editing than ever before. I also see many contributors entering into a sort of excuse culture, where typical claims are "I can't write articles (but I want to be an admin)" (can't drive a ship but want to be captain? sure, here's your badge *pins*) and "I've no expertise in the field". If you can write a comment, you can write an article, and if you don't have expertise, well, neither did anybody else before they immersed themselves in their current field of expertise. So neither of those are valid excuses, and I suspect this goes for most others that have been tried. Samsara 22:02, 19 August 2011 (UTC)

Avoiding people being offended by certain content enables them to contribute without worrying about what they might come across. So this proposal is furthering the goal of keeping contributors onsite for longer. Promethean 07:07, 20 August 2011 (UTC)

-

On the other hand, we have hundreds of active authors who now spend their time categorizing images, discussing the pros and cons, or fighting edit wars to please the admins. Just imagine how many articles could be improved or written in the meantime. --Niabot 16:32, 20 August 2011 (UTC)
Define "offended". Speaking honestly, I find specific categories of religious images offensive. During the Middle Ages, it was considered a sacrilege to use living babies as models for the infant Jesus, so in many paintings of the Madonna & Child a dead baby -- either stillborn or born deformed -- was used. This is why the infant Jesus looks bizarre in those medieval paintings. And then there is the issue of paintings of the Crucifixion, many of which are as grisly as an auto wreck; I wouldn't be surprised if medieval people with a BDSM fetish expressed it through representations of martyrdoms. If this measure goes through, I will make an effort to get those offensive images filtered out by default. -- Llywrch 21:58, 20 August 2011 (UTC)

My old Schoolbooks

Hey, I've just taken a look at my old schoolbooks. I still have them: my old history book and my old biology book. The history book is a collection of violent scenes from all eras - the horrors of WW1 and WW2, the Holocaust, slavery, and so on. My biology book can't be called pornographic, but you can find explicit representations of the human body in it. Now I'm asking you, Sue and the Board: do you think Wikipedia should be less educational than my old schoolbooks? Should Wikipedia not follow the same en:Didactic method that teachers expect of their students? As a student you can't look the other way if you really want to learn something. It's also a question of really understanding a topic. But it seems the Board of the WMF knows better than all the pedagogues and teachers of the world.

Or is this nonsense a question of avoiding trouble with conservative powers in the USA? Hey, if you don't want to learn something, you don't have to! All the authors of Wikipedia spend their time collecting the knowledge of the world. But the Board is not interested in all these questions of what the sense of this work is. I think they don't care. -- WSC ® 10:51, 20 August 2011 (UTC)

Process

Read the statements

{{editprotected}} The phrase "Read the statements" in the How to vote section should probably be linked to the section What will be asked?, where they appear (I assume those are the "statements" being referred to). - dcljr 16:05, 19 August 2011 (UTC)

Ah, right. Yes, I agree. I spent ages trying to read the statements without actually finding them. In fact, I assumed it meant a series of pro and con statements regarding the proposal rather than your interpretation (But in hindsight I think you're right). Definitely agree it should be linked, and would also suggest that users are recommended to read this discussion page before voting.--78.146.25.153 11:38, 20 August 2011 (UTC)
Done - Hoo man (talk) 16:10, 21 August 2011 (UTC)

Eligibility and emails

Voting eligibility

I have to admit that I simply can't understand the rules for eligibility to vote for this election. My questions include:

  • Why can't Wikipedia readers vote? (should this really be restricted to the editing community?) - this is my key point in leaving this message. I know that this presents technological difficulties, but these should be the same technological difficulties that are present in implementing the image filter, so they should inherently be solvable (otherwise this exercise is somewhat moot)...
  • Why should mediawiki developers, WMF staff and contractors, and board members be eligible to vote if they don't meet the 'editor' criteria? Either they should have sufficient experience on-wiki to meet these criteria, or they will most likely be working on topics unrelated to this and hence their vote shouldn't necessarily count
  • If WMF staff, contractors and board members are eligible to vote, then shouldn't the same apply to other Wikimedia staff, contractors and board members - i.e. including those of the Wikimedia chapters?

P.S. my input here is meant to be constructive, and I hope it comes across as being in that spirit. Apologies if it comes across otherwise... Mike Peel 21:27, 23 July 2011 (UTC)

We did try to come up with a way to include Wikipedia readers; however, it is nearly impossible to ensure a representative result. For example, certain countries have minuscule IP ranges, and it would be impossible to tell if votes received were from hundreds of different readers, or a handful of readers voting multiple times; the same could be true for readers from large institutions who operate through a single or very small IP range. As to the remainder of the criteria, I believe this is intended to be the standard group of criteria, with the only variation between votes/referendums being the minimum number of edits. The intention is to be as inclusive as possible while still doing our best to ensure each person casts only one ballot. Risker 23:23, 24 July 2011 (UTC)
How about using cookies to identify individual computers? There is still the potential for people to abuse that, by deleting the cookie / resetting the browser, but it largely avoids the IP address issue. You could also (temporarily) record the additional information of IP address, internet browser, operating system version, etc. in combination with looking for identical votes (e.g. same rating for all questions each time) which would help identify multiple votes from the same location. Since there should be a very large number of people voting, a small number of people voting 2-3 times won't matter that much (it'll be well within the noise / uncertainty) - so you'd only need to pick out cases where someone might have voted over 10 times or so.
Re "standard" - I don't believe that there is a standard yet, since this is only the second referendum that I'm aware of (the first being the license migration), which means this is part of setting the standard. Either way, the voting criteria should be thought through each time to make sure they're appropriate. I'm obviously all for being as inclusive as possible, but I'm not sure that the additional criteria are inclusive in a balanced way. Mike Peel 08:42, 25 July 2011 (UTC)
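The duplicate-detection heuristic suggested above - group ballots by a coarse fingerprint of IP address, browser, and identical answers, and flag only fingerprints past some threshold so that 2-3 stray duplicates stay within the noise - could be sketched roughly as follows (field names are invented for illustration, not from any actual SecurePoll schema):

```python
from collections import Counter

def flag_suspect_fingerprints(votes, threshold=10):
    """Group ballots by a coarse (IP, user-agent, answers) fingerprint and
    flag only fingerprints casting more than `threshold` identical ballots,
    as suggested above; small-scale duplication stays within the noise."""
    counts = Counter(
        (v["ip"], v["user_agent"], tuple(v["answers"])) for v in votes
    )
    return {fp for fp, n in counts.items() if n > threshold}
```

As Wnt notes below, all three signals are trivially spoofable, so this can only catch careless repeat voters, not a determined one.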
I don't think this sounds like a good idea at all. Cookies can be deleted very quickly, a new IP address can easily be requested, and there is no law that says that a browser has to send the same identifying information all the time. While each of these may seem like a layer of extra protection, the problem is that the one hacker defeating the vote will know how to get around them all. Of course, it is possible that an editor could start - or hack into - lots of extra accounts to do the same thing, but at least that actually sounds like work.
Also, I really think that someone who only reads, never edits, wouldn't appreciate the issues involved. If all you do is read articles you have no idea of the kinds of rancorous debates that can get started over one word in an article. This "filter" idea might sound perfectly practicable to someone in that position, who doesn't realize how much drama would erupt over any borderline categorization. Wnt 19:56, 30 July 2011 (UTC)
I'd like to know the opinions of the logged-out readers, but it sounds technologically difficult. Because of these difficulties, I probably wouldn't treat the two kinds of votes as being equivalent. That is, it might be nice to know that about ___ percent of unregistered users thought this or that, but you'd want to take that with a huge grain of salt, whereas the same statement about registered users could, I think, be relied on as fairly precise. WhatamIdoing 19:27, 4 August 2011 (UTC)

I don't see how it would be a standard that "Developers", "staff, contractors" of the Wikimedia Foundation, and "Board members and advisory board members" can decide about the content of Wikipedia. How many contractors does the Wikimedia Foundation have, and who are they - is there a list? If I donate money to the Wikimedia Foundation, do I become a contractor then? --Rosenkohl 13:06, 17 August 2011 (UTC)

Being a contractor means that they pay you to do some job. If you pay them, you become a charitable donor, not an en:independent contractor. WhatamIdoing 18:52, 23 August 2011 (UTC)

I have made more than 10 edits before August 1, but I always edit from an IP address. (And go back the next day to find out who has done a knee-jerk IP-address-edit revert. And then I talk them into unbiting me.) I understand the logistical difficulties in verifying a single vote. If there's any way of getting around that, here's how I would have voted:

On a scale of 0 to 10, if 0 is strongly opposed, 5 is neutral and 10 is strongly in favor:

  • for the Wikimedia projects to offer this feature to readers. 3 (How much slower will this make it?)
  • that the feature be usable by both logged-in and logged-out readers. 10 (There's already too much discrimination against logged out users.)
  • that hiding be reversible: readers should be supported if they decide to change their minds. 10 (Can you imagine if this isn't reversible... but is available against IPs? My ISP dynamically allocates IPs....)
  • that individuals be able to report or flag images that they see as controversial, that have not yet been categorized as such. 0, unless there is a board of administrators who can reverse these markings, which would be just as susceptible to vandalism.
  • that the feature allow readers to quickly and easily choose which types of images they want to hide (e.g., 5–10 categories), so that people could choose for example to hide sexual imagery but not violent imagery. No opinion
  • that the feature be culturally neutral (as much as possible, it should aim to reflect a global or multi-cultural view of what imagery is potentially controversial). Not enough info.

173.206.132.10 23:02, 17 August 2011 (UTC)


IMHO, the 10 edit count eligibility section of this article should mention whether discussion edits count, or just production edits. 173.206.132.10 23:02, 17 August 2011 (UTC)

The software doesn't distinguish between edits based on where they're made. The eligibility requirements include all edits made. Cbrown1023 talk 02:53, 18 August 2011 (UTC)

I received an e-mail stating that I am eligible to vote but the given link states "Sorry, you are not in the predetermined list of users authorised to vote in this election." Is anyone else having this issue? What gives? Joshuaism 21:08, 21 August 2011 (UTC)

(Note: Replied at user's talk page. Mdennis (WMF))


I'm qualified, but was told I can't vote

I got email to come and vote on this.

I took the time to read the proposal, carefully.

I fulfill the eligibility requirements. (I mostly edit on http://en.wikipedia.org/wiki/Main_Page.)

I click on http://meta.wikimedia.org/wiki/Special:SecurePoll/vote/230.

I am told:

Sorry, you are not in the predetermined list of users authorised to vote in this election.

So why was I EVEN sent the email?

My time is valuable, as is that of all editors, and of all people. Please respect all of us.

Please fix this.

Please also fix the spelling of authorised. It has a 'z', not an 's': authorized.

I wonder if WP is really ready to implement this kind of feature.

If it can't get the voting software right.

If its programmers don't use spell-checking software properly.

Thanks. Lentower 14:01, 20 August 2011 (UTC)

Please, I would appreciate a note, when this is fixed and I can vote, at http://en.wikipedia.org/wiki/User_talk:Lentower. Lentower 14:06, 20 August 2011 (UTC)
Hi. :) This is confusing, I know, but you actually need to access the voting page from a Wiki where you qualify. As it says, "Go to one wiki you qualify to vote from. In the search bar, type in Special:SecurePoll/vote/230." If you're logged in at en Wikipedia, you can just click this link: [1]. In terms of the z and s, in "authorised/authorized", this isn't actually a misspelling, but a difference between English and American variations. Custom is not to prefer one over another.
I'll duplicate this note at your local talk page, but I wanted to answer here as well in case others run into the same issue. --Mdennis (WMF) 14:56, 20 August 2011 (UTC)
Thanks. But PLEASE make the instructions MUCH clearer. Lentower 01:58, 21 August 2011 (UTC)

What's a Referendum and what's this

understanding the referendum; WMF Board != Mubarak

Since i (and maybe many other people) didn't read this carefully enough the first time, here's some analysis that may help. i'll sign separately so that people can comment separately. Boud 01:36, 17 August 2011 (UTC)

will or would be implemented?

Let's read the description of the "referendum" carefully - my emphasis throughout:

  • ...feature, which will allow readers...
  • ... Such a feature was requested by the Board of Trustees in June 2011. ...
    • requested = wmf:Resolution:Controversial_content: We ask the Executive Director, in consultation with the community, to develop and implement a personal image hiding feature that will enable
  • The feature will be developed for, and implemented on, all projects.
  • What will the image hider look like?
  • this talk page, above: Hi Danaman5, yes, the Board passed a resolution asking for this kind of feature to be developed; so that's a given. The questions are asking for community input in how it is developed and implemented. -- phoebe | talk 05:00, 15 August 2011 (UTC)
  • this talk page, above: Or can some project choose, not to use this system at all. --Eingangskontrolle 17:37, 16 August 2011 (UTC) / No, the Board of Trustees has directed the Foundation to proceed with this. It will be integrated into the software, as I understand it. There will not be an opt-out. Philippe (WMF) 18:17, 16 August 2011 (UTC)

Will is clearly what the Board has decided and requested of the Executive Director. Boud 01:36, 17 August 2011 (UTC)

why/who?

So this is mostly based on a report by two people, neither of whom is WP-notable, only one of whom has Wikimedia project experience, and that experience seems to be mostly only about the controversial content study on meta, and no practical experience in editing en.Wikipedia articles or Commons content. Maybe Robertmharris edited earlier under another username, but he doesn't choose to tell us that. There's no indication of the degree of Robertmharris' multilingual fluency and worldwide experience of knowledge and censorship questions e.g. in en:Belarus, en:People's Republic of China, en:Saudi Arabia, en:Libya, or en:Egypt.

Shouldn't a minimum criterion for a study that is used as the basis for an irreversible decision (in the sense of "will" above) by the Board be that the study is either by someone well recognised (en:WP:RS) as having credentials in the free software/free knowledge/right-to-knowledge community and/or as an experienced Wikipedian? Boud 01:36, 17 August 2011 (UTC)

I have asked user Robert about his qualifications, as his user page does not show anything pointing in that direction. --91.62.240.106 05:48, 18 August 2011 (UTC)


Hi Boud -- yes, the suggestion came from the report by Robert Harris, who was contracted -- specifically as an outsider -- to write the report, which was then heavily discussed on meta and on list. But the report was not wholesale accepted by the Board; we considered everything seriously, in the light of community discussion, and picked out a few things (and added a few things of our own, or deriving from discussion, as well), and did modify the original recommendation -- stating the principles under which we would like to see such a thing implemented, and so on. In other words, while of course you are more than free to disagree with the decision, I do think you're underselling the board's work a bit, and underestimating the degree to which experienced Wikimedians did look at this. See also my comment below (under "discussion") for a bit more background. -- phoebe | talk 13:40, 18 August 2011 (UTC)

Hmm, it seems to me as-read that the board position does not require a categorization system designed for self-censorship per-se. It seems that a simple button to hide an image would suffice in many cases.

"Eeeuw! that squicks me out!" , click button (typically a cross), image hidden until you re-show it. Easy enough!

If a preference is asked, it could be "hide all images".

If no categorization system is used, we don't provide the weak point for 3rd parties to exploit.

--Kim Bruning 16:07, 19 August 2011 (UTC)

Robert M. Harris is a broadcaster and journalist who has worked for the Canadian Broadcasting Corporation for the past thirty years. His daughter Dory Carr-Harris is just finishing her MA in Cultural Studies at Goldsmiths, University of London 2010 Wikimedia Study of Controversial Content/Archive --Bahnmoeller 18:37, 21 August 2011 (UTC) And surprise: Sue Gardner began her career at the Canadian Broadcasting Corporation (CBC) [3]. It smells bad. --Bahnmoeller 19:13, 21 August 2011 (UTC)

hypothesis

en:Sunk_costs#Loss_aversion_and_the_sunk_cost_fallacy: Maybe the Board paid Robertmharris for his "consultancy" and so felt that the result had to be followed without giving unpaid Wikipedians the chance to say no. Is this really what the Board is supposed to be about?

en:Sunk_costs#Loss_aversion_and_the_sunk_cost_fallacy argues that this behaviour is irrational. The consultancy was done and a report written. But a decision on implementation should not be hardwired by the previous decision to pay (?) for the consultancy. Whether or not money was paid for the report is only of minor relevance here. The report was written and nothing will change that. Whether or not recommendations 7 and 9 will be implemented could, in principle, still be prevented or, at least, reverted into a decision-not-yet-made. Boud 01:36, 17 August 2011 (UTC)

what to do

It seems clear to me that if the community wants to say "no" to this feature, then the community has to somehow convince the Board so that the Board reverts its request to the Executive Director. The referendum is not going to do that (as some people have already said on this talk page).

Maybe one way to say "no", or at least to give the community the possibility to say no, is to write a formal proposal that is likely to get overwhelming consensus from the community and says that the community

  • asks for the so-called referendum to be temporarily withdrawn and replaced by a true referendum which:
    • includes a question like "Do you support the implementation of an opt-in personal image-hiding feature in the WMF projects? scale: 0 = strongly opposed, 10 = strongly in favour".
    • is written using the conditional "would" instead of the definite "will"
  • asks for the Board to withdraw its request to the Executive Director pending community consensus.

The primary decision would then go to the community, which is not presently the case.

Maybe an RFC? i'm not much involved in internal WMF project community organising - other people here almost certainly have more experience in that than me....

The Board is not Ben Ali or Mubarak (and let's hope it's not Gaddafi)! We can, in principle, still say "No" or "We will decide" to the Board. Boud 01:36, 17 August 2011 (UTC)

I would hope, and expect (and it seems the case), that at least some of the board is following this discussion. The basic premise that "a method for not seeing stuff that is really bad" is a good thing, all else being equal is sound. Unfortunately careful thought and discussion have demonstrated that all else is not equal, and the game is not worth the candle without a fundamental rethink. I hope, and expect, that the board will share that opinion, having reflected on the difficulties the proposed system would bring. The matter needs to be taken back to the drawing board to find if there is a viable alternative. Rich Farmbrough 02:40 17 August 2011 (GMT).

It's enough! We, the ordinary Wikipedians, have to recognize that the Foundation no longer stands on our side (at least it looks that way). They have started to drift away and have lost contact with the base. They decide on their own and don't bother to ask us. The so-called "referendum" is a farce. We can't change anything. The introduction of the censorship filter has already been decided. Alea iacta est. That's not a bit democratic. Such hubris destroys the keystones of this project. But the Foundation must never forget that WE, the Wikipedians, created the content of Wikipedia. WE made Wikipedia what it is now. I have to warn you: it was never a very splendid idea to ignore the populace... (and you wonder why we lose contributors) *facepalm* Chaddy 05:37, 17 August 2011 (UTC)

i have posted a page move proposal. See Image filter referendum and go to Meta:Proposed_page_moves#Image_filter_referendum for support or opposition to the move proposal. This alone will not override the Board's decision, but hopefully we should be able to clarify the situation. Boud 10:26, 17 August 2011 (UTC)
Thank you for the information. Chaddy 15:30, 17 August 2011 (UTC)

Referendum?

I hardly think it is. The questions are exceedingly odd for a referendum - more like a discussion of certain narrow points. The debate above has raised several objections to the proposal on the lines that this would make it technically easier for a third party to censor Wikipedia. Then there were the objections that it will waste editors' time and ultimately please no one.

I read most of the above, but did not find reasoned technical responses to the reasoned technical objections. Does that mean that none exist? At first glance it seems a nice idea to give people more choice about images they might personally find objectionable. However, if doing this makes it easier for them to be fed distorted information, then I oppose the scheme. I could not support it unless a convincing case is made why the weaknesses introduced by the scheme are outweighed by a widening of the wiki's reach. I don't believe in censorship, and I think compromising over censorship is still just censorship, but with an official label of acceptability. Sandpiper 21:14, 20 August 2011 (UTC)

I agree it's not a 'true' referendum in the usual sense of the word. It's more of a values survey. To be a referendum, we'd need to know in advance how votes will affect policy. But that's okay-- referendum or survey, it's still good to have one, and discussion is still the most informative way to share your view. :) --AlecMeta 21:39, 20 August 2011 (UTC)
The fact that this is not a referendum has been pointed out at least ten times on this page. Is no one reading this who can actually change the wording? 86.179.1.163 23:40, 20 August 2011 (UTC)
No. No one is paying attention. The whole wording of the first page of the referendum is written in a way which puts the filter in a positive light (not very neutral for those who are conducting the referendum). The FAQ is completely PRO filter and this discussion page is hard to find. No wonder the whole discussion page is filled with criticisms from exasperated users. No one is listening. --94.225.163.214 01:28, 21 August 2011 (UTC)

Design ideas and issues

suggestion from Dcoetzee

I'm pasting this from a mail I sent to Philippe the other day on this topic.

I can't claim to represent general community consensus, although I have some idea what that consensus is like. In particular, in discussions of image filtering the community has made very clear that they don't want WMF projects to have any direct involvement in the construction or maintenance of blacklists, which are lists of articles/files or categories that are blocked for a particular user.

I have a vision for what I think a personal image filtering solution should look like, but not the time and resources to implement it. Basic outline follows.


Customer

Persons interested in self-censorship, and parents or guardians censoring content for small children. In particular, censorship of teenagers or technically adept persons is not a goal for me. This is helpful because it means strong technical measures against circumvention are unnecessary.

If it's really intended to shield casual readers from sexual and/or violent content, the official blacklist for users without an account would have to cover every file dealing with such matters.

Requirements

  1. If an article/page/entry is included in the user's blacklist, and is not in their whitelist, attempting to view it will produce an error message and show no content. If the user is using the software in self-censorship mode, they may proceed to view it regardless. Otherwise, a password is required to proceed to view it.
  2. If a file is included in the user's blacklist, and is not in their whitelist, it will be hidden, showing in its place an icon indicating that it was blocked. If the user is using the software in self-censorship mode, they may click it to reveal it. Otherwise, a password is required to view it.
  3. A user may at any time add a page or file they're looking at to their blacklist, hiding it and preventing it from appearing in the future.
  4. When a password is entered to view a page/file, either a time limit is given, or the page/file is permanently added to the user's whitelist.
  5. Both blacklists and whitelists may be specified using a combination of individual pages/files as well as categories of pages/files, which will include subcategories. Whitelists take priority over blacklists.
  6. A public distribution center is available (ideally on a third party/unaffiliated site) where users can share their blacklists/whitelists, and import those of others. This allows communities to specialize the software to meet their community standards, while still being able to individually adapt to their personal standards and the needs of the specific child. Several blacklists can be imported and merged into one.
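As a rough illustration only, the resolution logic in requirements 1-5 might look like the sketch below (all names here are hypothetical and not part of the proposal; category expansion per point 5, including subcategories, is assumed to have already been applied so that both lists are flat sets of titles and category names):

```python
# Sketch of the blacklist/whitelist resolution described above.
# Both lists are assumed to be flat sets of page/file titles and
# category names, with subcategory expansion already done.

def effective_labels(title, categories):
    """A page or file is matched by a list if the title itself or
    any of its (ancestor) categories appears in that list."""
    return {title} | set(categories)

def is_hidden(title, categories, blacklist, whitelist):
    """Per requirement 5, whitelist entries take priority over
    blacklist entries."""
    labels = effective_labels(title, categories)
    if labels & whitelist:        # explicit whitelist wins
        return False
    return bool(labels & blacklist)
```

In self-censorship mode a `True` result would render the click-to-reveal icon from requirement 2; otherwise it would trigger the password prompt.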

Architecture

  • Filtering can be done either on the server side (with the cooperation of the MediaWiki software, by associating blacklists with user accounts) or on the client side, using web browser extensions. I favor the latter approach because it avoids MediaWiki ever having to deal with blacklists, which is a point of contention in the community. This also avoids trivial circumvention methods like logging out or clearing cookies.
  • Blacklists/whitelists can be stored in local storage. Initial blacklists/whitelists are not shipped with the web browser extension, but are instead selected from the above-mentioned public distribution center.
  • Categories can be extracted using the MediaWiki API.
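For the last point, a hypothetical helper could build the request against the real MediaWiki API module for listing a page's categories (action=query with prop=categories). Only the URL construction is sketched here; actually issuing the request, error handling, and API continuation are omitted:

```python
# Build the MediaWiki API URL that lists a page's direct categories.
# COMMONS_API is Commons' real API entry point; the helper name is
# hypothetical.
import urllib.parse

COMMONS_API = "https://commons.wikimedia.org/w/api.php"

def category_query_url(title, api=COMMONS_API):
    """URL for action=query&prop=categories on a single title."""
    params = urllib.parse.urlencode({
        "action": "query",
        "prop": "categories",
        "titles": title,
        "cllimit": "max",
        "format": "json",
    })
    return f"{api}?{params}"
```

A browser extension would fetch this URL (subject to the site's CORS rules, e.g. via the API's `origin` parameter) and read the `categories` list out of the JSON response.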

Design notes

  • To avoid circumventing the filter by editing to remove categories, recent removal of categories can be ignored. If the edit stands and is not reverted as vandalism it will eventually be accepted.
  • In addition to retrieving the categories of each article/file used in the article, it's necessary to walk up the category tree from each category to collect its ancestors, in case those are listed in a blacklist/whitelist; this can lead to a lot of MediaWiki API calls per page load. It can be mitigated by caching categories.
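The ancestor walk with caching described in the second note could look roughly like this (a sketch only: `get_parents` stands in for a MediaWiki API call returning a category's direct parent categories, and the cycle guard is needed because wiki category graphs are not strict trees):

```python
# Walk up the category graph collecting all ancestors, caching the
# direct-parent lookups so categories shared between files on the
# same page are fetched only once.

def ancestors(category, get_parents, cache=None, seen=None):
    """Return the set of all ancestor categories of `category`."""
    if cache is None:
        cache = {}
    if seen is None:
        seen = set()
    if category in seen:          # guard against category cycles
        return set()
    seen.add(category)
    if category not in cache:     # one API call per category, cached
        cache[category] = set(get_parents(category))
    result = set(cache[category])
    for parent in cache[category]:
        result |= ancestors(parent, get_parents, cache, seen)
    return result
```

Passing one shared `cache` across all images on a page (and persisting it between page loads) is what keeps the API call count manageable.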

I pointedly avoid the idea of "labelling" specific articles or files with a "type" of content, although a blacklist may be intended for a specific type of content. I also strongly oppose the idea of having a single one-size-fits-all blacklist, or even just a small number of them to choose from - I think it's essential to let the user community build lists to their tastes, and to force users to choose among a wide, fine-grained collection to discourage overly broad filtering.

Any feedback is welcome. Dcoetzee 07:48, 2 July 2011 (UTC)

I wholeheartedly agree with and support Dcoetzee's proposal above. It will be much less contentious and more productive if the whole debate about what is censored and what is not stays out of Wikimedia, as well as the way of implementing it.--- Darwin Ahoy! 02:00, 3 July 2011 (UTC)
I second your heartfelt agreement. :D This seems like exactly what I wanted when I heard of this idea earlier. Whitelist is a nice addition, blacklist with individual and broader options is great.
I don't like the idea of the filter being enabled "by parents or guardians". If a small child is considered grown up enough to surf the web, he's certainly smart enough to control the filter as requested by his parents or guardians. It seems an unnecessary complication (management of passwords?!) with great danger of abuse, e.g. by network managers. Nemo 07:37, 3 July 2011 (UTC)
A password seems too much. That would conflict with the idea that no readers are prohibited from seeing any image. Better for such a system to simply let readers change the default display setting (hidden or not) for categories of images. SJ talk | translate   11:42, 3 July 2011 (UTC)
I see nothing wrong if this is based on existing content-labelling standards. This would allow existing parental-control software to work immediately with such labels. It's not our job to determine whether these labels work, or whether they can be easily avoided by people wanting to get past them. But at least those who want those filters for themselves will be satisfied. That's enough, and if it allows more people to visit or contribute without fearing to see such images, we will win something. (Note that, in any event, content filters based on the textual content of articles are already working, and we did absolutely nothing to prevent this; but if the existing software can't easily parse images, all it will finally do is block images from Commons completely, whatever they show in any article: this is already occurring on third-party sites that are reindexing and selectively republishing the content of Wikimedia sites.)
We are NOT proposing any kind of censorship, only content labelling based on existing standards. Censorship only occurs elsewhere, and we have absolutely no control over those third-party sites or software. Users will set up their own filters themselves, or will have to live with those third-party sites and software anyway; or with their national legislation, if it applies to them, though we may (and probably should) still help them comply with their law. verdy_p 16:56, 3 July 2011 (UTC)
Wrong. Categories should be broad and small in number. One either minds seeing photos of sexual activity or chopped-up body parts, or one doesn't. Broad banding is preferred, unless one really wants to give users the choice of saying "don't mind porn, but none of that gay stuff". Do you really want them to be able to say no to Piss Christ but yes to piss Mohammed, or no to piccies of dead Westerners but yes to piccies of dead Africans? John lilburne 17:11, 3 July 2011 (UTC)
John - Can you articulate why it would bother you if someone else chose to see one of those complicated sets of categories? SJ talk | translate   18:59, 4 July 2011 (UTC)
Principle of least astonishment. Imagine someone who is taken aback upon seeing a sexually explicit image, decides to use this feature, and finds themselves looking at a menu that catalogues every conceivable sex act or fetish. It could be a double-take aback. ~ Ningauble 19:39, 4 July 2011 (UTC)
That's a good principle for interface design - as is broad banding. But that doesn't necessarily preclude having an option to create a custom set of categories (or choose from a longer user-defined list) for one's own use -- which is how I read John's comment.

Let's ban all works of art with nudity. Just to be on the safe side. --Eingangskontrolle 15:32, 16 August 2011 (UTC)

"We are NOT proposing any kind of censorship, only content labelling based on existing standards." So does that mean you are proposing a special "content labelling" for articles like en:Tiananmen Square protests of 1989, en:Falun Gong, en:Tibetan independence, en:Taiwan independence, en:corruption, en:police brutality, and en:anarchism? These are considered to be unacceptable articles according to a well-established and reliably-sourced (RS'd) existing standard. The NPOV name for that existing standard includes the word "censorship" (whether i agree or not is irrelevant). Boud 22:10, 16 August 2011 (UTC)


Implementation: overloading not advisable

I don't actually think overloading the purpose of categories in the way that some people here suggest is a good idea. We've overloaded other features before (e.g. the revert button at one time got reverted contributors shortlisted as vandal candidates, with further automated reversions more likely in future). And to the best of my knowledge, no WMF site can give you any information about a main page due to a namespace constraint (the main page lives in article space rather than project space; and yes, it's configurable). The volume of advisories about what we should and shouldn't do on-wiki is already unbearable - SOFIXIT culture has been eradicated, apparently. So having more advisories about applying special tags and categories to express levels of violence, sexualisation or religious/other political correctness is just going to suck our life out imo. Samsara 22:02, 19 August 2011 (UTC)

Whichever categories are chosen to be part of the filter, the filter feature will become the category's whole purpose. I assume we'll figure that out and just make special categories like Filter Requested due to....
Until the system is infinitely customizable, there will be unending wars over what images go in which categories. But is that really a bad thing? I don't see how censors edit-warring over their preferred forms of censorship is inherently a bad thing. Maybe it will stop them asking for deletions or edit-warring over article-space, for example. --AlecMeta 22:19, 19 August 2011 (UTC)
It would leave people with a normal/average sense of what is controversial no option. Either they activate the filter, in which case more content is filtered than necessary (thanks to a very conservative but loudly screaming minority), or they don't use the filter and see what we already have. But of course more time will be spent on a new issue that wasn't present before. --Niabot 22:35, 19 August 2011 (UTC)
There is no "normal/average sense for controversial content" I'm afraid. I don't mean that even in a moral or metaphorical sense, I mean it in a mathematical one. There is no "average" to be found, there will be no clear lines.
The slippery slope will inexorably slide towards overfiltration-- but that's okay. It doesn't have to be perfect, it just has to be "a new feature that is not inconsistent with our values". There it succeeds. --AlecMeta 00:04, 20 August 2011 (UTC)
I strongly disagree with that point. We are just opening a new battlefield which has nothing to do with the creation of an encyclopedia. It doesn't increase the quality of our content. Instead it ties up time, increases the effort, and costs money that could be invested in projects that would truly advance the goal of the project. --Niabot 08:04, 20 August 2011 (UTC)


Feature suggestion

Having predesigned categories of what one wants to filter is limiting to the individual. For example, I noticed there is no option to block out religious symbols/images, or images that a reader might consider heathenous if they are religious. Maybe there could be a setting where the reader can add or create his own categories; images would be tagged by editors with keywords, and the reader would choose which keywords they want to filter. 129.93.232.201 11:58, 20 August 2011 (UTC)

This is what I call the 'perfect proposal'-- infinitely customizable. I think the only issue with it might be resource cost. Making a tiny few categories will be easier on the developers and the servers, but an infinitely customizable filter will be harder to build. I haven't seen any good estimates on the resource difference to allow me to intelligently choose between the two, but my heart is with the 'perfect', '100% culturally neutral' filter. But, not if we have to sell the farm for it. --AlecMeta 18:41, 20 August 2011 (UTC)

Usage of the filter

Combination with content filters

When designing this feature, in case it gets accepted, perhaps you could also find a way to allow content-filtering software to integrate it into their products? Or perhaps a little program that automatically enforces certain filters, protected with a password? This would mean that parents who don't know how to set up the filtering themselves, or who want to prevent their children from disabling it, can also benefit from it. Adelbrecht 19:44, 5 August 2011 (UTC)

This is a thorny but important issue. Currently we don't have supervised accounts where parents can set preferences for their children, hence this proposal where an 8 year old would be only a click away from overriding a filter. I would anticipate that any content filtering software should be programmable to use these filters - the bigger question is which content filters filter out the whole of Wikimedia and which only filter out certain images or categories. WereSpielChequers 04:49, 9 August 2011 (UTC)
Such an application would fail the precept that Wikipedia is not censored. The feature must make it provably impossible or at least provably extremely impractical to set up such a filter. If this is not possible, this proposal cannot be carried out. --Kim Bruning 10:51, 21 August 2011 (UTC)


How will "the community will decide the category structure and determine [image categorization policy]"?

What images will be hideable?
Hideable images will be grouped into categories. The community will decide the category structure and determine which images should be placed within those categories.

As others have said above, (I redundantly reiterate their sentiments merely to bump the issue), this will just lead to categorization edit-wars and contentious, unproductive debates over which categories are worthy of being censorable (or possibly significant server strain to accommodate ridiculously flexible censorship options). I could perhaps envision semi-private categories created by subcommunities, but this would have obvious ownership+neutrality issues. On the whole, I think the entire thing is much better left up to third-party content filter providers; there is no need for Wikimedia to so entangle itself. I, for one, shall therefore vote this down. There are the best of intentions at work here and some good ideas, but I fear the actual plan of execution is fatally marred and frustratingly vague. Good day, --Cybercobra (talk) 07:27, 9 August 2011 (UTC)

I tend to agree. However worthy the idea, I don't think our category systems are robust enough to make filtering reliably workable. Some of the ideas about the category system (which I take to really mean the Commons category system) assume that mis-categorization, or malicious (re)categorization, would be noticed and fixed in short order - I don't think this can be expected to happen reliably; there aren't enough eyes at work here, and as far as I know you can't "watchlist" the contents of a category. I think any filtering or censorship needs to be done outside the current projects. If there aren't already such projects, I think a child-safe Wikipedia project would be one approach. --Tony Wills 09:39, 9 August 2011 (UTC)
I see from the Personal image filter page that the idea is to have a separate flat category branch for filtering, that would allay my fears (it might cause other problems, but wouldn't impinge on the current categorization of files). --Tony Wills 10:59, 9 August 2011 (UTC)
Even if the categorisation is currently incomplete that isn't an argument not to do this. If someone wants to filter out penises and the filter only filters out 95% of them that is a job more than 95% done, not least because a complaint about one of the 5% is easily resolved with hotcat rather than as at present an argument as to whether something is encyclopaedic or educational. WereSpielChequers 18:05, 9 August 2011 (UTC)
An interesting side effect of this sort of filtering is that editors may be less inclined to self-censor, and will add more provocative images to articles on the basis that those who do not like them can filter them out. --Tony Wills 22:55, 9 August 2011 (UTC)
Or the opposite could happen with people choosing to illustrate articles with images that people are willing to see. But my suspicion is that the vast majority of current editors will censor few if any images that are actually used in articles. This is something that a small minority of editors are very keen on and which will have little or no effect on editors who choose not to use it. WereSpielChequers 02:18, 10 August 2011 (UTC)
Images are not meant to be decorative; they must be instrumental. And the NOTCENSORED policy is not an escape route for needlessly adding provocative images to an article. -- Sameboat (talk) 15:47, 19 August 2011 (UTC)

The separate filter categorization system is very obviously exploitable, in at least 2 different ways that I can think of off the top of my head. There exist parties which are willing, able, and capable of executing these exploits on a routine basis. What soft and/or hard security measures are being taken to prevent exploits? What are the costs of these measures as expressed in community time, CPU time, or currency ? --Kim Bruning 15:38, 19 August 2011 (UTC)

Firstly, let me state that my making this comment by no means implies that I support the creation of this system. But if it is to be created, then any person should be allowed to create a category and then place images into that category for the purposes of filtering. The only rule should be "is it true". For instance, the only thing that I would even consider using such a filtering system for is pro-capitalist propaganda. I could then create "Category:Pro-capitalist" and place in it any image or other media which is uncritical of capitalism; I could then use this category for my filter, and others could do so also. Beta M 18:25, 21 August 2011 (UTC)


Concerns

Complicated editing

If this feature is implemented, then when I'm editing an article, I will need to think: "hey, how will this article look if the user uses this filter? Or that filter? Or that combination of a dozen filters?" In many cases images are an essential part of the article, and making the article comprehensible without them (or worse, with an arbitrary combination of them) will be an editor's nightmare. And while I am not involved in editing sexual or religious articles, I am not sure that this tool won't evolve, for example, to allow filtering out "all images which are related to Microsoft Windows/Linux/..." - if that ever happens, then I will find my editing severely affected. Ipsign 07:23, 16 August 2011 (UTC)

As you can see from the mock-ups, it will be immediately clear that an image is hidden. The box shape and size shouldn't change at all, and the caption won't be hidden, so this shouldn't make images look different at all. Cbrown1023 talk 15:14, 16 August 2011 (UTC)
Because of the bug that is "being investigated" (see below), i cannot check the referendum questions yet, but the "What will be asked?" section on "Image filter referendum" does not seem to propose a question like "How prominent should the warning that an image is hidden be?" Are you making this statement as a promise by a member of the tech team working on this? Or as a promise by the WMF Board? A mockup is just a possible implementation, not a definite tech proposal. i agree that the mockup proposal seems sufficiently prominent that it's hard not to notice it. (In fact, it is probably so prominent that censoring authorities would not be satisfied, since it would encourage the curious to circumvent the filters.) But i don't see any promises that it would remain that prominent.
As for the main point being made here, i too would expect that in articles where a picture is important, there would be many editors who would feel that the article needs to be primarily readable by those with filters rather than those without. Given that "a picture is worth a thousand words", this effectively means several thousand extra words may often have to be added to articles, making them seem longer and more complicated to readers with filters turned off. Boud 21:35, 16 August 2011 (UTC)

Simple editing: all reported images have a white space with the option "some users found this image offensive or inappropriate, click to show". Hide the image by default or not, based on the user's set threshold - for example, 1% of visitors reported this image as offensive. --unsigned
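A minimal sketch of the threshold idea above (the 1% figure is the example from the comment; the function name is hypothetical, and nothing here defends against coordinated false reports):

```python
# Hide an image by default once the share of viewers reporting it
# reaches the user's chosen threshold (default 1%, per the example).

def hidden_by_default(reports, views, threshold=0.01):
    """True if reports/views meets or exceeds the threshold."""
    if views == 0:                # unseen image: nothing to act on
        return False
    return reports / views >= threshold
```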

The only problem is that not only is that game-able, not only do the tools exist to game that, but there are parties who use those tools and actually thrive on gaming such systems; and these parties are active on the internet today. I'd put the odds of an attack on a reader-provided-input system to be a near certainty in the first 12 months of operation. --Kim Bruning 13:08, 21 August 2011 (UTC)

Other

useful

in my view, this feature is very useful for slow speed internet users like me. -RAJASUBRAMANIAN from tamil wikipedia--தென்காசி சுப்பிரமணியன் 09:57, 20 August 2011 (UTC)

Dude, that means you only like looking at pages with pseudo-porn and violence :D This isn't meant to block all photos, though admittedly, I suppose that might be a nice feature for people with dial-up or old monster comps. Rodgerrodger415 16:25, 20 August 2011 (UTC)
If you want to block all photos because of slow internet, your web browser already provides that feature. Yes, all current browsers do. That said, we don't really know the implementation details of this feature, so it's possible that with the filter active you might be receiving MORE data from the site (i.e. you might get the original picture sent to your browser even though it doesn't display, but you will also get additional images for covering up the filtered ones, as well as more client-side JavaScript to slow down your already slow computer even more). It is actually more likely that this feature will increase page load times even if you are filtering every image. Once again, this type of filtering shouldn't be Wikipedia's concern; it should be the end user's. Wikipedia is in no way responsible (nor should it feel it is) for tagging or providing the means for someone to self-censor. This sort of feature is outside the scope of Wikipedia as a content provider. People like to mention things such as Google SafeSearch, but fail to make the distinction that Google isn't a community-policed content provider, as Wikipedia is. Google has SafeSearch because the way they do page rankings (programmatically) is susceptible to mis-tagging and other such things that enable people to get their X-rated pages to come up for normal search terms. This was never an issue on Wikipedia, and it never will be, so it's irrelevant to mention its existence in the context of this discussion. Pothed 17:06, 20 August 2011 (UTC)
To suppress all images, look at en:Wikipedia:Options to not see an image for instructions. Rich Farmbrough 18:50 21 August 2011 (GMT).