Talk:Image filter referendum/Next steps/en

From Meta, a Wikimedia project coordination wiki

Is it really time to move forward?

We have so far not agreed where we are or in which direction we are looking.

Do we want to continue with editors who carefully decide which image is appropriate for the text they have written and the language version they are writing in, and with discussions among other editors who hold different opinions and are willing to debate them on the talk pages?

Or do we want censors, somewhere in the background, to sort images into categories so that they will not be shown in articles? Do we want to mark images as "controversial in the US", "not to be used in DE", "not to be shown if the user indicates on his user page that he is a Muslim/Christian/Jew/atheist"?

Perhaps it's better to stop now, take a break and organize a better referendum with clear questions and real options. --Eingangskontrolle 09:18, 6 September 2011 (UTC)

Good points. -- Carbidfischer 10:29, 6 September 2011 (UTC)
I added your last line as a section on the page. SJ talk | translate   10:45, 6 September 2011 (UTC)
I find that "controversial in the US", "not to be used in DE" is a critical fallacy when discussing policy in Wikimedia projects. "Chinese Wikipedia" doesn't mean "PRC Wikipedia" or "ROC Wikipedia"; likewise, "German Wikipedia" isn't "Germany Wikipedia". I suppose you can see the difference now. If you want to state that German people have the absolute right (or privilege, whatever) to determine the policy for DE WP, this effectively excludes (or worse, discriminates against) other de.wp users who were not born in Germany or into a German family. -- Sameboat (talk) 08:59, 10 September 2011 (UTC)
We don't speak about the "German Wikipedia". We speak about the "German-speaking Wikipedia" (mostly contributors from Germany, Switzerland and Austria). Anyone who participated in the German-language Wikipedia can participate in the poll. It's a project decision, not a state decision. I can't see your problem. --Niabot 09:40, 10 September 2011 (UTC)
Then what's the point of stating "controversial in the US"? The English Wikipedia isn't the "US Wikipedia", but what I see in the de.wp poll looks more like it is based on German citizens' POV on treating controversial graphics, the customs/trends of a nation/country. -- Sameboat (talk) 11:07, 10 September 2011 (UTC)
It's a trend set by the German-speaking community. This includes the average position of users who are participating in the German-language project. Any problem with that? Did I ever speak about the US? I guess I was always talking about EN and not US. If I wrote US, replace it with EN. --Niabot 11:34, 10 September 2011 (UTC)
It was Eingangskontrolle who started this thread with some strange remarks about "the US". Adornix 13:04, 10 September 2011 (UTC)
But we can also face it: the US has much more trouble with such material because of its weird morals in some parts, and the US has a tradition of this kind of censorship. In Great Britain, France or Germany, for example, such things as virginity rings and strong religious communities are nowhere near as prominent. This filter suits US wishes far more and doesn't take care of other positions on this. Maybe the Board should have first looked around the world ... Julius1990 11:58, 10 September 2011 (UTC)
Goodness! I apologize for my fellow Germans. Prejudices, resentments and lack of knowledge...
The English WP is much more than "the US" or even the more conservative part of the US. As far as I can tell, the English Wikipedia is a project with contributors from all over the world with very mixed backgrounds. To say that this filter thing has something to do with an alleged "US tradition of censorship" is more than ridiculous. In fact "the US" is much more free-speech-oriented and anti-censorship than most Germans can imagine. The only example most Germans have when talking about US censorship is "Nipplegate" :-) They usually know exactly nothing about American media or television.
Most of the screeching knee-jerk protest against this filter referendum, even on the English-language Meta pages, comes from four or five German Wikipedians. Must be a German problem. Adornix 13:04, 10 September 2011 (UTC)
The Foundation and its organs are set up in the USA, and their decisions seem to be highly influenced by this fact. That has nothing to do with prejudice. And you also know that if the Foundation sat in London, Berlin or Paris, we wouldn't even be discussing this filter. Julius1990 13:29, 10 September 2011 (UTC)
No, I don't know that. And you don't know that either. It's just a prejudiced assumption. Adornix 15:35, 10 September 2011 (UTC)
Maybe! But the Board is heavily American-dominated, so it seems obvious to criticise that. But try to explain, Adornix, why this filter is so strongly rejected in the German WP. So Julius's argument is not wack. Of course, it's not a question of where the Board is located, but of which perceptions they follow. It seems that these perceptions are not German ones. Another question is what Jimbo's role is. He was the key player of the "anti-porno movement" in the past, and he is still a member of the Board. -- WSC ® 11:28, 11 September 2011 (UTC)

Marking pictures and Categories

In my view, pictures of partially undressed persons are offensive and should be shown neither to me nor to children. What this project needs now is input on categories that offend people. Those categories or genres can go right into the image filter. It would be efficient to bundle offensive categories into meta-categories like violence or nudity. My question is: when can we start to collect offensive categories and mark offensive images? I cannot wait to prevent myself and others from seeing evil things. A user who is concerned about the moral standards of the encyclopedia

What you can do right now is not to mark images as being "offensive", but to go to Commons:Category:Media needing categories and put images into suitable categories. The filter as proposed is not going to have a "Category:Images that are offensive because they show naked bodies"; it will shutter things in a list of community-identified existing categories, like Category:Photographs of sexual intercourse. It currently appears that completely uncategorized images will always be allowed through—even if the uncategorized image shows people having sexual intercourse. Commons really needs people to help with these thousands of images. 99% of them are inoffensive.
Additionally, we need people to look in categories (any subject you know something about) and make sure that the stuff in each cat really belongs there. We don't want pictures of kittens in Category:Photographs of sexual intercourse, and we don't want photographs of sexual intercourse in Category:Kittens.
Because the cat tree is enormous and complex, most folks at Commons seem to specialize in a certain subject area, like graphs or photographs of music groups or images showing words in a particular language. If we could get a couple more people looking at just ten uncategorized images a day for the rest of the year, it would make an enormous difference in the backlog and also an enormous difference in the everyday usefulness of these images. (I suggest logging in and turning on the "HotCat" gadget in your preferences.) WhatamIdoing 15:56, 6 September 2011 (UTC)

How is "partially undressed" offensive? Bare ankles? A beach picture in an article on some resort town, showing people sunbathing or swimming? An image showing a swimming or diving technique in swimming-related articles? A medical article showing an example of a limb injury?

As you can see, someone who writes that "pictures of partially undressed persons are offensive" is probably going to have serious trouble with a whole range of images far beyond those which most would consider "offensive". FT2 (Talk | email) 13:08, 7 September 2011 (UTC)

Me too for "Coren's four woes"

  • Endorse and considering reversing my view: I found Coren's points hard-hitting. In particular, the potential for man-in-the-middle forced filtering and social peer pressure to filter are issues, as is the filtering of public-access resources. (Libraries and schools are access points for those lacking their own computers, so any citizen without a computer who uses a library or school that forcibly filters will not have the freedom we intend: seeing all content by default, with the option of hiding it.)

    Most people with their own computers and no national/ISP censorship system will have the intended choice. Unfortunately a huge part of the world does not fit that model, and forcible or socially coerced censorship of such readers in some places (China? Middle East? Indonesia? Some African countries?) is a real concern.

    I would like to see a filter available, but the possibility of forcible application to deny knowledge is really problematic for me. Unless there's a solution to this dilemma, it's a case of the lesser of two evils. The world will adapt to non-censorship over time and that issue will pass; I'm not sure it will show the same restraint when it comes to using a free content-hiding tool. I'd especially like to hear the views of people in countries such as those named. FT2 (Talk | email) 09:12, 7 September 2011 (UTC)

    They are compelling points, and summarize many strong arguments made so far. FT2: I do wish we had better data on how people with currently restricted access to WP are allowed to use it. SJ talk | translate  
    We already know, especially about China and Iran, that they aren't very concerned about images alone; they are concerned about the content as a whole. Implementing an image filter would have nearly no effect on the decision of whether Wikipedia is blocked or not. This goes especially for countries already struck by censorship. The only groups that could possibly see some measurable advantage are those in developed countries. But the problem (for the censors) still remains as long as the filter is not mandatory, so they will still block Wikipedia. (Any sources on who does it, and for what reason?)
    On the contrary: some might now think about implementing such a mandatory filter if we provide the data. Blocking Wikipedia as a whole sometimes seems unbearable and would raise complaints from users. But blocking only a selected part of it might be much more interesting, since we would be providing the tool and the categories. In that way we would support the censors.
    This means: the implementation of any image filter that can be switched off by the user himself will not increase possible access to Wikipedia. Instead it can be used as an additional tool (data provider) for censoring. --Niabot 20:54, 9 September 2011 (UTC)
    I strongly disagree with your final suggestion. There's no reason to think that the only people who would force the use of the tool are those who currently allow complete access, and you've presented zero evidence for the suggestion. Far more likely, some of those who would force the use of the tool are those for whom there is currently sufficient concern, including about images, that they simply block Wikipedia completely, but who, once the image filter is in place, would force its usage and allow Wikipedia. And just to be clear, I'm not denying there would almost certainly be those who currently have access to Wikipedia in its entirety but who would be forced to use the filter. Both of course would occur in developed countries and in some developing countries (I'm not talking about China or Iran or national governments, but cybercafes, schools, libraries and other such places). In other words, the more relevant question is how many people would be allowed to use Wikipedia who are currently denied it, thanks to the new ability to enforce the filter, versus how many would be forced to use the filter who currently have complete access. Unfortunately this is a question almost impossible to answer. Nil Einne 11:47, 14 October 2011 (UTC)
  • Coren: could you relate those four concerns to the simplest two options below it (a toggle for showing/hiding all images, and a personal blacklist)? SJ talk | translate   03:08, 9 September 2011 (UTC)
    • Interestingly(?) enough, I wouldn't even classify those two options as "image filters" for most reasonable definitions of the term. A global toggle is necessarily neutral (because it makes no judgement about classes of images), and a personal "don't show this image again" setting is no easier to abuse than no filter at all; i.e.: a potential censor would not be helped by the presence of that feature, given that it makes no pre-emptive category judgement so that they are back to square one having to make the classification themselves.

      If the board and the vocal minority found one of those options sufficient, they would neatly sidestep (or at least mitigate) my concerns. Of course, that is a big "if": I would expect that almost everyone who feels a filter is needed does so under the presumption of a priori image classes to hide, without ever having had to see them in the first place. — Coren (talk) / (en-wiki) 11:08, 9 September 2011 (UTC)
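Coren's distinction above can be sketched in a few lines of code (purely illustrative; every function and variable name here is invented and corresponds to no actual MediaWiki feature): a global toggle makes no judgement about classes of images, a personal blacklist encodes only one reader's private choices, while a category-based filter depends on a shared classification that a third party could reuse.

```python
# Hypothetical sketch of the three mechanisms under discussion.

def global_toggle(images, show_images):
    """A neutral all-or-nothing switch: no judgement about image classes."""
    return list(images) if show_images else []

def personal_blacklist(images, my_hidden_images):
    """Hide only images this one user has individually flagged.
    A third party gains nothing: there is no shared classification."""
    return [img for img in images if img not in my_hidden_images]

def category_filter(images, image_categories, hidden_categories):
    """Hide images via a shared image-to-category mapping. That mapping
    (image_categories) is exactly the data an external censor could reuse."""
    return [img for img in images
            if not (image_categories.get(img, set()) & hidden_categories)]

images = ["kitten.jpg", "anatomy.png", "statue.jpg"]
cats = {"anatomy.png": {"nudity"}, "statue.jpg": {"nudity", "art"}}

print(global_toggle(images, show_images=False))    # hides everything
print(personal_blacklist(images, {"statue.jpg"}))  # hides only one choice
print(category_filter(images, cats, {"nudity"}))   # hides a whole class
```

The point of the comparison is the second argument of `category_filter`: `image_categories` is a pre-computed data set that exists independently of any one reader, which is what would make it attractive to an external censor, while the first two functions consume only per-user state.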

"Have to"

The text "Wikipedians will have to invest their working capacity into installing and maintaining filter" has been added to basically all of the items, including the non-filter option of "show no images". I think it's especially silly in that instance, and so have removed it, but I think it's generally wrong: Wikimedians are volunteers. They don't "have to invest their working capacity" in anything at all.

The "installing" claim is wrong, because the community is not capable of installing features: only the WMF can do that. The "maintaining" claim seems misleading, because if you personally don't want to work on maintaining the filter, then you personally do not "have to" do that.

I think this might be re-phrased to indicate the valid concern of en:opportunity cost, i.e., that if I freely choose to spend one hour "maintaining" the filter, then that hour cannot be used for some other WMF-related task (or for cooking dinner, or going for a walk, or whatever else I might have chosen to do with that hour). What do you think? WhatamIdoing 17:17, 8 September 2011 (UTC)

Part of the "have to" point, I expect, isn't just the opportunity cost but that the expense in effort is necessary for the feature to work at all. I.e.: any scheme that relies on volunteers doing tagging or classification will fail to function (properly) unless you do find enough volunteer time to perform that task. — Coren (talk) / (en-wiki) 19:06, 8 September 2011 (UTC)
The problem is that sects will want us to include things in the filter you would never have thought of. Think of people in bathing clothes. There are sects which believe that modern bathing clothes are forbidden and offensive. An image filter would be an invitation for such sects to add pictures they believe to be offensive. The "normal man" will not see a picture of people in bathing clothes as offensive, but some people will, and they will use the filter. It would give them unique opportunities to spread their moral beliefs. I have seen bad things happen in the German Wikipedia, like the Wiki-Watch case. We will have to spend resources (mainly time) on controlling the pictures in the filter; otherwise we would see the filter taken into absurdity by its supporters. --Liberaler Humanist 20:55, 8 September 2011 (UTC)
I have been a little reluctant to bring this up on account of WP:BEANS, but it is not just the supporters who could reduce the categories into absurdity. And even assuming perfect good faith on everybody's part, do we really want to debate, for thousands of pictures, the different views on whether or not they are sexually suggestive, or sufficiently exposed to count as nude? Or would we need to break with the principle that everyone can edit and all editors are equal, and have the categories assigned by an Authority who can disregard consensus? Censorship does imply authority, and the authority's POV. That makes 3 of the 5 pillars gone. DGG 04:03, 9 September 2011 (UTC)
I agree with the opportunity-cost phrasing. How about "Some unknown portion of the time editors spend maintaining the labels/categories for the filter would likely come at the expense of other tasks on the projects."--Trystan 13:55, 9 September 2011 (UTC)

Authority vs. consensus

I agree that identifying controversial content is - well, controversial, to the point where a consensus-based model would result in endless conflict. One way to mitigate this is to develop very specific, objective criteria for the labels (objective in their application to images, not in their existence). It is relatively easy to determine whether an image depicts genitalia, a nipple, a bleeding wound, etc. It's actually much easier than determining whether a picture is about those things, as one would when applying a category. That way, the disagreement takes place more at the label-development level than at the label-application level.
I think the proposed alternatives for filters address this in two ways. The closed-list options imply authority, as you suggest, decreeing what will be filtered and what will not. The open-list options use a sort of endless POV-forking approach: if two user groups can't agree on how a label should be applied, they can split it into two distinct labels that (hopefully) make the distinction clear. So you could theoretically have [[Filter label: Bare midriffs]], rather than just [[Filter label: Nudity]].--Trystan 14:22, 9 September 2011 (UTC)
The problem is that any label specific enough to be reasonably objective is too specific to be useful as a filtering criterion. Certainly, almost everyone will be able to agree on whether a picture does or does not contain "external sexual organs of a human" (though do you then make a different label for artistic representations thereof? How about simulacra?), but then you'd end up with many hundreds of such labels, and you'd just be moving the dispute to "which of these labels should be filtered if someone wants to hide 'nudity'?" — Coren (talk) / (en-wiki) 14:29, 9 September 2011 (UTC)
(Please remember when you estimate the probability of everyone agreeing on anything that you are talking about the website where people have edit warred for months over whether mayonnaise traditionally includes lemon juice or not). — Coren (talk) / (en-wiki) 14:38, 9 September 2011 (UTC)
For the closed-list proposals, labels with very clear scope could be set, such as arbitrarily defining nudity to mean any depiction, regardless of medium or perceived purpose, of human genitals or nipples. That certainly would not form an ideal nudity filter for many people, but it would likely form a useful one for some of them. It will be over- and underinclusive for many, but has the virtue of being quite clear, so the reader knows what to expect.
The open-list options might well result in hundreds of different labels, and one could easily conceive of a couple of dozen related to nudity alone. But the number would likely not be unmanageable with a well-designed drill-down interface.--Trystan 18:20, 9 September 2011 (UTC)
I have one simple guess: the people who are currently loudest in calling for this feature wouldn't be satisfied with your solution. They would no longer call for the implementation, but for an expansion of it. For example: softer criteria. You said "nipples" and "genitals" in your example; they would cry about "small/tight swimsuits" as well. On the other hand, I doubt that people who can look at swimwear like this [1] [2] would have a big problem viewing "nipples" or "genitals".
It comes back to the POV decision on where to draw the line. Your example is simple and clean, but it would not be sufficient for the ones who want the filter. As soon as you extend the rule it gets blurry, and the decision made by the rule itself will result in POV. A typical unsolvable problem, because of the wide variation in POV itself.
Simply said: any filter that works with neutral categories will not be sufficient for the need. Any filter that works with soft rules/categories can be exploited and will result in POV. The only way out would be to drop categories/labels to begin with. --Niabot 20:32, 9 September 2011 (UTC)
I would agree that it wouldn't satisfy many. As the closed-list alternatives note, the line would be drawn in a necessarily arbitrary place, and it would be a rare person indeed looking for exactly that filter. It doesn't follow that it would be useless, or that further concessions would be made.--Trystan 04:29, 10 September 2011 (UTC)

Even "any depiction, regardless of medium or perceived purpose, of human genitals or nipples" is nowhere near as clear as you imagine it to be. Which of these would qualify: [3] (not human), [4] (not actually penis-shaped), [5] (clearly bare breasts, but no visible nipple given the art style), [6] (depiction of Jesus and Mary), [7] (visible nipples), [8] (prehistoric "vulva" ideograms), [9] (unmistakably suggestive fruit), [10] (ridiculously revealing bikini)?

Babies don't count? At what age exactly do they start counting? Because unless you have a very precise answer that everybody agrees to, you'll have edit wars aplenty over where that line is. What about that last bit of fairly translucent clothing? How opaque exactly does clothing need to be to avoid classifying an image? Does [11] suffice? How about [12]? Or [13]?

More importantly, how certain are you that everyone agrees with each and every assessment you made here? — Coren (talk) / (en-wiki) 23:22, 9 September 2011 (UTC)

There will always be liminal cases when classifying. Applying the level of epistemological doubt that you suggest to every classification decision would prevent us from ever classifying anything. It's far easier to argue about whether an image depicts a nipple than whether an image is about nipples. The joy of an arbitrary label is that it can be arbitrarily refined as needed, for example by decreeing that the age of the subject is irrelevant, etc.
I don't really see it as ripe ground for edit warring, because there is a relatively strong objective basis to work from (compared to many other much more subjective decisions we make every day), and, being so arbitrary, it doesn't really cleave along controversial lines. Is someone who doesn't think that they can quite spot a nipple through a semi-transparent dress really going to be adamant that the image not be added to the filter on that basis?--Trystan 04:29, 10 September 2011 (UTC)
The problem would not be the content of the rules itself, but which labels/rules will be offered in the interface. This question is what will cause the debates. The content of a rule would be relatively safe, but the rule itself will be questioned: said to be ineffective, not going far enough, going too far. It could be made adjustable through many rules at once, but this would make things overcomplicated. --Niabot 08:32, 10 September 2011 (UTC)
"Is someone who doesn't think that they can quite spot a nipple through a semi-transparent dress really going to be adamant that the image not be added to the filter on that basis?" Yes. People have edit warred for months over whether the canonical image of a bathrobe should or should not have a human in it(!) In fact, that's the least of the problems: the bigger and more acrimonious fights over the definitions of the rules themselves are going to be where the real battles are fought (as Niabot points out above). — Coren (talk) / (en-wiki) 17:26, 10 September 2011 (UTC)
Well, to the extent that people will edit war over anything, then yes, they might war over whether a nipple is visible or not.
I agree that the more acrimonious dispute would be over the rules themselves; that was the point I was trying to make. If the labels, including a very detailed description of their scope, are set as official policy by a board resolution and not open to community editing, there isn't a particularly effective venue for such a war to take place. I'm not advocating this approach, just suggesting that it is one option to mitigate the amount of discord introduced into the WMF projects.--Trystan 17:40, 10 September 2011 (UTC)

I don't have a problem with the existence of "grey areas" (see Coren's list) or the idea that some images will be warred over. Grey areas and edge cases are central to categorizing human culture and experience, and we broadly cope with them even at the cost of occasional disputes. Most things here have grey areas; trying to exclude them isn't usually possible, which is why AFD has "no consensus" as a common finding. Involuntary censorship is a concern, but grey areas and edge cases aren't. If we specify a selection of "is" and "isn't", then the grey areas will resolve well enough. Our existing policies have far more grey and they seem to manage. FT2 (Talk | email) 09:25, 13 September 2011 (UTC)

A filter with "yes" or "no" as the only two options has no grey areas. An image is either clean (can be viewed with the filter on) or dirty (can't be viewed with the filter on). If the discussion splits 50/50, 55/45 or 45/55, you directly make a huge mistake that is pretty far from any kind of consensus, especially if you are forced to make the decision now and not later. That is one big issue that stems from non-neutral judgment being expressed as yes/no itself. --Niabot 09:54, 13 September 2011 (UTC)
FT2's point is that there are images whose classification as "clean" or "dirty" will be difficult, because the image is in the gray area between "white" and "black". Overall, I agree with FT2's belief that the community can handle this. WhatamIdoing 15:39, 13 September 2011 (UTC)

So-called "analysis" of the German Wikipedia opinion poll

Two paragraphs in the section "Analysis" on the content page read:

>>A wiki-referendum this past week on the German Wikipedia asked simply whether editors want such a feature to be available on that project. Over 80% of participating editors do not.[1] The German poll differs from the committee's referendum in attracting far fewer responses and not providing respondents with the protection of a secret ballot. It has also just allowed users with at least 200 edits to vote, while the foundation's referendum was open to anyone with at least 10 edits in a wikimedia project. In consequence, the german referendum represents the active community.
The German poll asks a question, which can be answered with yes or no. A question the foundation failed to ask. The initiators of the german referendum did not have access to the mail adresses of all users via the database. If the WMF will pay the DE:WP a referendum on a secure server for a secret ballot, we would glady accept that.<<

I don't know who wrote this, whether it is the opinion of the Wikimedia Foundation or of a single unattributed editor of the content page. Also, I don't understand who "we" is. However, some of the statements made here are wrong or misleading:

  • The "referendum" on the German Wikipedia did not run "this past week", but runs over three weeks.
  • The German Wikipedia opinion poll is not finished yet, but will be finished on 15 September 2011.
  • The German Wikipedia opinion poll does not ask whether editors want such a feature; it asks whether editors want the "feature" (image filter and local filter categories) not to be implemented on the German Wikipedia.
  • Since the German Wikipedia opinion poll is not finished, it is unknown yet how many participating editors will vote for or against the proposal.
  • Since the German Wikipedia opinion poll is not finished, it is unknown yet how many responses it will attract.
  • Opinion polls on the German Wikipedia are open to votes from anonymous accounts. There are no opinion polls with secret ballots under the rules of the German Wikipedia, rules which have been established in opinion polls of their own. I can't see that voters are not sufficiently "protected" in this system.
  • The German Wikipedia voting rules say that editors are eligible with 2 months of activity on the project, and 200 edits in the article names space, and 50 of these in the last year.
  • The Foundation's referendum was also open to developers, staff and contractors of the Foundation, and to current and former members of the Board of Trustees and the Advisory Board.
  • I'm not convinced that the German Wikipedia opinion poll represents the active community. There are quite a number of voters who have not contributed much more than 200 formal edits, such as orthographic corrections. Some very active and engaged editors, on the other hand, tend to ignore opinion polls altogether. Actually, opinion polls represent no more and no less than those who vote.
  • As explained above, I don't expect that the German Wikipedia would gladly accept holding a referendum with a secret ballot, at least not under the current rules for opinion polls.

--Rosenkohl 21:05, 9 September 2011 (UTC)

Thanks for the corrections, updated.
Feel free to edit the page directly. Nothing on the page represents the opinion of the WMF -- it is a synthesis of community feedback so far, from the various discussion pages and forums. Eingangskontrolle added the 'we' sentence. I would say it belongs on the talk page instead, though if there is a serious request by a group on DE:WP to run a new WMF-hosted poll, that would be worth linking to and mentioning in the analysis. SJ talk | translate   23:55, 9 September 2011 (UTC)
The text above was written by multiple people. I suspect that some of it was accurate at the time and is only badly expressed, e.g., the first phrase probably meant that the poll began this past week, not that the duration was one week.
Other aspects are indisputably accurate: The chance of that poll attracting more than 24,000 respondents is zero, it requires participants to have made 20 times as many edits as the WMF poll, and it is not a secret ballot. The differences in the way the polls are conducted may account fully for the differences in their results. WhatamIdoing 18:53, 10 September 2011 (UTC)
To prove that, we would need the results of the "referendum" by language/project. But so far they haven't been released, and you also have to consider that there are two big differences between the two polls:
  1. A completely different question (the importance of something vs. a plain yes/no).
  2. A referendum with only a description versus a poll which lists support and oppose arguments.
--Niabot 20:34, 10 September 2011 (UTC)
Rosenkohl is mixing up a lot of issues in order to construct something.
  • He is complaining that the poll is open for votes with anonymous accounts. Thats also true for the referendum. This poll represents the active community perhaps better then the referendum, as the voters had to be active with a certain amount of edits in the last month, not just 10 edits at anytime in the last years or no edits at all (see the rules for the "referendum"). Both actions do not represent the users who did not care, they both represent only those who vote.
  • As the WMF has already decided for us, that they want to introduce the filter, the question in the German poll was limited to the question if it should be implemented in the German wikipedia. The difference to not wanting the filter is marginal in my opinion. We would be happy if other communities would start a similar move.
  • The Germanspeaking community would like to have a secret ballot about the introduction of image filter.
  • The organizers had no access to the email addresses of all users to send invitations to everyone, so the only means of attracting voters is via the regular pages used for this purpose for all votes and polls in the DE:WP.
  • The rules for this poll were not changed for the purpose of this poll, unlike the rules for the "referendum", which clearly intended to include as many casual voters as possible.
  • We are confident that this poll will attract a very high number of voters.

--Eingangskontrolle 22:39, 10 September 2011 (UTC)[reply]

Eingangskontrolle, I believe that Rosenkohl is actually saying that the German-speaking community would not like to have a secret ballot. You say it would, but offer no proof; Rosenkohl says it would not, and offers proof (existing policies prohibit it). It is impossible for me to say that you are correct. WhatamIdoing 02:18, 12 September 2011 (UTC)[reply]
Sj, thank you very much for your careful clarifications on the content page, and for the invitation to edit there.
Eingangskontrolle, I don't complain at all that the poll is open to anonymous voters; on the contrary, I believe the option to vote anonymously in a public ballot can guarantee much better protection for the voters than any secret poll can, where it is often uncertain, for the individual voters and for the public, who really has access to the raw vote data. I may also point to the opinion poll from December 2010, de:Wikipedia:Meinungsbilder/Offene oder geheime Wahlen zum Schiedsgericht#Frage 2: Soll die Stimmabgabe bei der Schiedsgerichtwahl geheim sein ?, where a 2/3 majority opted against a secret ballot in the arbitration committee election. Greetings --Rosenkohl 19:25, 12 September 2011 (UTC)[reply]

Editing nitty-gritty[edit]

I took the liberty of merging "no such feature" and "let others continue to do it", trying my best to condense them - also adding some bits of my own for what I perceive as clarity... I removed one point about how this option would have more publicity because I didn't understand it. The edit is here - feel free to fix anything I broke. Wnt 16:36, 10 September 2011 (UTC)[reply]

I'm a bit confused by "Outside filtering schemes and related measures (such as monitoring) will inevitably be used for purposes most of us would deplore; something we develop ourselves might in the end be more acceptable to most of us than more restrictive filtering by outsiders." Isn't this exactly the opposite of what people have been arguing? It's certainly the opposite of what I have been saying. I don't mind if someone wastes their time doing something ethically reprehensible – I don't want us to do it.

Specifically, if a "Safepedia" was to start to provide a filtered (according to whichever criteria they think potential customers would want) version of Wikipedia, that'd be perfectly acceptable to me as long as we are not helping them do so: readers will always be able to get to the "real" Wikipedia (or, more accurately, they would not be prevented from doing so any more than they currently are). Hell, there might even be a reasonable business model in doing so; it would potentially add enough value for some readers that they'd be willing to pay and/or suffer ads for it.

(Personally, I'd think such a business is doomed to fail: I'd bet a bundle that most of the tiny minority of people clamouring for a filter wouldn't pay to have it inflicted on them. The moral minority isn't concerned with protecting themselves from the world. In fact, I'd argue that the very fact that no such service already exists is ample proof that the supposed target audience for filtering doesn't even exist in the first place as a significant group. A business would need to justify the millions of dollars' worth of editor time and effort that the classification exercise would require.) — Coren (talk) / (en-wiki) 16:59, 10 September 2011 (UTC)[reply]
It's an ends-and-means issue.
For some people (e.g., you), the means are extremely important: you must remain untainted by refusing to participate in the process of creating a filter, no matter what the end results of your decision are.
For other people, the ends are more important: if by creating a mild, least-offensive filter, we remove the incentive for someone else to create a very oppressive filter, then perhaps the ends (readers see whatever they want) justify the means (creating the filter ourselves).
It is the same type of decision that one makes when faced with a murderer: The committed pacifist will not—indeed, can not—kill the murderer himself, even to save the lives of dozens of innocent people. Other people would kill the murderer with barely a second thought.
Neither position is morally untenable, and most people in this dilemma avoid it by trying to argue the facts: How do you know that the murderer really will kill those people? How do you know that creating the filter will result in fewer censorship tools being produced? We don't—but we can answer the general case: "Faced with a murderer, then I would choose this", or "Faced with the threat of oppressive censorship, then I would choose that."
And I disagree with your claim that "Alice's" request for a filter is all about preventing "Coren" from seeing images. We've seen several comments just in these discussions from people who want a filter for their own personal use, exactly like Firefox offers an add-on to stop en:Rickrolling[14] and people voluntarily install it so that they don't have to see what they don't want to. Zero comments from the arachnophobes indicate that they want pictures of spiders kept off your computer screen; it's all about keeping what they don't want to see off their computer screens. Surely it is no more morally reprehensible for the reader to voluntarily keep Rick Astley's videos off his computer screen than it is for the reader to voluntarily keep en:Goatse vandalism on Wikipedia off his computer screen, or for the arachnophobe to keep spiders off his computer screen. WhatamIdoing 19:12, 10 September 2011 (UTC)[reply]
Yes, but the position you hold presumes that the absence of a filter we implement will or is likely to cause another, greater, filter to come into existence; and that us making a "gentle" filter will in fact reduce the probability that a stronger one is made. I'm saying that (for a number of reasons) it is unlikely or impossible for someone to make it without our help – and that even if it were, I'd rather have someone else maybe making a filter than us certainly making one. The alternatives I see are "we don't make one and maybe someone else manages to make one" and "we make one, and someone else makes a stronger one thanks to our efforts". — Coren (talk) / (en-wiki) 20:32, 10 September 2011 (UTC)[reply]
Thanks very much for that philosophical explanation :-) For the German readers of this page - and I know there are some - I would like to explain that the conflict described by WhatamIdoing is much the same as the one Max Weber described with his terms "Gesinnungsethik" (ethic of conviction) and "Verantwortungsethik" (ethic of responsibility). Most contributors to the German Wikipedia are - like most Germans and Coren - Gesinnungsethiker, not Verantwortungsethiker :-) Adornix 19:35, 10 September 2011 (UTC)[reply]
This doesn't get to the point. You can't simply draw a line between the two terms, nor can you assign supporters and opposers to one of them. That wouldn't be true. What you said would imply that the opposing group only has good will but doesn't think about what the result might be. That simply isn't true. I would suggest implementing the filter, since it would have benefits for some readers. Of course it would. But I also have to think about the disadvantages and problems that will unavoidably arise. The current proposal has so many problems that it is at least a tie. Considering the effort (money/time/...), I can't consider it something that would help the goal of the project. Let's invest the time to improve our articles, create better illustrations, and communicate. That would definitely help. --Niabot 20:31, 10 September 2011 (UTC)[reply]
I retained the point in editing because I can understand it, though I don't agree. To give a simple example, a Wikipedia "filter" category will probably classify a gay kiss and a straight kiss in the same category. However, if we leave it up to a right-wing Christian decency group to set up a filter, they almost certainly will not. I understand the complaint, but ultimately, I would rather not be responsible for filtering than try to butt into the Christian group's business and try to change their beliefs by force. Wnt 21:40, 10 September 2011 (UTC)[reply]

Text re: separating cataloguing and labelling[edit]

The following text was removed from the point under the label-based filter option about keeping the labeling system separate from the cataloging system: "This may help to prevent conflicts between users working at these two distinct purposes, makes it easier to maintain good practices for applying categories, and allows us to acknowledge and minimize negative effects of labeling." I'm not sure I understand the edit comment: "Not possible in any way... The system is the same, except that the technical implemention is different."

I have reintroduced the text, but would be interested in improving it if the intent is not clear. The bullet is intended to convey that if both "filterers" and "categorizers" are using the existing category system, they will be in conflict because they are trying to do two incompatible things using the same system, but that if labels are implemented as a separate system, that difficulty will be avoided. The categorizers can continue to maintain good practices without being negatively affected by those trying to apply categories for a different purpose. The part about acknowledging and minimizing the negative effects of labeling ties into the closed-list proposal that would attempt to identify and avoid certain types of labels that are particularly damaging.--Trystan 21:22, 11 September 2011 (UTC)[reply]

I meant that applying non-neutral labels will do the same ideological damage as using categories like [[Category:<someprefix>-violence]]. That's why I found the wording "allows us to acknowledge and minimize negative effects of labeling" disturbing. You will only minimize the effect on the category system, but not the negative effects that will occur anyway if you label content in a non-neutral way:
  • Possible misuse for censorship
  • Discrimination against content
  • Edit wars over categorization/labeling (no compromise option: yes or no)
  • Still going against the project principles, and so on...
It should be made clear that it only serves to avoid a second, parallel category tree. --Niabot 22:41, 11 September 2011 (UTC)[reply]
"applying non-neutral labels will do the same ideological damage as using categories" - That's very much the point I was trying to make, but viewed from the other side. If we try and avoid the negative effects of warning labels by using our category system, people will be motivated to simply apply categories as warning labels. It would be very difficult to try and mitigate some of the worst effects of labeling if we don't admit that is what we are doing. If we call a spade a spade and have a separate system for warning labels, we can perhaps set and enforce stricter rules about how they are used.--Trystan 23:17, 11 September 2011 (UTC)[reply]
I think it will be far easier to limit Category:Photographs of sexual intercourse to images that are (1) photographs and (2) show humans engaged in sexual intercourse (which we do now, and which we would continue doing in exactly the same way if that category was listed in a filter) than to set and enforce strict rules about what belongs in "Filter:Sex and porn". WhatamIdoing 02:22, 12 September 2011 (UTC)[reply]
Perhaps, because images tend not to contain sexual intercourse in a way that is incidental to the central subject. To compare apples to apples, [[Filter label: Photographs of sexual intercourse]] would be no more difficult to maintain. Nudity- and violence-related categories would likely be quite a bit more problematic, because it often is incidental, or a sub-aspect of a broader theme.--Trystan 04:19, 12 September 2011 (UTC)[reply]
That's exactly what I meant. Seen from the point of view of the end result, there is no real difference between categories used as warning labels and actual warning labels. The only advantage would be that they are actually called labels. But that's it. This doesn't solve any of the other problems the filtering system has. --Niabot 06:41, 12 September 2011 (UTC)[reply]
Even [[Filter label: Photographs of sexual intercourse]] is difficult to define. Does autofellatio count as sexual intercourse? You mentioned that sexual intercourse would be limited to humans. What about bestiality, or monkeys doing it on the Discovery Channel? When has a photograph been photoshopped hard enough to cease being a photographic representation of reality? I personally believe the only way forward is very broad criteria like [[Filter label: Human Nudity]], filtering the Venus de Milo as well as the Venus Awards, and [[Filter label: Sexual Activity]] (but then again, what is sexual to some is a turn-off to others). Furthermore, such filters would be so broad that no one would use them for anything but censorship. I just think that the concept cannot work. --Arcudaki 11:21, 13 September 2011 (UTC)[reply]

Post mortem[edit]

I'm not convinced that any of the post-mortem complaints (e.g., was "referendum" the right word?) belong on this page. WhatamIdoing 23:01, 12 September 2011 (UTC)[reply]

I'm quite convinced that none of your comments belong here either. I guess we'll all have to deal with it. --84.44.183.86 17:00, 13 September 2011 (UTC)[reply]
I don't really think that non-constructive comments like this will gain anything.--Jorm (WMF) 20:54, 14 September 2011 (UTC)[reply]
Is that the official answer to all opposition? --Eingangskontrolle 21:15, 14 September 2011 (UTC)[reply]
No, it's my personal answer to someone making a personal attack. Ease up, ne? We're all on the same side.--Jorm (WMF) 21:27, 14 September 2011 (UTC)[reply]
Did you ever read a sounding comment by WhatamIdoing? (nomen est omen?) --Niabot 21:29, 14 September 2011 (UTC)[reply]
"A sounding comment" isn't a sensible phrase in English. Is this perhaps a too-literal translation of a German idiom? WhatamIdoing 17:33, 16 September 2011 (UTC)[reply]

Final Results of the WP:DE Election[edit]

86,13 % of those voting voted against the image filter. 13,77 % voted for the introduction of the filter. (Final Results Section of the WP:DE Election). --Liberaler Humanist 16:17, 15 September 2011 (UTC)[reply]

  • de:Wikipedia:Meinungsbilder/Einführung persönlicher Bildfilter/en is not an election but an opinion poll,
  • the voters were voting neither "against the image filter" nor "for the introduction of the filter". In fact, voters could vote on a proposal demanding that the image filter and filter category should not be implemented in the German Wikipedia,
  • 86.23 % (not 86.13 %) of those who voted for or against this proposal voted for the proposal, and 13.77 % voted against the proposal

--Rosenkohl 10:04, 18 September 2011 (UTC)[reply]

A little correction: 13.77 % voted that they would have nothing against the introduction of the filter. Considering the comments, not many opposing votes favored the introduction of the filter. I guess that should be made clear. --Niabot 23:41, 18 September 2011 (UTC)[reply]

A smaller survey on fr.wiki[edit]

fr:Discussion Wikipédia:Sondage/Installation d'un Filtre d'image#Bilan du sondage

Although only 80 or so people participated, far fewer than on de.wiki, proportionally the opposition is pretty similar, at around 80%. I suppose it's a matter of hard-core users or content creators vs. the average reader. ASCIIn2Bme 21:46, 4 November 2011 (UTC)[reply]