Talk:Censorship

From Meta, a Wikimedia project coordination wiki
see also Wikimedia_projects_blocks.


From 2003-2005[edit]

We can define censorship as an action against freedom of speech, freedom of speech being limited, of course, by respect. It seems legitimate to say that anything that basically reflects reality is respectful, as it shows things as they are. Censorship is not necessarily so obvious; it can simply be embedded in a cultural prejudice. This notion of cultural prejudice is quite important for understanding censorship, and for understanding how a text reflects reality as well. History shows clearly that content was selected and removed according to what was or was not commonly accepted. For instance, if you look in the 1911 Britannica, you won't find words considered too offensive, like "fuck", or a definition of sodomy. But you can find a long text about "negro", [[1]] saying clearly that "negroes" are people of reduced intelligence (which, of course, I deny). Today, an article speaking about black people as the 1911 Britannica did would clearly be censored, and this is justified, as it doesn't reflect reality. And today people are more inclined to define the word "fuck" clearly, as it is, and no objection is possible, as it is just a mirror of what exists. Unfortunately, some parts of reality still remain unacceptable to people, as our cultural and religious heritage weighs heavily. For instance, a picture of naked people may be considered offensive, especially for children (it is immediately linked to porn), when it is only a reflection of our own humanity, one that we have learned to deny.

Why are you being unconventional?

Personally I find the taboos against nudity rather silly, and my only distress is that I've internalised the same taboos. --mrd

This is an interesting discussion that reminds me of a couple of sayings that have become trite in some circles: first, 'freedom of the press is freedom for the owner of the press', and second, 'freedom of speech does not mean you have the right to shout fire in a crowded theatre'. In the American tradition the liberty interest has been held in the highest esteem; the expression of the individual voice, no matter how outrageous, has been held as sacred. This has led to a free and open debate, but it has also led to the American Civil Liberties Union defending Nazi holocaust-denial theorists. In other countries, which owe their tradition of human rights to the more modern rights movement, we see that there are recognized limits on freedom, as the will of the individual is balanced with the needs of society, such as the w:limitations clause in the w:Canadian Charter of Rights and Freedoms. In these systems, inspired by the w:Universal Declaration of Human Rights, it is the balancing of individual and group rights that concerns the lawgivers. How to apply that to a collective endeavor such as Wikipedia is the question worth discussing. Alex756


Deletionistic or Exclusionistic?[edit]

On Wikipedia, because Wikipedians who censor images are in fact excluding them, some have called censorship a form of exclusionism.

Others, such as the Association of Inclusionist Wikipedians, regard censorship as a form of deletionism.

Wikiconservatives, advocates of excluding imagery considered inappropriate, deny this.

Canadianism 20:18, 2 October 2005 (UTC)[reply]

From 2010[edit]

Letter to FOSI[edit]

It would help to receive some expert advice. See commons:User:TheDJ/fosi. They may represent one side of the debate, but I am sure they will give good info to help us make a decision. --John Vandenberg 10:29, 9 May 2010 (UTC)[reply]

A Librarian's view[edit]

In our sphere, we librarians also decided this issue some time ago: labeling is censorship. Those who wish to censor have their own purposes, and have the ability to devise their own methods. We have metadata on our objects, both titles and categories. We have the responsibility to provide information about our images as well as the images themselves, for educational value--and usefulness generally--depends on context.

Some people in the world think, rightly or not, that there is a need to censor certain kinds of material for at least certain audiences. What we already have should provide sufficient information for any ordinary need for censorship, formal or informal. More sophisticated systems can use image analysis.

I myself think there is in fact no genuine need for censorship of any sort. But we would be wrong to adopt technical measures either to prevent the censoring of our material, or to promote it. WP has, as it ought to have, a free license that deliberately permits people to fork or modify or select from it. Our purpose is to provide free material in every sense of the word, and this freedom includes the ability to make good or bad use of it and we do not judge that. We differ in this respect from many other good sites, many of which are free except that they prohibit commercial use; we have always maintained our lack of distinction between subsequent uses as a basic principle, and should continue to maintain it. DGG 15:38, 9 May 2010 (UTC)[reply]

I've never found this argument (labeling = censorship) very thoughtful, since labeling really is categorization and classification, one of the central librarian skills. So the question is: why do intermediaries who think of 'labeling' as a way to 'filter' rather than to 'find and discover' always use such crappy classification schemes? Why can't they work within existing, vibrant, widely-used ontologies? SJ · talk | translate 16:41, 10 May 2010 (UTC)[reply]
The labeling referred to is labeling such as a sticker on a book saying that the book contains violence or sex. That has the effect of deterring users from selecting the book, for fear of being publicly seen with it. Applying classification terms to a book, or arranging it by subject, is done on a taboo-neutral basis across all subjects, and we can certainly do the same with tags on images. DGG 17:53, 10 May 2010 (UTC)[reply]
Hear, hear. I object to being made a forcible censor by having to label my works, particularly if we use something like the ICRA, which requires us to tag gambling and drinking - categories which would censor whole ranges of encyclopedic material, from the crucifixion (the soldiers cast lots for Jesus' clothes) and the Paris Commune (much of it planned in alehouses, which appear in much art) to the infamous prints of card-playing dogs. If people want to censor Wikipedia, don't force Wikipedia's editors to do their work for them. Adam Cuerden 22:17, 9 May 2010 (UTC)[reply]

Some reflexions following the censorship polemic of May 2010[edit]

For the sake of argument, let's consider for a moment that for each image existing on Commons or Wikipedia, someone on this mailing list is offended by its presence.

Can we find a solution within this framework?
If we submit to every sensibility, the projects become a visual desert, and our encyclopedic goal is diminished.
If we ignore every sensibility, absolutely everyone is offended.

We need an intelligent solution that respects both goals: a database of knowledge that is complete yet does not offend.

The best proposal I've heard is this one:
Neutral, descriptive categories (i.e., "nudity" instead of "pornography") allowing for PERSONAL filters set in one's profile: this way we are user-friendly enough to allow SELF-protection for the prude (or prudent) user, but we only remove images over internal or legal concerns, leaving the whole censorship debate up to the user's choice (toggled in their settings). By self-protection I mean that each person is responsible for choosing their own filter. We don't want to impose filters, or help impose them. The "protected" persons must agree to this protection. Thus, I think it is not our role to help impose authoritarian censorship, and we can draw a clear line.

At the very maximum of cooperation, we could perhaps propose some "preset" profiles to comply more easily with common external policies: for example, a "no humour" profile, or "no religion", "no nudity" or "no violence" profiles, which would automatically hide the corresponding categories, such as BDSM + sex + porn + etc. This would be a realistic concession to censoring institutions, where we choose to be friendly enough to be acceptable to them. However, each user should remain free to design and modify their own filter, at least on our technological side. If an institution wants to force its users to set their filters this way or that, it is out of our hands. We cannot and should not control or choose what happens between the chair and the keyboard.

This solution seems extremely easy to implement since the categorization is already mostly done.
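To make the proposal above concrete, here is a minimal sketch of such a per-user category filter. The category names, preset profiles, and function names are purely illustrative assumptions for this discussion, not an actual MediaWiki feature or API:

```python
# Hypothetical sketch of the personal-filter proposal: an image is hidden
# when any of its descriptive categories appears in the user's blocked set.

# Assumed preset profiles, mapping a profile name to the categories it hides.
PRESETS = {
    "no nudity": {"nudity", "BDSM", "sex", "porn"},
    "no violence": {"violence", "gore"},
    "no religion": {"religion"},
    "default": set(),  # non-logged-in users: no filtering by default
}

def visible(image_categories, blocked_categories):
    """Show an image only if none of its categories are blocked."""
    return not (set(image_categories) & set(blocked_categories))

# A user starts from a preset but remains free to customize it.
my_filter = set(PRESETS["no nudity"])
my_filter.discard("nudity")   # the user re-enables one category...
my_filter.add("gambling")     # ...and blocks another of their own choosing

print(visible(["painting", "nudity"], my_filter))       # shown
print(visible(["card games", "gambling"], my_filter))   # hidden
```

The design choice this illustrates is the one argued for above: the categories themselves stay neutral and descriptive, and all the filtering logic lives in the user's own settings rather than in the content.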

This would leave but one issue: what happens for non-logged-in access? By default, no filter should be active. So what would happen if a minor accesses a page that is pornographic according to the local standards of their country? Are we legally responsible? (A jurist might be needed for each country to answer that.)

Can we avoid these illegal situations without imposing censorship? (Which is what we must not do if we don't want to lose freedom in the libertarian oases that Wikipedia and Commons are.)

Should we filter the default non-logged-in profile? Should the default profile be child-friendly? And if so, according to which laws?

Is this paradigm of thoughts the best or are there totally different but satisfying approaches to the problem?

I don't know, but once again let's push the thought experiment to the limits:
How can we please totalitarian countries like China or Iran without corrupting our libertarian wish to reach everybody with ALL of human knowledge? If someone knows the answer to that, they are a saviour. Perhaps it has already happened at some point in history?

The only things I know of are smart, indirect cultural resistance, or a massive common will of a country's inhabitants to allow such an entry of forbidden ideas. That is, a proactive, directed action that risks being perceived as hostile by the moral, political or spiritual leaders, and that may be disrespectful of the fundamentalist culture anyway.

Do we have a right to invade a culture (or government) with ideas they don't want to know? That's a dangerous colonial question. A simple yes is brutal, and history has shown its undesirable consequences. A no may be cowardly and hypocritical passivity at a time of crisis when enslaved strangers were asking for our help, sometimes unknowingly.
Up to what point are we ready to assume a messianic role and accept hurting the sensibilities of dominant subcultures?

And if we consider this intrusion into the self-determination of a culture necessary, if we are not absolutely neutral after all but wish for equality of access to knowledge for every culture instead of closing our eyes to obscurantism, then we must define very well the few universal values that we are fighting for, the ones we will never deny, whatever the pressure and the situation.

After all, as I said at the beginning, we CAN adapt the WMF to every external demand, ensuring we have no enemies, hoping that what is left of our prime mission is enough to do at least a little good. Or we can fight entirely for the greatness of our vision.

Personally, I chose the second one without looking back: I want to think, live, talk, act and hope for a great goal, not just for short-term survival.

So I think that China and Iran are a big problem, since we can't comply with their demands, but we can't declare open war either without resorting to secret plots or official war.

Since I don't think most of us want to radicalize the community into subversive activists (but maybe a poll is needed, who knows?), we won't enter an information war with those censoring countries. We will have to either ignore them and the question, or accept being censored or blocked, without solving the problem, hoping for better times (and leaders) to allow a new dialogue with those cultures, which may happen in centuries... or never, before a worldwide nuke.

Currently we lack solutions. We only have bad options with those countries.

I think we should contact Google and ask whether they can share some of the strategic options they found when dealing with censoring countries. We need better ideas, because we are currently forced into a grave dilemma. Pronoein 20:17, 10 May 2010 (UTC)[reply]

There is no neutral, consistent basis for selecting which categories or tags to use, unless we apply the same level of categorization or tagging equally to all images. If we follow sexual taboos, which ones do we follow? Some Moslem and non-Moslem groups object to the depiction of any part of the anatomy, some to the depiction or exposure of certain parts only. Some extend it to males. Some object to the portrayal of certain objects in an irreverent manner--there have been major commotions over such displays of Christian symbols in artworks.
Different cultures have different taboos on the depiction of violence, taboos not connected with religion. And then there's the field of political depictions: there are often taboos or even legal restrictions against some manners of display of national symbols--the US flag will serve as a convenient example.
How can we go by majority views here? Protecting minority interests is part of the concept of free culture. If we go by a "significant" minority, what counts as significant? Do we mean significant in worldwide terms?
There is no way to limit censorship. The only consistent positions are either to not have external media at all, a position adopted by some religious groups, or to not have censorship at all. DGG 17:42, 10 May 2010 (UTC)[reply]