User:Eloquence/Beliefs


After a long time on Wikimedia, I think I should write down some principles that I consider important for a large-scale online community to work, some risks that I see, and some ways to avoid them. This page may eventually grow, but right now, it is just a brief presentation of some core beliefs, none of which are unique, but which may serve as inspiration, or to improve the understanding we have of each other. This page is POV and personal in nature, but I appreciate comments on the discussion page, and edits within reasonable limits (these are my beliefs, after all).

Neutrality is the seed of change

The Neutral Point of View policy is the unquestionable dogma of Wikipedia. It is, as we often say, not negotiable. Accept it or leave. Violate it and be banned. Be neutral or perish. If you're not with us, you're against us. You get the idea. In an environment where everything is editable, where we want to be bold and ignore all rules, this is the one policy to rule them all and in the wiki bind them: the atheists and creationists, the Israelis and Palestinians, the socialists and the libertarians, the Nintendos and the Sonys, the dwarves and the elves.

But if NPOV is dogma, then why do we fight so much over it? If the Bible is God's word, then why have people killed each other over it? The issue, of course, is interpretation: Do we really have to include the views of the creationists in the article about evolution? Is it unfair for an article to contain a lot of criticisms about a person, even if they are properly attributed? Should we correct common misconceptions, or do we have to treat them as valuable opinions? What are common misconceptions?

I believe that there are reasonable answers to these questions. For example, I think that the first question we should ask about any article is: what field does this article belong to? Is it about matters of faith or of science? If it is about faith, which religions? If it is about science, which disciplines? In the case of evolution, we can state very simply that this is a scientific topic, and the standards of science apply. Creationism is largely irrelevant in the article about evolution, as all experts in the field utterly reject it. Creationism itself is a topic that falls into both fields -- religion and science. Insofar as creationists challenge established scientific conclusions, science must be given the opportunity to answer.

As for controversies and criticisms, I strongly believe that neutrality should largely be considered separately from balance, and that the general answer to a lack of balance is: Add what's missing, and summarize sections regardless of their content when the article as a whole gets too large. Don't split away criticisms just because they dominate an article -- expand the article and then create topic-specific articles that allow the reader to zoom into any aspect that interests them.

I believe that, yes, we should challenge misconceptions. If virtually any reader who is educated about the facts of a matter will be convinced by them, then we do not need to treat the misconception as being as valid as its refutation. For example, if millions of moviegoers think that cars explode when they crash and that they need to run away as fast as possible, it's fine for us to point out that this is not the case.

NPOV, if properly used, can lead to great articles which give an overview of all relevant views on a topic. Yet there are at least five problems with it:

  • Nonsense overload: When rationalism is confronted with irrationalism, an article will accumulate an endless number of claims without logic, evidence, or a basis in reality. Since irrationalists do not have to use logic to defend their beliefs, these exchanges of arguments and responses can essentially go on forever, and because of NPOV, we always have to include both sides -- after all, there are people who believe those things.
  • Interpretation: My personal interpretations of NPOV above are exactly that: my interpretations. Edit wars about these issues happen constantly.
  • Stability: Articles like Creationism, Circumcision, Homophobia, Jesus, Nuclear power, and Child sexual abuse go from one NPOV dispute to the next; with brief interruptions, they are constantly being fought over. This will continue to be the case; the only hope is that, in the future, it may at least be possible to take stable revision snapshots when there is no ongoing dispute. But even then, people will call the past snapshot into question and demand that a new one be taken.
  • Length: NPOV articles tend to get very long. Theoretically, it should be possible to structure an article so that the reader can go from an overview to any level of detail. In practice, most articles simply keep growing and growing.
  • Lack of emotion: NPOV articles can lead to a long-term improvement in education and, through this, to social progress, but due to their lack of emotional content, they are unlikely to ever be an immediate causative factor of change. The lack of emotions also makes it more difficult to use them as learning tools for children, where the emotional association of new information is essential for memories to be stored.

Given all this, you might think that I oppose NPOV. The exact opposite is the case. I believe it is a brilliant policy and the only way for Wikipedia and our other projects to work if we want to retain the principle of maximal openness. However, one should not assume that NPOV is the only valid way to present information about a certain topic.

We allow anyone to take our content and develop it in ways which are not compatible with our own rules. NPOV is essentially the most inclusive policy that still makes sense. Therefore, in theory, it should be possible to build the next editions of both the Skeptic's Dictionary and the Catholic Encyclopedia on the basis of Wikipedia content. You could use and develop a Wikinews story on Indymedia or on Little Green Footballs.

If Wikimedia is successful, I believe that this kind of forking from NPOV to POV will become the norm rather than the exception. There will be a relatively stable article about creationism that will be a very scientific and refreshingly partial dissection of this bizarre belief system, and it will be partially based on Wikipedia material. There will be closed groups with strict rules and shared beliefs who will use our articles to advocate certain agendas.

One could be fearful and argue that this should not be done, that we are arming both sides instead of only the right one, that we're giving ammunition to the creationists as well as the biologists. But the right side, whichever one it may be, does not need to fear this, because surely, if each side develops a perfect presentation of its point of view, the correct view will prevail. If we cannot make this basic assumption, humanity is doomed to live in ignorance forever.

I therefore believe NPOV is a useful and desirable dogma, and that we should create the best possible NPOV article on any subject. We are not just building a big house, we are also giving everyone bricks and mortar. We are revolutionaries without a cause. This makes us harmless from the point of view of those who hold power and seek to prevent change. But change is inevitable. And slowly but surely, we are laying the groundwork for it.

Consensus and compromise are valuable

I was very skeptical about consensus-finding as a decision making process on Wikimedia when I joined the project. My belief was that, on virtually any controversial issue, this would be impossible, and people would just argue endlessly without reaching any conclusion. This belief was shaped by the fact that both of my parents are members of the so-called "1968 generation", for whom political discussions that lasted all through the night with no outcome whatsoever were common. Clearly, that is not desirable.

However, in practice, I found that consensus and compromise can be achieved, even in cases where it seems to be very difficult. If you approach the problem with an open mind, and you are genuinely open to accepting a solution other than the one that is clearly laid out in your mind, you will often find that there is one which you find only slightly problematic, and that will peacefully resolve matters. This is, of course, not easy, and we all will fail at it -- sometimes due to our own behavior, sometimes because the people we are talking to are unwilling to compromise.

Furthermore, there are matters on which we should not compromise. Neutrality is such a principle, as it is the very foundation that makes it possible to achieve consensus in a large, heterogeneous community. It is not desirable to corrupt core principles simply in order to avoid dispute.

Voting is a valid last resort, but needs a defined framework

While I have been pleasantly surprised by the achievement of consensus in situations where I considered it impossible, there are situations where it is unattainable. There are cases where we simply cannot compromise, because to do so would be a violation of our principles; there are cases where the group of editors is too large and the spectrum of opinions too diverse to find a middle ground.

I believe that voting is a valid last resort if:

  • it has been preceded by a discussion period with consensus as its goal;
  • all the arguments are clearly laid out;
  • the options themselves have been openly debated;
  • we can agree on what thresholds are required for a decision of a certain type to pass; and
  • the voting system is simple but reasonably immune to strategic voting (e.g., approval voting).

I believe that our current approach to voting is somewhat schizophrenic: We use it all the time, but we refuse to admit it, or to clearly lay out the rules. We refer to voting with an 80% threshold as a "consensus decision", when it is not consensus by any reasonable definition.
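To make the arithmetic concrete, here is a minimal sketch, in Python, of how an approval-style ballot with a supermajority threshold could be tallied. The function name, the ballots, and the 80% figure are purely illustrative assumptions on my part, not a description of any actual Wikimedia process:

    # A minimal sketch, assuming approval voting: each voter marks every option
    # they find acceptable, and the leading option must also clear a
    # supermajority threshold (80% here, purely as an illustration).
    from collections import Counter

    def tally_approval(ballots, threshold=0.8):
        """Return the most-approved option if it clears the threshold, else None."""
        if not ballots:
            return None
        approvals = Counter()
        for ballot in ballots:
            approvals.update(ballot)  # a voter may approve of several options
        if not approvals:
            return None
        leader, count = approvals.most_common(1)[0]
        return leader if count / len(ballots) >= threshold else None

    # Hypothetical example: four of five voters approve of "adopt proposal",
    # which meets the 80% threshold.
    ballots = [
        {"adopt proposal", "keep status quo"},
        {"adopt proposal"},
        {"adopt proposal"},
        {"keep status quo"},
        {"adopt proposal"},
    ]
    print(tally_approval(ballots))  # -> "adopt proposal"

The point of the sketch is only that such rules can be written down precisely: who may vote, what counts as approval, and what threshold a given type of decision requires.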

I think Wikimedia would be greatly served by a standard decision making process that distinguishes different types of decisions and defines the framework within which votes can take place. The truth is that even votes I have organized have not always followed the criteria laid out here -- it is too tempting to simply run a poll and use the result to achieve a desired outcome. But without such criteria, voting is likely to be overused and misused.

Lack of processes leads to cabals

Many people believe that documenting practices is bad, that instead, people should simply be educated by the existing community on what the established practices are, and that this is sufficient to build a healthy community in the long run. They argue that instruction and policy creep scare away newbies, that they lead to statism, and that they encourage trolling.

I strongly disagree with that notion. I believe that openly documenting practices is an absolute requirement in a wiki before they can be used as an argument, or worse, as policy, or even as grounds for banning. Otherwise, what happens is that the people who form the core of the wiki, the most active editors, will gain a very strong influence that is very difficult to challenge. I think that this is the main cause of stagnation, not the other way around.

The argument that something or other is "established practice" can quickly be used to kill discussions, revert edits, or even block users. It doesn't require backup other than that of the social community; that is, if the core group does not challenge an assertion of "established" practice, it is likely to be accepted. Individual members of the core group will often be disinclined to do so, in order to avoid conflict or to avoid falling out of favor with other core members.

It is also worth noting that, in a large wiki, you will eventually have core factions with different histories and different views on what the established practices are. These factions will engage in battles with each other: wars of attrition, where those with the most time to dedicate to the issue decide the future of the community.

Wikis are text-editing tools; something which is not text, which is not written down, cannot be edited and developed. You could argue that by not writing down rules and guidelines, you strengthen the community aspect of wiki culture. There is certainly truth to that -- the existing "club" may be more comfortable. However, if you want to build a scalable community, where new members can easily join, that does not appear to be true. A club, a cabal, leads to closed-mindedness and stagnation.

Another common claim is that, once you start making rules, bureaucracy and instruction creep are inevitable. First, it's important to understand what these terms refer to. The Wiktionary entry is not very useful, so I'll quote one of the definitions of bureaucracy provided by the American Heritage Dictionary instead:

An administrative system in which the need or inclination to follow rigid or complex procedures impedes effective action.

In other words, bureaucracy can refer to a system of overly complicated rules that are rigidly followed. Interestingly, it can also refer to a hierarchy of non-elected officials following fixed procedures -- something which I contend will result from the lack of policy. But let's look at the problem of overly complicated rules. The process of the evolution of rules from simple to complex ones is sometimes called instruction creep. This process needs to be distinguished from the notion of bureaucracy per se.

If you believe in the ability of humans to govern themselves, then you will easily agree that bureaucracy will result if there is no community-driven process for developing policies. But in practice, on most wikis, any new policy has to meet strong community approval before it becomes relevant. For example, my own policy proposal to remove personal attacks from discussions has never reached a sufficient level of support, and hence has never become established policy. It's also quite common for proposals to be made and ignored for years. There's nothing wrong with that. They are useful to reference in later discussions, as long as they are clearly tagged as proposed policies. If something is a bad idea, that, too, is useful to document.

Even though the English Wikipedia, for example, has a rules-centric culture (or maybe because of it), it has found ways to distinguish between proposed policies, semi-policies, and hard policies which can be used to ban users (such as Wikiquette and NPOV). The English Wikipedia guideline on creating new policy is, in fact, quite careful in warning users that policies should reflect nascent practices of the community, and should be built by consensus.

Instruction creep is a possible cause of bureaucracy, but should be looked at separately. It is in the nature of wikis to drive an expansion of documents. While it is easy to regulate the creation of a policy, it is harder to regulate its editing. There are, however, reasonable ways to deal with it. First of all, there's something I would call the threshold principle which generally applies: as soon as a problem in a wiki reaches a certain threshold, a power shift occurs. Those who are responsible for the problem suddenly find themselves in the minority, while those who pointed it out before -- and were ignored -- find strong support. To put it in specific terms, as soon as a policy page becomes intolerably complex, it is quite likely that pruning will occur.

This, of course, is not sufficient: it would be preferable to never reach this level of painfulness (and some pages will linger below the threshold and still be too complex). One way to deal with this issue is to prescribe size limits for policy pages, and to enforce them. Furthermore, examples can be separated from the policy itself -- into tutorials, FAQs or subpages. Finally, every policy should be carefully watched by those interested in it, and edits should be reviewed as they occur; major changes will have to find consensus on the discussion page before they are implemented.

Certainly, not everything needs to be written down. If a certain principle is never violated, and everyone except trolls will quickly agree with it once it is explained, referring to established practices is often sufficient. However, as soon as you're dealing with non-obvious cases, the argument "if there's no document, there's no law" should prevail. Reasonable people can disagree about pretty much anything. Don't whack them over the head unless you've got policy to back you up.

Now, here is my key argument: If you completely oppose voting, cabals are inevitable. That's because certain policies, due to lack of consensus, simply cannot be codified without a decision making process -- and as such, will always be nothing but mere "established practices," with all the problems described above. Therefore, I hope that people who are genuinely interested in maintaining an open community will work together to establish decision making processes, and that they will not oppose a standard body of rules that can be changed and challenged.

Cabals are harmful

I believe that openness in Wikimedia should be maximized, and that the community should generally decide the direction of the projects. I believe that, if we allow small cabals and factions to form, if we have too many private wikis and mailing lists, we sacrifice the power of true collaboration for a deceptive increase in efficiency.

There is a valid need for private groups; however, their number should be as small as reasonably possible, their existence should be known, and the processes for joining and leaving should be documented. Security by obscurity -- private channels only the small core group knows about, mailing lists that do not officially exist, and so on -- will reduce useful external input and growth and eventually lead to allegations of corruption.

Once you have secret clubs, you generally have the problem that matters are discussed in the secret club simply because "everyone who matters" is in it, even if the issues are not confidential to begin with. This, again, contradicts the spirit of open collaboration. Therefore, any private groups should be clearly defined in their functions, and moderated to serve only those functions. It should be policy that anything which can be openly discussed should be openly discussed.

Conflict is inevitable

We all desire peace. An easy way to get peace is to shut up everyone who disagrees with you. That is, of course, not the kind of peace we want. We should accept the fact that peace is not always possible, and that conflict and controversy are a natural part of human existence. Instead of denying conflict, we must improve our ways of dealing with it: by trying to stay cool, focusing on the issues rather than the person, ignoring personal attacks, and so forth.

Of course, there are reasonable ways to avoid conflicts, but if you are editing an article about homosexuality, abortion, George W. Bush, homeopathy, Scientology: you should expect controversy. You should expect conflict. You should expect disagreements. As noted in the first section, we should try to genuinely achieve compromise and resolution. And, something I am not particularly good at, we should, beyond the mere requirement of being civil, also make an attempt at being friendly and welcoming.

However, we should not brand anyone who participates in a discussion about a controversial topic as a troll, nor should we label such discussions as a general waste of time, or undesirable. They are a necessity, and those who, in good faith, make a real attempt to deal with these controversies, and to find reasonable solutions, are people who deserve a lot of respect. It is easy to run away as soon as conflict erupts, and it is easy to never challenge bullies or trolls. But it is essential that some people do not run away. The reasons should be obvious: if we do not face this challenge, we essentially abandon our policies to those who cleverly circumvent them with impunity.

Don't make enemies

In spite of the above, it is important to keep the general rules of discourse in mind. Stay civil. When possible, try to avoid creating an enemy. The person you are having a heated debate with right now might be your best ally tomorrow. In the wiki world, where debates cover the whole spectrum of human existence, it is hard to find an editor you don't agree with on something. Forgive and forget past disputes, and don't hold grudges.

There are two types of users who can become enemies:

  • Users who are hostile to the very fact that you dare to disagree with them. I strongly believe that such users need to be challenged in important cases, but you can let them have little victories.
  • Users who do not share the values of the community: trolls, people who simply want to push an agenda, and so forth.

There is no general recommendation on how to deal with such users; you can often avoid making enemies of them if you agree with them, or at least give them the impression that you are open to their arguments. They are often people who have a life history of being rejected, dismissed and bullied, in short, people who "need a hug". If you feel like playing social worker, you can do that, but don't expect any actual results from non-physical interaction other than keeping them busy. My general recommendation would therefore be to ignore these users except in cases where it matters. In these cases, you will likely find support from the larger community, so even if they do become enemies, you will not necessarily be isolated.

Finally, there are users who are not hostile to disagreement per se, but who are hostile to cold, emotionless logic. If you are like me, jump over your own shadow and try to give them what they need, namely, a positive emotional context for your arguments. People have different life histories, different upbringings, and the way they process information varies greatly. Don't expect others to think like you do, even if you believe that is the only way people should think. :-)