Bias will always remain a characteristic of Wikipedia; it is a perennial problem that arises because Wikipedia is the work of a large and diverse volunteer community, and anyone who wishes can edit it.
Some cases of bias stem from the points of view of individuals, which manifest in the content and in the way articles are written. In every project there will be topics that are covered in detail while other topics have little or no coverage. Where the problem is one of content and point of view, the usual Wikipedia processes (the ability to rewrite, to undo, to debate and develop consensus on talk pages, to seek resolution from noticeboards, and so on) make it a manageable issue. Where coverage is skewed, gaps can be pointed out and the community encouraged, through positive inducements, to remedy them.
Systemic biases, such as gender bias, notability bias, and ideological bias, are harder to resolve. They too result in skewed representation despite the community's diversity, in under-representation, and in wide gaps in knowledge on certain topics. More importantly, they may result in unfair and uneven application of policy for new-article deletion, differing standards of notability, hostility towards certain points of view, excessive rule-making, and even edit warring.
Biases arising from this nature of Wikipedia can be countered through greater awareness and by pointing out behaviour that is restrictive, abusive or prejudiced. Reasoned discourse and the encouragement of positive forms of social behaviour are useful. Positive discrimination in outreach events could also help bring under-represented communities into the movement.
Awareness of such bias, and informing and educating the Community through discussions both online and offline, is important, because bias often arises from ignorance. The Community needs to be provided with informational resources for this purpose, and WMF staff can play a role here.
However, ensuring a safe environment and providing safe spaces is very important. There must be designated safe places where respectful discourse can take place, where an unpopular opinion is not punished for being articulated, and where ad hominem attacks are not tolerated, not just in Wikimedia project spaces but also in social media forums on external platforms. With such places, honest debate can take place without anyone being harassed for their point of view, and greater understanding, or even consensus, could be reached on these issues.
The language environment is a major source of bias. While certain Wikimedia projects run fully in the native languages of their communities, important projects such as Wikimedia Commons, Wikidata, and even Wikimedia Meta-Wiki operate primarily in English. Discourse across the Wikimedia movement is dominated by English as the medium, and the internet itself is predominantly English-based. Making important standalone projects usable in languages other than English is the first step.
Resource bias, across communities, amongst editors within a community, and amongst the projects run by a community, is another important form of bias. It would be partly alleviated by the creation of hubs with dedicated budgets. However, a fresh look at how resources are distributed across communities and projects is necessary and should guide future WMF policy. The Board of Trustees has a role to play in this.
Research into aspects of systemic bias is important so that discussion and action can be grounded in facts rather than only in anecdotes. The Wikipedia Research Team can be useful here.
The biggest challenge involving bias, however, comes from communities dominated by ideological factions that conspire to destroy the fundamental principles of Open Knowledge in general and the pillars of Wikimedian culture in particular; that hijack content and editing processes; in which dissenting editors face online and real-world persecution; and where such suborning of Wikimedia projects has the sanction, encouragement and protection of governments. This requires considerable discussion and analysis, as any form of resolution requires the collective action of the global Community, that language's or project's community, and the WMF.
It is here that the Board of Trustees has an onerous responsibility, and it must institute an initiative to address the issue. If elected, I pledge to work towards the resolution of bias on Wikimedia projects as a priority.
11:21, 13 July 2021 (UTC)