Wikimedia Foundation Annual Plan/2023-2024/Goals/Safety & Inclusion

Protect against growing external threats.

Defend our people and projects against disinformation and harmful government regulation. Work across the Movement to provide for the safety of volunteers.

Now more than ever, at a time of geopolitical and economic uncertainty, the world needs high-quality, well-sourced free knowledge. But the legal and policy environment in which the Wikimedia projects and communities have thrived for more than twenty years is in a period of rapid change. The communities who build the projects, and the content they create and share, face new challenges and threats.

As volunteers across the globe fill knowledge gaps and improve the quality and accuracy of information on the projects, everyone needs to feel safe, welcome, and respected. People need legal frameworks that allow them to contribute safely to the projects, and the Foundation should work to ensure that the range of content that can become free knowledge is as broad as possible.

To support this goal, the Wikimedia Foundation aims to protect the Wikimedia projects and communities from external threats, to advance the environment for contributing to free knowledge, and to support community self-governance and risk mitigation efforts.

As our communities face unprecedented challenges to legal and regulatory frameworks that make it possible for the projects to survive and thrive, the threat to free knowledge posed by mis- and disinformation is compounded by machine-learning tools able to generate and spread massive amounts of disinformation. Government interference and surveillance threaten volunteers' safety.

In the coming year, we will help strengthen local capacity to advocate for policies and laws that enable communities to thrive. We will support communities in adapting to changes in laws and regulations that affect the projects. We will collaborate with volunteers to track and counter mis- and disinformation. We will work to strengthen trust and safety processes. And we will support efforts to strengthen community self-governance.

Protect our projects from external threats

Laws and regulations

This year, we expect to see additional laws and regulations focused on hosts of online platforms. Because many new or proposed laws are not aimed at the Wikimedia Foundation or the communities, we can help regulators understand how the Wikimedia communities work so they can draft better laws that protect the communities and our values. Even so, several proposed laws take a one-size-fits-all approach designed with large commercial platforms in mind. Laws that do not take the Wikimedia model into account can have harmful consequences for our projects and the people who build them, regardless of whether the harms are intended.

We provide here a few examples of how we approach this challenge. Legal developments can occur unexpectedly throughout the year, so this list cannot be comprehensive.

  • We are planning to implement the proposed updates to the Terms of Use in the new fiscal year, to ensure that the Terms of Use continue to reflect our legal compliance obligations and help us to effectively protect the projects from bad actors.
  • We will be looking closely at the European Union's Digital Services Act (DSA), a significant piece of new legislation that changes how platform hosting is regulated and applies to nearly all websites worldwide, so long as they provide services to EU residents. In particular, we will be monitoring how new requests under the DSA are handled and reported, to help protect the communities from disruption or bad-faith takedown demands.
  • We will be closely monitoring US Supreme Court cases, including Gonzalez v. Google and NetChoice v. Paxton. Both cases have the potential to significantly change the legal rules for hosting user-generated content and could present risks not only to the Foundation but to individual Wikipedians. We will work to defend the projects if the cases give rise to new lawsuits seeking to remove content or harm users. We will also be looking at legislative options that may come forward in light of these cases.
  • Our Global Advocacy team will work in partnership with affiliates to educate lawmakers about the Wikimedia model, and about the potential impact of over-broad laws that fail to take our model into account. We will work with volunteers to build local capacity to advance a positive vision for the legal and regulatory frameworks our communities need in order to thrive.

Disinformation

As Wikimedia projects are increasingly regarded as trusted sources of knowledge across the world, some politicians and governments have made deliberate efforts to discredit Wikipedia through disinformation campaigns in mainstream and social media. The open nature of the projects themselves makes us vulnerable to information warfare between governments and political movements within and across borders. The 2020 Foundation-wide human rights impact assessment flagged project capture and manipulation as a serious human rights risk. Since then, the Human Rights Team has tracked targeted surveillance as well as physical threats against editors who work to correct disinformation. Meanwhile, government regulations that claim to combat disinformation often include measures that increase governments' surveillance and censorship powers.

In 2023–2024, we will collaborate across the Foundation and work with communities across the world to address the threat that disinformation poses to our mission, content, and people in a number of ways:

  • Strengthen investigations to identify disinformation on-wiki, in collaboration with volunteer functionaries and with support from research and partnerships.
  • Strengthen and expand moderator tools, including transparent and accountable use of machine learning in service of volunteers' efforts to identify and address disinformation.
  • Support efforts to increase the reliability of sources.
  • Implement technical improvements to strengthen security and privacy of volunteers on-wiki, protect against surveillance, and enhance the communities' ability to effectively govern themselves and address disinformation and human rights risks.
  • Strengthen risk assessment and human rights due diligence to prevent unintended consequences when implementing technical changes and innovations to editing systems and tooling.
  • Support and strengthen sharing of information, tactics, and resources among functionaries and other key self-governance stakeholders and affiliates working to counter disinformation in their communities.
  • Work with communities to educate policymakers about the Wikimedia model and help them understand what types of laws and public policies can protect and support volunteers' ability to fight disinformation.

Countering on-wiki misinformation (information that is inaccurate but not deliberately meant to mislead) is the core work of volunteers. Supporting volunteers' work in improving the accuracy of the wikis is a key priority in 2023–2024. In addition to Product & Technology work, several teams, including Trust & Safety, Partnerships, and Research, can support volunteers with resources to improve the reliability of sources and connect communities with experts who can help identify patterns of off-wiki misinformation about particular topics. Global Advocacy helps our communities advocate for laws that will protect volunteers' legal right to remove misinformation, and Legal Affairs is there to defend volunteers' rights, when necessary, in court.

The best defense against mis- and disinformation remains a strong, diverse, safe, well-supported community that is able to fill knowledge gaps. These gaps often concern historically marginalized groups or communities that are underrepresented in the Wikimedia Movement, which is why the environment for contributing to free knowledge is so important.

Advance the environment for contributing to free knowledge

As part of our legal and advocacy work, we not only defend against threats and problems but also look for ways to make the legal and regulatory environment of the Movement more favorable for safe and inclusive contribution. Examples of this work include:

  • Advocating for laws and regulations that protect the projects and the Movement's people, and which advance everyone's right to access and contribute to the projects. Such laws must include strong protections for freedom of expression and privacy in alignment with international human rights standards.
  • Looking for useful test cases to help clarify and expand the public domain and to help ensure that defamation and right-to-be-forgotten issues are clearly understood, so that users know what types of content are safe to contribute to the projects in different places around the world.
  • Implementing the CC BY-SA 4.0 license for text on the projects following the Terms of Use update. This will bring the project licenses up to the latest version of Creative Commons, helping to make Wikipedia safer and easier to reuse and opening it to new contributions internationally.
  • Supporting and coordinating the work of affiliates and allies to advocate for copyright reforms to ensure that the law protects and supports the use of free and open licenses.

Support community self-governance for impact and risk mitigation

Our collaboration with communities in guiding our projects constructively forward is arguably our greatest strength. The Wikimedia Movement is built on the power and agility of crowdsourcing, which allows us to achieve incredible impact: from the flourishing of content, to the rapid moderation and review of changes, to the creation and enforcement of policies. Our crowdsourced self-governance is a key mechanism for mitigating risks in our projects, allowing us to be far more nimble than a pure staff model could ever be, not only in keeping our sites legally compliant but also in handling complex review of Movement coordination challenges.

Committees composed largely, and often exclusively, of volunteers are one way we support self-governance. They bring diverse experience and perspectives and help ensure that key workflows reflect our international Movement. The success of what we describe as governance committees (excluding resource-allocation committees) is integral to key aspects of the Movement's success, yet the individuals sitting on these committees remain volunteers. Many have limited time available to support the work or limited training in the areas necessary to succeed, and some face personal threats when their work involves curtailing abuse of our systems. We are here to help them.

We are aligning our support of key governance committees to enhance their success through a range of services, from training to administrative support, tailored to the needs of each committee.

In all cases, our goal is to support these committees in fulfilling their functions as well as they can, sustaining our self-governance model while observing the degree of independence appropriate to each type of committee.

