
Privacy and security for everyone

Q 1 What is your Recommendation?

We should actively ensure and safeguard the privacy, security and confidence of all individuals who participate in the movement, or who interact with the content we provide to the world. We should provide privacy and security to readers, who may be affected by the content of our projects; to IP editors, who put themselves at risk by contributing; and to registered contributors, who can face complex challenges due to their participation.

  1. We should
    1. Introduce digital security plans and trainings throughout our movement, making access to them more equitably available through trainings in people’s languages and tailored to their circumstances off- and online (see also the Recommendation “Building an inclusive global community”).
    2. Strengthen our public policy and advocacy profile and competencies, and consistently articulate our voice(s) in public policy conversations concerning privacy and security to counteract efforts by governments and other external actors to put pressure on our movement and contributors.
    3. Introduce and adhere to industry best practices on suicide prevention to support vulnerable readers.
    4. Develop physical security and rapid response capabilities to protect contributors from pressure or harm by hostile third parties.
    5. Develop technological capabilities that can be put in the service of improving people’s privacy and security.
  2. We should invest strongly in partnerships with organisations and individuals who are able to support our efforts concerning privacy and security.
  3. We should strengthen our technology platforms to comply with international best practices around privacy, security, and data collection and storage. This includes, but is not limited to:
    1. Supporting anonymising technologies such as Tor and VPNs for users who require them.
    2. Anonymising IP addresses in publicly visible records to protect IP contributor privacy. The scope would cover all IP contributions going forward from the point of technical enactment. Access to raw IP information would be restricted to user rights groups who already have access to non-public information on the platform, requiring them to sign non-disclosure agreements with the platform provider. Tools ensuring the ongoing effectiveness of administrative and anti-vandalism work by community members would have to be available no later than the first activation of IP masking itself.
    3. Adopting industry best practices for data management, such as distributed storage and encryption for long-term storage (an illustrative sketch follows this list).
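As a purely illustrative sketch of the encryption-at-rest practice named in point 3 above (assuming Python and the third-party `cryptography` package; the function names and key handling are placeholders, not a proposed design):

```python
# Illustrative only: encrypt records before writing them to long-term storage.
# Assumes the third-party "cryptography" package; real key management (rotation,
# access control, hardware security modules) is out of scope here.
from cryptography.fernet import Fernet

def encrypt_record(record: bytes, key: bytes) -> bytes:
    """Encrypt a record before it is written to long-term storage."""
    return Fernet(key).encrypt(record)

def decrypt_record(token: bytes, key: bytes) -> bytes:
    """Decrypt a record read back from long-term storage."""
    return Fernet(key).decrypt(token)

if __name__ == "__main__":
    key = Fernet.generate_key()   # placeholder: would come from a key-management system
    stored = encrypt_record(b"retained log entry", key)
    assert decrypt_record(stored, key) == b"retained log entry"
```

How keys are stored, rotated, and audited would matter more in practice than the particular library choice.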

Q 2-1 What assumptions are you making about the future context that led you to make this Recommendation?

The annual Freedom on the Net report for 2018 detailed the eighth consecutive year of declining internet freedom around the world. The working group projects that these concerning trends will continue and that pressures on the platform provider, individuals and communities, and affiliates will continue to increase.
We currently do not have a systematic approach to supporting users at risk due to their participation in movement activities, and we lack the essential local capacity to support them, as demonstrated by recent events such as the political instability in Venezuela and Turkey.
We will continue to face state and state-sponsored censorship efforts impacting the movement, and we currently do not have publicly available material to support contributors who are affected by such measures.
There is a structural and growing gap between the growth of the world’s online population and the parts of it the Wikimedia movement succeeds in engaging. Current platform-industry best practices on security, privacy, and data handling are not implemented on the Wikimedia platforms, which puts users and contributors at unnecessary risk. Combined with the changing socio-political landscape the movement depends on, this requires additional infrastructure, capacity, and resources to support our vision of knowledge equity.

Q 3-1 What will change because of the Recommendation?

This set of recommendations will strengthen protections for contributors and users of the Wikimedia platforms, allowing them to engage with less risk and greater confidence.

The security plan and training section closes a key gap in the movement’s risk profile. Historically, the platform relied on a “we are good and no one can wish us any harm” approach to platform and individual digital security. It also opens the opportunity to build regional centers of digital security excellence through partnerships of affiliates with digital human rights organizations. The latter is a key complementary aspect to platform security: the platform provider is by default not best positioned to cater to all the social needs of a social movement projecting social good into the world through said platform.

The physical security and rapid response recommendation is closely aligned with the one on security plans and trainings. The biggest source of physical risk to people volunteering on a platform as open as Wikimedia’s flows from their online contributions, or from the desire of hostile third parties to pressure volunteers into changing content or to harm the platform as a whole.

The alignment with industry best practices on suicide prevention is a step to meaningfully extend the notion of the social movement to readers, both as participants in that role and as potential future active contributors. Wikimedia has been identified as a place where people seek information about how to commit suicide and how to find help when they feel suicidal or interact with others who feel suicidal (Wulczyn, Thain & Dixon, 2017). Wikimedia should not continue to provide less support to readers in considerable distress than other major platforms. Implementing such best practices also reduces the risk of the movement being subjected to content regulation by governments.
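As a hedged illustration only: one widely adopted industry practice is to surface crisis resources when a reader’s search touches suicide-related terms. The term list, helpline text, and `crisis_banner` function below are hypothetical placeholders, not an endorsed design.

```python
# Hypothetical sketch: show a crisis-resource banner alongside results for
# suicide-related searches. Terms and helpline text are placeholders; a real
# deployment would localise them and follow expert guidance.
CRISIS_TERMS = {"suicide", "kill myself", "self-harm"}
HELPLINE_TEXT = "Help is available. Please consider contacting a local crisis centre."

def crisis_banner(search_query: str) -> str | None:
    """Return a crisis-resource banner if the query matches a crisis term, else None."""
    query = search_query.lower()
    if any(term in query for term in CRISIS_TERMS):
        return HELPLINE_TEXT
    return None
```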

The public policy advocacy section acknowledges that governments and other external actors will continue to step up efforts to impose their views on the movement through pressure or laws targeting individual volunteers, affiliates, and the platform provider. If the movement is to serve the world through freely curated, freely licensed knowledge in an environment where privacy and security cannot be taken for granted to the extent we have historically relied on, we have to not just fundamentally overhaul the platform and spread digital security skills but also consistently articulate our voice(s) in public policy conversations.

The IP masking section improves protections for IP editors, who are both a considerable group of individuals and often the first stage of longer-term contribution through registered accounts. Implementation will also align the platform better with privacy best practices across the industry. The related overhaul of anti-vandalism tools will significantly improve the ability of communities to effectively govern and protect themselves in the face of long-term abusers, paid editors breaking the Terms of Use, and hostile governments. To broaden the talent pool of potential volunteers interested in filling such roles, these tools will have to be redesigned through an equity lens and better training provided to prepare people (see “Building an inclusive global community”).
These admin tools would allow admins to access information about the underlying IPs. A system like this can be implemented in various ways, and a capable engineering team would decide on the approach; as one simple example, IP addresses could be masked using a well-known hashing algorithm and mapped back to the raw address when necessary.
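A minimal sketch of that idea, under assumptions of our own: a keyed hash (HMAC) rather than a plain digest, because the IPv4 address space is small enough that unkeyed hashes can be brute-forced, plus a restricted lookup path for functionaries covered by non-disclosure agreements. All names, the in-memory table, and the permission check are illustrative, not a proposed design.

```python
# Illustrative sketch of IP masking with restricted unmasking; not a proposed design.
import hmac
import hashlib

SECRET_KEY = b"placeholder-key"   # real key would live in the platform's secrets store

def mask_ip(ip: str) -> str:
    """Return a stable public pseudonym for an IP address (keyed hash, truncated)."""
    digest = hmac.new(SECRET_KEY, ip.encode(), hashlib.sha256).hexdigest()
    return f"Masked-{digest[:12]}"

# Restricted mapping, accessible only to user groups already covered by NDAs.
_mask_to_ip: dict[str, str] = {}

def record_contribution(ip: str) -> str:
    """Store the raw IP in the restricted table and return the public mask."""
    mask = mask_ip(ip)
    _mask_to_ip[mask] = ip        # in production: access-controlled, audited storage
    return mask

def unmask(mask: str, has_nda_access: bool) -> str:
    """Reveal the raw IP only to authorised functionaries."""
    if not has_nda_access:
        raise PermissionError("raw IP access requires the appropriate user right")
    return _mask_to_ip[mask]
```

Because the pseudonym is stable, anti-vandalism tools could still group contributions coming from the same address without exposing the address itself; the restricted lookup would sit behind the existing framework for access to non-public information.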
The section’s recommendation on providing administrators with continued access to raw IPs strikes an appropriate balance between protecting IP editors and preserving the ability of local project communities to protect themselves from unhelpful contributions. To succeed, the recommendation is inherently dependent on making the admin tools, the selection process (see “Investing in equity-centered technologies”), and the self-care trainings for functionaries (see “Building an inclusive global community”) available to communities. These four parts need to be implemented together.

Q 3-2 Who specifically will be influenced by this recommendation?

  • WMF as platform provider
  • Admins and other functionaries: it will affect their workflows and authorities
  • Individual contributors - IP contributors will have greater anonymity, registered contributors will have better access to support
  • Affiliates - will play a bigger role in security issues, especially through local partnerships, providing training and advocacy
  • Readers - will be supported when facing challenging content.

Q 4-1 Could this Recommendation have a negative impact/change?

IP masking entails a significant change to the anti-vandalism practices currently observed by project communities; these practices and the supporting tools will have to be appropriately adapted and fully available on the day IP masking is technically enacted.

The building of digital security capacity, especially in regions with elevated risk profiles, will raise the movement’s profile among governments pursuing authoritarian policies to control their citizens.

While Wikimedia has taken several public steps in this direction since switching the platform to HTTPS, and has endured more state pressure in response, fostering alliances with digital human rights organizations adds to these risks.

It also creates a temporary challenge to navigate: the gap between announcing trainings and partnerships on the one hand, and, on the other, having effectively educated in digital security those community leaders who face elevated risks due to their Wikimedia contributions, so that they can better protect themselves.

Q 4-2 What could be done to mitigate this risk?

The movement will have to step up advocacy on public policy globally, invest in enhanced tools for functionaries to address behavior harming content and communities, and broaden the talent pool to include more, and more diverse, people.

Q 5 How does this Recommendation relate to the current structural reality?

It adds new capabilities and resources while addressing several of the biggest structural privacy and security risks that the platform and its position in a changing world pose to a thriving movement as Wikimedia embarks on its journey towards 2030.

Q 6-1 Does this Recommendation connect or depend on another of your Recommendations? If yes, how?

Yes, it is connected to “Building an inclusive global community”, “Joint set of rules”, and “Building equity-centered technologies”.

Q 7 How is this Recommendation connected to other WGs?

Technology Working Group, Advocacy, Resource Allocations, Partnerships, Diversity