
IP masking discussions at Wikimania 2019


Individual summary reports


Wikimedia Foundation staff working on the IP Editing: Privacy Enhancement and Abuse Mitigation project who attended Wikimania 2019 in Stockholm have summarized their discussions below.

My mission at Wikimania this year was to talk to as many people as I could about the enhanced privacy protection for IP editors project (aka IP masking). I got a wide array of insights from people working on different projects and languages, with different levels of involvement on the wikis. Captured below are some highlights and pervading themes from my conversations:

  • Most people I talked to expressed support for the project, including some who were surprised that this has not been implemented after so many years. Opinions varied on the extent to which users were okay with hiding IP addresses. While some were only okay with hiding them from unregistered users, others expressed interest in protecting them even further. Perhaps expectedly, this opinion varied based on where people were from and how familiar they were with the importance of online security.
  • There was a fear about how this would affect anti-vandalism efforts on various wikis, including detecting LTAs and sockpuppets. I learnt how some projects have specialized workflows for dealing with these and have publicly documented IP addresses going back many years. There was skepticism about how those workflows would work after IP masking. At the same time, I also heard general agreement that tools providing a similar level of information to what IPs offer today, without exposing the IPs themselves, would be acceptable. This could actually mean less work for users who currently have to perform a manual whois on IP addresses.
  • There was an enthusiastic need expressed for better tools. In particular, a need for improvements to CheckUser came up in several conversations.
  • I heard a need for training on tools - there is little documentation or training provided for tools like CheckUser, and people are expected to learn on the fly.
  • There was a request for more data and research on the impact of unregistered editors on our projects.
  • A more concrete concern was that the banners indicating to unregistered users that their IP addresses will be recorded are inconspicuous (small fonts, non-catchy colors) on many projects, and hence users are probably not aware that their IP address will be recorded. VisualEditor exacerbates this problem even more. This needs to be looked into more deeply.
  • Somewhat expectedly, there was a general distrust towards WMF for handling sensitive projects like this one. To the best of my ability, I reassured people that we will use extreme caution with this project and make sure we proceed in a consensus-driven manner on every major decision. Better tools are the foremost priority before anything else happens.
  • One interesting takeaway was that a lot of people were unwilling to engage on the Meta project talk page. The reasons varied from unfamiliarity with the community to the hostility they saw on that page. People expressed feeling much more comfortable openly expressing their opinions in their local community discussions. While this makes complete sense, it opens the challenge of hosting discussions in several different places at once and consolidating all opinions raised in a coherent and consensus-based manner.


At Wikimania in Stockholm in 2019, I had a very large number of conversations about IP editing. We get input from various sources – the Meta talk page, discussions on various Wikimedia wikis and so on – and in the name of transparency I wanted to write a public report of my impressions from Wikimania. This was a topic brought up again and again over the course of five days in a very unsystematic way, so forgive me if this is a bit jumbled.

In general, a good number of editors – with a heavy bias towards stewards and checkusers – were concerned. Many don’t trust they’ve actually got any real opportunity to change what’s happening, to change the plan. While there’s certainly a strong push and pressure on the team to come up with a good solution that includes hiding the IP addresses of unregistered users, we do not yet have a plan for how to proceed. The lack of replies to “how do you plan to handle …” is because we wanted to shape this plan together with the communities, knowing full well that our detailed understanding is limited to the few wikis where we have volunteer experience of patrolling, handling harassment, anti-vandalism workflows and so on.

Additionally, there’s a lack of confidence in us building the tools we’re saying we’ll be building. There are tools that have been suggested for a long time to make the lives of e.g. checkusers easier; this (lack of) trust won’t be repaired until the tools are built and working and in the hands of the communities that need them. There are no shortcuts here. Until the tools are fully functional, there’ll be some suspicion that the development of them will be abandoned.

I had three separate conversations specifically about the public's ability to monitor how e.g. government bodies edit Wikipedia, if unregistered. I also had a couple of people specifically comment on this as a red herring – not what the system was built for, not a core part of our workflows, not something we need the public to know in order to protect the wikis as long as we have proper tools to handle spam, COI, vandalism and harassment – and reasonably insignificant compared to protecting the privacy of unregistered users. Some suggested solutions, such as giving the currently working Twitter bots access to specific IPs of public institutions, or a tag system which could work together with tags showing if something is a school, library, university etc. to avoid indefinitely blocking public institutions.

There seemed to be a certain cultural split in some areas – for example, people from continental Europe seemed more likely to be concerned about the privacy of IP users, while editors from North America were more likely to be concerned about the potential harm to the anti-vandalism/harassment/spam workflows. People outside of the movement were generally baffled that we show the IP addresses of users who are trying to help by editing, sometimes editing controversial subjects. I spent some time explaining how this has become part of the anti-spam and anti-vandalism work and can’t just be hidden tomorrow without creating problems for the wikis and the people who spend their time protecting them.

Some were concerned about the dominance of English discourse on this, and wanted us to seek out more opportunities for other languages to have their voices heard. This is something we need to do anyway: we need to learn the different workflows different communities are using, and make sure we take them all into account when we build new tools for patrollers/admins/functionaries so that we don’t leave some communities with insufficient tools to handle harmful edits.

While I was at Wikimania 2019 in Stockholm, I spoke with attendees about the IP Editing: Privacy Enhancement and Abuse Mitigation project, commonly called IP masking. This page is a summary of my conversations. Thank you to everyone who shared your opinions and ideas.

The majority of people that I spoke with were long-time contributors to Wikimedia projects and held advanced permissions such as checkuser and steward. I also spoke with people who are administrators on local wikis, people who edit abuse filters, current and former Arbitration Committee members, global sysops, and users who manage vandalism and spam on local wikis. I also talked with some people who were interested in privacy, safety, and security for users who might be especially vulnerable because of their geographic location, political, ethnic, or religious affiliation, or other personal characteristics such as sexual orientation or identity.

Themes:

  • Strongly expressed opinions that limiting who has access to the IP addresses of unregistered users will be very disruptive to the workflows of local volunteers who are the first to find spam, vandalism, and harassment. This applies to registered users with and without administrator permissions.
  • Mixed opinions about whether a person editing without an account, and thereby exposing an IP address, is giving consent to expose that IP address and therefore raises no privacy concerns.
    • A common opinion expressed is that unregistered users are giving up their right to privacy since they are knowingly exposing their IP address when they make an edit.
    • A common opinion expressed is that many unregistered users do not understand that exposing their IP address could reveal private information.
    • Other people expressed the view that people accidentally expose their IP address and unintentionally reveal private information.
    • Other people also say that people can later change their mind for a good reason, and there is no way to decouple edits from an IP address.
  • Deep concern that the Wikimedia Foundation will not follow through with development of a new set of tools or improvements to existing tools, and that users will be hampered in their ability to manage vandalism, spam, and harassment.
  • Opinion that there needs to be more research about the problems that might or might not happen if IP addresses are masked from users without advanced permissions. Concern that the WMF does not understand the scope of the problem.
    • How many blocks depend on knowing an IP address?
    • How many and what types of tools and bots depend on IP addresses?
  • Concerns about a lack of regard for privacy, safety, and security from users commenting in the discussion because they live in places that are safe or are not personally vulnerable to threats.
  • Concern that the primary location of the discussion is on Meta and in English.
  • Specific ideas about ways to improve anti-abuse tools and mitigate the loss of access to the IP addresses of unregistered users.
    • Tags to identify edits made from a school or library.
    • Create a “session of editing for unregistered users” where all edits from one session are logged as coming from the same user.
    • Sockpuppet detection tools for users who regularly counter abuse on wiki.
    • More bots and automatic detection tools to replace manual work of users.