IP Editing: Privacy Enhancement and Abuse Mitigation

From Meta, a Wikimedia project coordination wiki


Overview

In recent years, internet users have become more aware of the importance of understanding how their personal data is collected and used. Governments of several countries have enacted laws in an effort to better protect user privacy. The Wikimedia Foundation's legal and public policy teams continually monitor developments in laws around the world, considering how we can best protect user privacy, respect user expectations, and uphold the values of the Wikimedia movement. Against this backdrop, they have asked us to investigate and embark on a technical improvement to the projects. We need to do this together with you.

MediaWiki stores and publishes the IP addresses of unregistered contributors (as part of their signature, in page histories, and in logs), making them visible to anyone who visits our sites. Publishing these IP addresses compromises the safety and anonymity of these users. In some cases, it may even put people at risk of government persecution. Although we inform users that their IP address will be visible, few understand the ramifications of exposing this information. We are working to increase privacy protection for unregistered contributors by hiding their IP addresses when they contribute to the projects, just as the average reader cannot see the IP of a registered user. This will involve creating an automatically generated but human-readable "masked IP" username. We have different ideas about how best to implement this. You can comment to tell us what you need.

The Wikimedia projects have a very good reason to store and publish IP addresses: they play a critical role in keeping vandalism and harassment off our wikis. It is very important that patrollers, admins, and functionaries have tools that can identify and block vandals, sockpuppets, editors with conflicts of interest, and other bad actors. Working with you, we want to find ways to protect our users' privacy while keeping our anti-vandalism tools working as well as they do now. The most important part of this is developing new tools to support anti-vandalism work. Once that is done, we hope to work on protecting IP addresses on our wikis – including restricting the number of people who can see other users' IP addresses and reducing the amount of time IP addresses are stored in our databases and logs. It is important to note that a critical part of this work will be ensuring that our wikis still have access to the same – or a better – level of anti-vandalism tooling and are not at risk of facing increased abuse.

The Wikimedia Foundation's goal is to create a set of moderation tools that removes the need for everyone to have direct access to IP addresses. With this evolution of our moderation tools, we will be able to mask the IPs of unregistered accounts. We are very aware that this change will affect current moderation workflows, and we want to ensure that the new tools allow effective moderation, protect the projects against vandalism, and support community oversight.

We can only reach that decision point by working in partnership with CheckUsers, stewards, admins, and other vandal fighters.

This is a very challenging problem, with risks to our ability to protect our wikis if we get it wrong, which is why it has been postponed over the years. But in light of evolving data privacy standards on the internet, new laws, and changing user expectations, the Wikimedia Foundation believes now is the time to tackle this problem.

Updates

30 August 2021

Hello. This is a brief update about Portuguese Wikipedia’s metrics since they started requiring registration to edit. We have a comprehensive report on the Impact report page. This report includes metrics captured through data as well as a survey that was conducted among active Portuguese Wikipedia contributors. All in all, the report presents the change in a positive light. We have not seen any significant disruption over the time period these metrics have been captured. In light of this, we are now encouraged to run an experiment on two more projects to see if we observe similar impact. All projects are unique in their own ways and what holds true for Portuguese Wikipedia might not hold true for another project. We want to run a limited-time experiment on two projects where registration will be required in order to edit. We estimate that it will take approximately 8 months for us to collect enough data to see significant changes. After that time period, we will return to not requiring registration to edit while we analyse the data. Once the data is published, the community will be able to decide for themselves whether or not they want to continue to disallow unregistered editing on the project.

We are calling this the Login Required Experiment. You will find more detail as well as a timeline on that page. Please use that page and its talk page to discuss this further.


10 June 2021

Hello everyone. It has been a few months since our last update on this project. We have used that time to talk with many people – across the editing community and within the Foundation. We have carefully considered and weighed all the concerns raised in our discussions with experienced community members about the impact this will have on anti-vandalism efforts across our projects. We have also heard from a significant number of people who support this proposal as a step towards improving the privacy of unregistered editors and reducing the legal threat that exposing IPs to the world poses to our projects.

When we talked about this project in the past, we did not have a clear idea of the shape it would take. Our intention was to understand how IP addresses are useful to our communities. Since then, we have received a lot of feedback on this front from a series of conversations in different languages and in different communities. We are very grateful to all the community members who took the time to educate us about how moderation works on their wikis or in their specific cross-wiki environment.

Proposal for sharing IP addresses with those who need access

We now have a more concrete proposal for this project that we hope will allow most anti-vandalism work to happen unhindered, while restricting access to IP addresses from people who do not need to see them. I want to emphasize the word "proposal" because it is in no way, shape, or form a final verdict on what will happen. Our intention is to seek your feedback on this idea – What do you think will work? What do you think will not work? What other ideas could make this better? We developed these ideas during several discussions with experienced community members, and refined them in collaboration with our Legal department. Here is the outline:

  • CheckUsers, stewards, and admins should be able to see full IP addresses by opting in to a preference in which they agree not to share them with others who do not have access to this information.
  • Editors who participate in anti-vandalism activities, as vetted by the community, can be granted the right to see IP addresses to continue their work. This could be handled in a way similar to adminship on our projects. Community approval is important to ensure that only editors who genuinely need this access can obtain it. Editors will need to have an account that is at least a year old and has at least 500 edits.
  • All users with accounts over a year old and with at least 500 edits will be able to access partially unmasked IPs without permission. This means an IP address will appear with its tail octet(s) – the last part(s) – hidden. This will be accessible through a preference where they agree not to share it with others who do not have access to this information.
  • All other users will not be able to access IP addresses of unregistered users.
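To make the partial-unmasking idea above concrete, here is a minimal sketch of hiding an address's tail. The masking granularity (one IPv4 octet, four IPv6 groups) and the `x` placeholder are assumptions for illustration only; the proposal does not specify either.

```python
import ipaddress

def partially_mask(ip_string, v4_tail_octets=1, v6_tail_groups=4):
    """Hide the tail of an IP address, keeping the network prefix visible.

    How many octets/groups to hide is an assumption for illustration;
    the actual masking granularity has not been decided.
    """
    ip = ipaddress.ip_address(ip_string)
    if ip.version == 4:
        octets = ip_string.split(".")
        kept = octets[: 4 - v4_tail_octets]
        return ".".join(kept + ["x"] * v4_tail_octets)
    # IPv6: mask the last groups of the fully expanded address
    groups = ip.exploded.split(":")
    kept = groups[: 8 - v6_tail_groups]
    return ":".join(kept + ["x"] * v6_tail_groups)

print(partially_mask("203.0.113.42"))   # → 203.0.113.x
print(partially_mask("2001:db8::1"))    # → 2001:0db8:0000:0000:x:x:x:x
```

Note that the visible prefix still identifies the network (often an ISP or institution), which is what keeps range-level patrolling possible while hiding the individual host.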

IP address access will be logged so that due scrutiny can be performed if and when needed. This is similar to the log we maintain of CheckUser access to private data. This is how we hope to balance the need for privacy with the communities' need for access to information to fight spam, vandalism, and harassment. We want to give the information to those who need it, but we need a process, we need it to be opt-in so that only those with a genuine need will see it, and we need the accesses to be logged.

We would like to hear your thoughts about this proposed approach. Please give us your feedback on the talk page. What do you think will work? What do you think will not work? What other ideas could improve this?

Update on tool development

As you might already know, we are working on building some new tools, partly to soften the impact of IP Masking, but also simply to build better anti-vandalism tools for everyone. It is no secret that the current state of moderation tools on our projects does not give the communities what they deserve. There is a lot of room for improvement. We want to build tools that make the work of anti-vandalism fighters easier. We also want to lower the barrier to entry into these roles for non-technical contributors.

We have talked about ideas for these tools before, and I will provide a brief update on them below. Note that progress on these tools has been slow over the last few months, as our team has been working on overhauling SecurePoll to meet the needs of the upcoming WMF Board elections.

IP Info feature

Mockup of the IP Info feature

We are building a tool that will display important information about an IP address that is commonly sought in investigations. Typically, patrollers, admins, and CheckUsers rely on external websites to provide this information. We hope to make this process easier for them by integrating information from reliable IP-data providers into our sites. We recently built a prototype and conducted a round of user testing to validate our approach. We found that most editors in our interview set considered the tool useful and indicated they would like to use it in the future. There is an update on the project page that I would like to draw your attention to. Key questions on which we would like your feedback on the project talk page:

  • When investigating an IP, what kind of information do you look for? Which page are you most likely to be on when looking for this information?
  • What kinds of IP information do you find most useful?

Editor matching feature

This project has also been referred to as "Nearby editors" and "Sockpuppet detection" in earlier conversations. We are trying to find a suitable name for it that is understandable even to people who do not know the word "sockpuppetry". We are in the early stages of this project. Wikimedia Foundation Research has a project that could help us detect when two editors exhibit similar editing behaviors. This will help connect different unregistered editors when they edit under different auto-generated account usernames. We heard a lot of support for this project when we started talking about it a year ago. We have also heard about the risks of developing such a feature. We are planning to build a prototype in the near term and share it with the community. There is a bare-bones project page for this project. We hope to have an update for it soon. Your thoughts on this project are very welcome on the project talk page.

Data on Portuguese Wikipedia disabling IP edits

Portuguese Wikipedia banned unregistered editors from making edits to the project last year. Over the last few months, our team has been collecting data about the repercussions of this change on the overall health of the project. We have also talked to several community members about their experience. We are in the final stretch of compiling all the data to present an accurate picture of the state of the project. We hope to have an update on this in the near future.

Previous updates

30 October 2020

We have updated the FAQ with more questions that have been asked on the talk page. The Wikimedia Foundation's Legal department has added a statement, by request, to the talk page discussion, and we have added it here on the main page as well. On the talk page, we have tried to explain in broad strokes how we are thinking about giving vandal fighters access to the data they need without them having to be CheckUsers or admins.

15 October 2020

This page had become largely outdated, and we decided to rewrite parts of it to reflect where we are in the process. This is what it used to look like. We updated it with the latest information about the tools we are working on and our research, wrote up the motivation, and added some things to the FAQ. Especially relevant are probably our work on the IP Info feature, the new CheckUser tool that is now live on four wikis, and our research into the best way to handle IP identification: let us know what you need, what potential problems you see, and whether a combination of an IP and a cookie could be useful for your workflows.

Tools

As mentioned previously, our foremost goal is to provide better anti-vandalism tools for our communities, which will provide a better moderation experience for our vandal fighters while also working towards making the IP address string less valuable to them. Another important reason to do this is that IP addresses are hard to understand and are really only useful to tech-savvy users. This creates a barrier for new users without a technical background to enter functionary roles, as there is a steeper learning curve for them to work with IP addresses. We hope to get to a place where we have moderation tools that anyone can use without much prior knowledge.

The first thing we decided to focus on was to make the CheckUser tool more flexible, powerful and easy to use. It is an important tool that services the need to detect and block bad actors (especially long-term abusers) on a lot of our projects. The CheckUser tool was not very well maintained for many years and as a result it appeared quite dated and lacked necessary features.

We also anticipated an uptick in the number of users who opt-in to the role of becoming a CheckUser on our projects once IP Masking goes into effect. This reinforced the need for a better, easier CheckUser experience for our users. With that in mind, the Anti-Harassment Tools team spent the past year working on improving the CheckUser tool – making it much more efficient and user-friendly. This work has also taken into account a lot of outstanding feature requests by the community. We have continually consulted with CheckUsers and stewards over the course of this project and have tried our best to deliver on their expectations. The new feature is set to go live on all projects in October 2020.

The next feature that we are working on is IP Info. We decided on this project after a round of consultation on six wikis which helped us narrow down the use cases for IP addresses on our projects. It became apparent early on that there are some critical pieces of information that IP addresses provide which need to be made available for patrollers to be able to do their roles effectively. The goal for IP Info, thus, is to quickly and easily surface significant information about an IP address. IP addresses provide important information such as location, organization, possibility of being a Tor/VPN node, rDNS, and listed range, to mention a few examples. By being able to show this quickly and easily, without the need for external tools that not everyone can use, we hope to make it easier for patrollers to do their job. The information provided is high-level enough that we can show it without endangering the anonymous user. At the same time, it is enough information for patrollers to be able to make quality judgements about an IP address.
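Some of the signals IP Info would surface can be derived locally; others (geolocation, organization, proxy status) require external data providers. As a hedged sketch of the locally derivable part, assuming nothing about the real IP Info implementation:

```python
import ipaddress
import socket

def basic_ip_info(ip_string, resolve_rdns=True):
    """Collect a few locally derivable facts about an IP address.

    Geolocation, organization, and Tor/VPN status would come from
    external data providers; only address properties and reverse DNS
    are shown here.
    """
    ip = ipaddress.ip_address(ip_string)
    info = {
        "version": ip.version,
        "is_global": ip.is_global,    # publicly routable?
        "is_private": ip.is_private,  # RFC 1918 / reserved range?
    }
    if resolve_rdns:
        try:
            # Reverse DNS (PTR record) often hints at the ISP or institution
            info["rdns"] = socket.gethostbyaddr(ip_string)[0]
        except OSError:
            info["rdns"] = None       # no PTR record or lookup failed
    return info

print(basic_ip_info("192.0.2.1", resolve_rdns=False))
```

A patroller-facing tool would render these raw facts as at-a-glance conclusions rather than a dictionary, per the design findings later in this page.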

After IP Info we will be focusing on a finding similar editors feature. We’ll be using a machine learning model, built in collaboration with CheckUsers and trained on historical CheckUser data to compare user behavior and flag when two or more users appear to be behaving very similarly. The model will take into account which pages users are active on, their writing styles, editing times etc to make predictions about how similar two users are. We are doing our due diligence in making sure the model is as accurate as possible.
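As a toy illustration of the "how similar are two users" idea, the sketch below compares only edited-page overlap via cosine similarity. The real model also weighs writing styles and editing times, and is trained on historical CheckUser data; none of that is reflected here.

```python
import math
from collections import Counter

def similarity(edits_a, edits_b):
    """Cosine similarity between two editors' page-edit count vectors.

    A toy stand-in for the actual model: only which pages were edited,
    and how often, is compared.
    """
    a, b = Counter(edits_a), Counter(edits_b)
    dot = sum(a[page] * b[page] for page in set(a) & set(b))
    norm = (math.sqrt(sum(v * v for v in a.values()))
            * math.sqrt(sum(v * v for v in b.values())))
    return dot / norm if norm else 0.0

# Editors touching mostly the same pages score close to 1.0;
# disjoint editing histories score 0.0
print(similarity(["Cats", "Dogs", "Cats"], ["Cats", "Cats", "Dogs"]))
print(similarity(["Cats"], ["Physics"]))
```

Flagging would then mean surfacing pairs whose score exceeds some threshold for human review, not taking automatic action.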

Once it’s ready, there is a lot of scope for what such a model can do. As a first step we will be launching it to help CheckUsers detect socks easily without having to perform a lot of manual labor. In the future, we can think about how we can expose this tool to more people and apply it to detect malicious sockpuppeting rings and disinformation campaigns.

You can read more and leave comments on our project page for tools.

Motivation

We who are working on this are doing this because the legal and public policy teams advised us that we should evolve the projects’ handling of IP addresses in order to keep up with current privacy standards, laws, and user expectations. That’s really the main reason.

We also think there are other compelling reasons to work on this. If someone wants to help out and doesn't understand the ramifications of their IP address being publicly stored, their desire to make the world and the wiki a better place results in inadvertently sharing their personal data with the public. This is not a new discussion: we've had it for about as long as the Wikimedia wikis have been around. An IP address can be used to find out a user's geographical location, institution, and other personally identifiable information, depending on how the IP address was assigned and by whom. This can sometimes mean that an IP address can be used to pinpoint exactly who made an edit and from where, especially when the editor pool is small in a geographic area. Concerns around exposing IP addresses on our projects have been brought up repeatedly by our communities, and the Wikimedia movement as a whole has been talking about how to solve this problem for at least fifteen years. Here's a (non-exhaustive) list of some of the previous discussions that have happened around this topic.

We acknowledge that this is a thorny issue, with the potential for causing disruptions in workflows we greatly respect and really don’t want to disrupt. We would only undertake this work, and spend so much time and energy on it, for very good reason. These are important issues independently, and together they have inspired this project: there’s both our own need and desire to protect those who want to contribute to the wikis, and developments in the world we live in, and the online environment in which the projects exist.

Research

See also: IP masking impact report.
A Wikimedia Foundation-supported report on the impact that IP masking will have on our community.

IP masking impact

IP addresses are valuable as a semi-reliable partial identifier, which is not easily manipulated by their associated user. Depending on provider and device configuration, IP address information is not always accurate or precise, and deep technical knowledge and fluency are needed to make best use of it, though administrators are not currently required to demonstrate such fluency to have access. This technical information is used to support additional information (referred to as "behavioural knowledge") where possible, and the information taken from IP addresses significantly impacts the course of administrative action taken.

On the social side, the issue of whether to allow unregistered users to edit has been a subject of extensive debate. So far, the projects have erred on the side of allowing unregistered users to edit. The debate is generally framed around a desire to halt vandalism, versus preserving the ability for pseudo-anonymous editing and lowering the barrier to edit. There is a perception of bias against unregistered users because of their association with vandalism, which also appears as algorithmic bias in tools such as ORES. Additionally, there are major communication issues when trying to talk to unregistered users, largely due to the lack of notifications, and because there is no guarantee that the same person will be reading the messages sent to that IP talk page.

In terms of the potential impact of IP masking, it will significantly impact administrator workflows and may increase the burden on CheckUsers in the short term. If or when IP addresses are masked, we should expect our administrators' ability to manage vandalism to be greatly hindered. This can be mitigated by providing tools with equivalent or greater functionality, but we should expect a transitional period marked by reduced administrator efficacy. In order to provide proper tool support for our administrators’ work, we must be careful to preserve or provide alternatives to the following functions currently fulfilled by IP information:

  • Block efficacy and collateral estimation
  • Some way of surfacing similarities or patterns among unregistered users, such as geographic similarity, certain institutions (e.g. if edits are coming from a high school or university)
  • The ability to target specific groups of unregistered users, such as vandals jumping IPs within a specific range
  • Location or institution-specific actions (not necessarily blocks); for example, the ability to determine if edits are made from an open proxy, or public location like a school or public library.

Depending on how we handle temporary accounts or identifiers for unregistered users, we may be able to improve communication to unregistered users. Underlying discussions and concerns around unregistered editing, anonymous vandalism, and bias against unregistered users are unlikely to significantly change if we mask IPs, provided we maintain the ability to edit projects while logged out.

CheckUser workflow

We interviewed CheckUsers on multiple projects throughout our process for designing the new Special:Investigate tool. Based on interviews and walkthroughs of real-life cases, we broke down the general CheckUser workflow into five sections:

  • Triaging: assessing cases for feasibility and complexity.
  • Profiling: creating a pattern of behaviour which will identify the user behind multiple accounts.
  • Checking: examining IPs and useragents using the CheckUser tool.
  • Judgement: matching this technical information against the behavioural information established in the Profiling step, in order to make a final decision about what kind of administrative action to take.
  • Closing: reporting the outcome of the investigation on public and private platforms where necessary, and appropriately archiving information for future use.

We also worked with staff from Trust and Safety to get a sense for how the CheckUser tool factors into Wikimedia Foundation investigations and cases that are escalated to T&S.

The most common and obvious pain points all revolved around the CheckUser tool's unintuitive information presentation, and the need to open up every single link in a new tab. This caused massive confusion as tab proliferation quickly got out of hand. To make matters worse, the information that CheckUser surfaces is highly technical and not easy to understand at first glance, making the tabs difficult to track. All of our interviewees said that they resorted to separate software or physical pen and paper in order to keep track of information.

We also ran some basic analyses of English Wikipedia's Sockpuppet Investigations page to get some baseline metrics on how many cases they process, how many are rejected, and how many sockpuppets a given report contains.

Patroller use of IP addresses

Previous research on patrolling on our projects has generally focused on the workload or workflow of patrollers. Most recently, the Patrolling on Wikipedia study focuses on the workflows of patrollers and identifying potential threats to current anti-vandal practices. Older studies, such as the New Page Patrol survey and the Patroller work load study, focused on English Wikipedia. They also look solely at the workload of patrollers, and more specifically on how bot patrolling tools have affected patroller workloads.

Our study tried to recruit from five target wikis, which were

  • Japanese Wikipedia
  • Dutch Wikipedia
  • German Wikipedia
  • Chinese Wikipedia
  • English Wikiquote

They were selected for known attitudes towards IP edits, percentage of monthly edits made by IPs, and any other unique or unusual circumstances faced by IP editors (namely, use of the Pending Changes feature and widespread use of proxies). Participants were recruited via open calls on Village Pumps or the local equivalent. Where possible, we also posted on Wiki Embassy pages. Unfortunately, while we had interpretation support for the interviews themselves, we did not extend translation support to the messages, which may have accounted for low response rates. All interviews were conducted via Zoom, with a note-taker in attendance.

Supporting the findings from previous studies, we did not find a systematic or unified use of IP information. Additionally, this information was only sought out after a certain threshold of suspicion. Most further investigation of suspicious user activity begins with publicly available on-wiki information, such as checking previous local edits, Global Contributions, or looking for previous bans.

Precision and accuracy were less important qualities for IP information: upon seeing that one chosen IP information site returned three different results for the geographical location of the same IP address, one of our interviewees mentioned that precision in location was not as important as consistency. That is to say, so long as an IP address was consistently exposed as being from one country, it mattered less if it was correct or precise. This fits with our understanding of how IP address information is used: as a semi-unique piece of information associated with a single device or person, that is relatively hard to spoof for the average person. The accuracy or precision of the information attached to the user is less important than the fact that it is attached and difficult to change.

Our findings highlight a few key design aspects for the IP info tool:

  • Provide at-a-glance conclusions over raw data
  • Cover key aspects of IP information:
    • Geolocation (to a city or district level where possible)
    • Registered organization
    • Connection type (high-traffic, such as data center or mobile network versus low-traffic, such as residential broadband)
    • Proxy status as binary yes or no

As an ethical point, it will be important to be able to explain how any conclusions are reached, and the inaccuracy or imprecisions inherent in pulling IP information. While this was not a major concern for the patrollers we talked to, if we are to create a tool that will be used to provide justifications for administrative action, we should be careful to make it clear what the limitations of our tools are.

FAQ

Q: Will users with advanced permissions such as CheckUsers, Admins, Stewards still have access to IP addresses after this project is complete?

A: We don’t yet have a definitive answer to this question. Ideally, IP addresses should be exposed to as few people as possible (including WMF staff). We hope to restrict IP address exposure to only those users who need to see it.

Q: How would anti-vandalism tools work without IP addresses?

A: There are some potential ideas for achieving this goal. For one, instead of the IP, we may be able to surface other pertinent information about the user to functionaries that provides the same utility. In addition, it may be possible to automatically verify whether two separate user accounts link to the same IP, without exposing the IP – in cases of sockpuppet investigations. It's also possible that anti-vandalism tools will continue to use IP addresses, but with restricted access. We will need to work closely with the community to find the optimal solutions.

Q: If we don’t see IP addresses, what would we see instead when edits are made by unregistered users?

A: Instead of IP addresses, users will be able to see a unique, automatically-generated, human-readable username. This can look something like “Anonymous 12345”, for example.

Q: Will a new username be generated for every unregistered edit?

A: No. We intend to implement some method to make the generated usernames at least partially persistent, for example, by associating them with a cookie, the user’s IP address, or both.
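One conceivable way to make generated names partially persistent is to derive them deterministically from the cookie or IP. The sketch below is purely hypothetical: the hash scheme, name format, and fallback order are all assumptions, since (as the answer above says) the actual method has not been decided.

```python
import hashlib

def masked_username(ip, cookie_token=None, prefix="Anonymous"):
    """Derive a stable, human-readable placeholder name for an
    unregistered editor.

    Hypothetical scheme: prefer a per-browser cookie token, fall back
    to the IP; hash it so the name reveals neither.
    """
    basis = cookie_token if cookie_token is not None else ip
    digest = hashlib.sha256(basis.encode("utf-8")).hexdigest()
    # Map the hash to a short number so the name stays readable.
    # (A short suffix can collide; a real scheme would handle that.)
    number = int(digest[:8], 16) % 100000
    return f"{prefix} {number:05d}"

# Same IP (or cookie) yields the same name across edits
print(masked_username("203.0.113.42"))
print(masked_username("203.0.113.42"))
```

The key property is that repeat edits from the same person map to the same name, so talk-page messages and block logs stay coherent, without the name itself disclosing the IP.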

Q: Will you also be removing existing IP addresses from the wikis as part of this project?

A: We will not be hiding any existing IP addresses in history, logs or signatures for this project. It will only affect future edits made after this project has been launched.

Q: Is this project the result of a particular law being passed?

A: No. Data privacy standards are evolving in many countries and regions around the world, along with user expectations. We have always worked hard to protect user privacy, and we continue to learn from and apply best practices based on new standards and expectations. This project is the next step in our own evolution.

Q: What is the timeline on this project?

A: As mentioned above, we will not be making any firm decisions about this project until we have gathered input from the communities. We'd like to figure out sensible early steps that a development team could work on soon, so we can get started on what we think will be a long project, but we're not hurrying to meet a particular deadline.

Q: How do I get involved?

A: We would love to hear if you have ideas or feedback about the project! We would especially like to hear if you have any workflows or processes that might be impacted by this project. You can drop your thoughts on the talk page or fill out this form and we’ll reach out to you. Some of us will be at Wikimania and would love to meet you there as well.

Q: Why is this proposal so unclear?

A: It’s not really a proposal and shouldn’t have been described as such. We don’t have a solution, but are trying to work out the best solutions with the communities. It might be helpful to understand this as a technical investigation trying to figure out how IP masking could work.

Q: Why don’t you just turn off the ability to edit without registering an account?

A: Unregistered editing works differently across different Wikimedia wikis. For example, Swedish Wikipedia has discussed unregistered editing in the light of this investigation and decided they still want unregistered editing. Japanese Wikipedia has a far higher percentage of IP editing than English Wikipedia, but the revert rate of those edits is only about a third – 9.5% compared to 27.4% – indicating that they are also far more useful. We think that deciding for all wikis that they can’t have IP editing is a destructive solution. The research done on IP editing also indicates IP editing is important for editor recruitment.

Q: Who will have access to IPs of unregistered users now?

A: We are not going to leave this burden to, for example, the CheckUsers and the stewards alone. We will have a new user right, or the ability to opt in to see the IP if you fulfill certain requirements. Others could potentially see partial IP addresses. We are still talking to the communities about how this could best work.

Q: Has this been decided?

A: Yes. The Wikimedia Foundation’s Legal department has stated that this is necessary. As the entity legally responsible for protecting the privacy of Wikimedia users, the Wikimedia Foundation has accepted this advice and is now working to find the best way to implement this while supporting and listening to the user communities. Some Wikimedians will be unhappy with this, but legal decisions like these have not been a matter of community consensus. What the communities can be part of deciding is how we do this. That very much needs to be defined with the Wikimedia communities.

Q. Will masks be global, for all Wikimedia wikis, or local, for one wiki?

A: Global. A masked IP will look the same across all Wikimedia wikis.

Q: Will all unregistered users be unblocked when this happens? If not you could track the information in the logs.

A: No. This would wreak havoc on the wikis. This solution will have to be a compromise. We have to balance the privacy of our unregistered editors with our ability to protect the wikis.

Q: Will those who have access to the IPs of unregistered users be able to unmask more than one IP in one action?

A: Yes. We don't want this to be a tedious and time-consuming task when it is necessary. We will include this in our proposal.

Statement from the Wikimedia Foundation Legal department

July 2021

First of all, we’d like to thank everyone for participating in these discussions. We appreciate the attention to detail, the careful consideration, and the time that has gone into engaging in this conversation, raising questions and concerns, and suggesting ways that the introduction of masked IPs can be successful. Today, we’d like to explain in a bit more detail how this project came about and the risks that inspired this work, answer some of the questions that have been raised so far, and briefly talk about next steps.

Background

To explain how we arrived here, we’d like to briefly look backwards. Wikipedia and its sibling projects were built to last. Sharing the sum of all knowledge isn’t something that can be done in a year, or ten years, or any of our lifetimes. But while the mission of the communities and Foundation was created for the long term, the technical and governance structures that enable that mission were very much of the time they were designed. Many of these features have endured, and thrived, as the context in which they operate has changed. Over the last 20 years, a lot has evolved: the way societies use and relate to the internet, the regulations and policies that impact how online platforms run as well as the expectations that users have for how a website will handle their data.

In the past five years in particular, users and governments have become more and more concerned about online privacy and the collection, storage, handling, and sharing of personal data. In many ways, the projects were ahead of the rest of the internet: privacy and anonymity are key to users’ ability to share and consume free knowledge. The Foundation has long collected little information about users, not required an email address for registration, and recognized that IP addresses are personal data (see, for example, the 2014–2018 version of our Privacy policy). More recently, the conversation about privacy has begun to shift, inspiring new laws and best practices: the European Union’s General Data Protection Regulation, which went into effect in May 2018, has set the tone for a global dialogue about personal data and what rights individuals should have to understand and control its use. In the last few years, data protection laws around the world have been changing—look at the range of conversations, draft bills, and new laws in, for example, Brazil, India, Japan, or the United States.

Legal risks

The Foundation’s Privacy team is consistently monitoring this conversation, assessing our practices, and planning for the future. It is our job to look at the projects of today, and evaluate how we can help prepare them to operate within the legal and societal frameworks of tomorrow. A few years ago, as part of this work, we assessed that the current system of publishing IP addresses of non-logged-in contributors should change. We believe it creates risk to users whose information is published in this way. Many do not expect it—even with the notices explaining how attribution works on the projects, the Privacy team often hears from users who have made an edit and are surprised to see their IP address on the history page. Some of them are in locations where the projects are controversial, and they worry that the exposure of their IP address may allow their government to target them. The legal frameworks that we foresaw are in operation, and the publication of these IP addresses pose real risks to the projects and users today.

We’ve heard from several of you that you want to understand more deeply what the legal risks are that inspired this project, whether the Foundation is currently facing legal action, what consequences we think might result if we do not mask IP addresses, etc. (many of these questions have been collected in the expanded list at the end of this section). We’re sorry that we can’t provide more information, since we need to keep some details of the risks privileged. “Privileged” means that a lawyer must keep something confidential, because revealing it could cause harm to their client. That’s why privilege is rarely waived; it’s a formal concept in the legal systems of multiple countries, and it exists for very practical reasons—to protect the client. Here, waiving the privilege and revealing this information could harm the projects and the Foundation. Generally, the Legal Affairs team works to be as transparent as possible; however, an important part of our legal strategy is to approach each problem on a case by case basis. If we publicly discuss privileged information about what specific arguments might be made, or what risks we think are most likely to result in litigation, that could create a road map by which someone could seek to harm the projects and the communities.

That said, we have examined this risk from several angles, taking into account the legal and policy situation in various countries around the world, as well as concerns and oversight requests from users whose IP addresses have been published, and we concluded that IP addresses of non-logged-in users should no longer be publicly visible, largely because they can be associated with a single user or device, and therefore could be used to identify and locate non-logged-in users and link them with their on-wiki activity.

Despite these concerns, we also understood that IP addresses play a major part in the protection of the projects, allowing users to fight vandalism and abuse. We knew that this was a question we’d need to tackle holistically. That’s why a working group from different parts of the Wikimedia Foundation was assembled to examine this question and make a recommendation to senior leadership. When the decision was taken to proceed with IP masking, we all understood that we needed to do this with the communities—that only by taking your observations and ideas into account would we be able to successfully move through this transition.

I want to emphasize that even when IP addresses are masked and new tools are in place to support your anti-vandalism work, this project will not simply end. It’s going to be an iterative process—we will want feedback from you as to what works and what doesn’t, so that the new tools can be improved and adapted to fit your needs.

Questions

Over the past months, you’ve had questions, and often, we’ve been unable to provide the level of detail you’re hoping for in our answers, particularly around legal issues.

What specific legal risks are you worried about?

We cannot provide details about the individual legal risks that we are evaluating. We realize it’s frustrating to ask why and simply get, “that’s privileged” as an answer. And we’re sorry that we cannot provide more specifics, but as explained above, we do need to keep the details of our risk assessment, and the potential legal issues we see on the horizon, confidential, because providing those details could help someone figure out how to harm the projects, communities, and Foundation.

There are settled answers to some questions.

Is this project proceeding?

Yes, we are moving forward with finding and executing on the best way to hide IP addresses of non-logged-in contributors, while preserving the communities’ ability to protect the projects.

Can this change be rolled out differently by location?

No. We strive to protect the privacy of all users to the same standard; this will change across the Wikimedia projects.

If other information about non-logged-in contributors is revealed (such as location, or ISP), then it doesn’t matter if the IP address is also published, right?

That’s not quite the case. In the new system, the information we make available will be general information that is not linked to an individual person or device—for example, providing a city-level location, or noting that an edit was made by someone at a particular university. While this is still information about the user, it’s less specific and individual than an IP address. So even though we are making some information available in order to assist with abuse prevention, we are doing a better job of protecting the privacy of that specific contributor.
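The generalization step described above can be sketched as follows. This is a sketch under stated assumptions, not the planned implementation: the `GEO_DB` table, the `coarse_info` function, and the returned fields are stand-ins invented for this example, and a real deployment would query a GeoIP service rather than a hard-coded mapping. The point is that only coarse hints (city, network owner) leave the lookup, never the IP itself.

```python
import ipaddress

# Hypothetical stand-in for a GeoIP lookup: maps network ranges to
# coarse, non-identifying hints. Real data would come from a GeoIP
# database, not a hand-written table.
GEO_DB = {
    "203.0.113.0/24": {"city": "Stockholm", "org": "Example University"},
}

def coarse_info(ip: str) -> dict:
    """Return city-level / network-owner hints for an IP address.

    Deliberately returns only general information; the IP address
    itself is never included in the result.
    """
    addr = ipaddress.ip_address(ip)
    for net, info in GEO_DB.items():
        if addr in ipaddress.ip_network(net):
            return dict(info)
    return {"city": "unknown", "org": "unknown"}
```

Because the result never contains the address, two different IPs in the same range are indistinguishable to a reader, which is exactly the privacy property the answer describes.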

If we tell someone their IP address will be published, isn’t that enough?

No. As mentioned above, many people have been confused to see their IP address published. Additionally, even when someone does see the notice, the Foundation has legal responsibilities to properly handle their personal data. We have concluded that we should not publish the IP addresses of non-logged-in contributors because it falls short of current privacy best practices, and because of the risks it creates, including risks to those users.

How will masking impact CC-BY-SA attribution?

IP masking will not affect CC license attribution on Wikipedia. The 3.0 license for text on the Wikimedia projects already states that attribution should include “the name of the Original Author (or pseudonym, if applicable)” (see the license at section 4c), and use of an IP masking structure rather than an IP address functions equally well as a pseudonym. IP addresses already may vary or be assigned to different people over time, so using them as a proxy for unregistered editors is not different in quality from an IP masking structure, and both satisfy the license's pseudonym provision. In addition, section 7 of our Terms of use specifies that, as part of contributing to Wikipedia, editors agree that links to articles (which include article history) are a sufficient method of attribution.

And sometimes, we don’t know the answer to a question yet, because we’d like to work with you to find the solution.

What should the specific qualifications be for someone to apply for this new user right?

There will be an age limit; we have not made a definitive decision about the limit yet, but it’s likely they will need to be at least 16 years old. Additionally, they should be active, established community members in good standing. We’d like to work through what that means with you.

I see that the team preparing these changes is proposing to create a new userright for users to have access to the IP addresses behind a mask. Does Legal have an opinion on whether access to the full IP address associated with a particular username mask constitutes nonpublic personal information as defined by the Confidentiality agreement for nonpublic information, and will users seeking this new userright be required to sign the Access to nonpublic personal data policy or some version of it?
1. If yes, then will I as a checkuser be able to discuss relationships between registered accounts and their IP addresses with holders of this new userright, as I currently do with other signatories?
2. If no, then could someone try to explain why we are going to all this trouble for information that we don't consider nonpublic?
3. In either case, will a checkuser be permitted to disclose connections between registered accounts and unregistered username masks?

This is a great question. The answer is partially yes. First, yes, anyone who has access to the right will need to acknowledge in some way that they are accessing this information for the purposes of fighting vandalism and abuse on the projects. We are working on how this acknowledgement will be made; the process to gain access is likely to be something less complex than signing the access to non-public personal data agreement.

As to how this would impact CUs, right now, the access to non-public personal data policy allows users with access to non-public personal data to share that data with other users who are also able to view it. So a CU can share data with other CUs in order to carry out their work. Here, we are maintaining a distinction between logged-in and logged-out users, so a CU would not be able to share IP addresses of logged-in users with users who have this new right, because users with the new right would not have access to such information.

Presuming that the CU also opts in to see IP addresses of non-logged-in users, under the current scheme, that CU would be able to share IP address information demonstrating connections between logged-in users and masked non-logged-in users with other CUs who had also opted in. They could also indicate to users with the new right that they detected connections between logged-in and non-logged-in users. However, the CU could not directly share the IP addresses of the logged-in users with non-CU users who only have the new right.

Please let us know if this sounds unworkable. As mentioned above, we are figuring out the details, and want to get your feedback to make sure it works.

Next steps

Over the next few months, we will be rolling out more detailed plans and prototypes for the tools we are building or planning to build. We’ll want to get your feedback on these new tools that will help protect the projects. We’ll continue to try to answer your questions when we can, and seek your thoughts when we should arrive at the answer together. With your feedback, we can create a plan that will allow us to better protect non-logged-in editors’ personal data, while not sacrificing the protection of Wikimedia users or sites. We appreciate your ideas, your questions, and your engagement with this project.

October 2020

This statement from the Wikimedia Foundation Legal department was written on request for the talk page and comes from that context. For visibility, we wanted you to be able to read it here too.

Hello All. This is a note from the Legal Affairs team. First, we’d like to thank everyone for their thoughtful comments. Please understand that sometimes, as lawyers, we can’t publicly share all of the details of our thinking; but we read your comments and perspectives, and they’re very helpful for us in advising the Foundation.

On some occasions, we need to keep specifics of our work or our advice to the organization confidential, due to the rules of legal ethics and legal privilege that control how lawyers must handle information about the work they do. We realize that our inability to spell out precisely what we’re thinking and why we might or might not do something can be frustrating in some instances, including this one. Although we can’t always disclose the details, we can confirm that our overall goals are to do the best we can to protect the projects and the communities at the same time as we ensure that the Foundation follows applicable law.

Within the Legal Affairs team, the privacy group focuses on ensuring that the Foundation-hosted sites and our data collection and handling practices are in line with relevant law, with our own privacy-related policies, and with our privacy values. We believe that individual privacy for contributors and readers is necessary to enable the creation, sharing, and consumption of free knowledge worldwide. As part of that work, we look first at applicable law, further informed by a mosaic of user questions, concerns, and requests, public policy concerns, organizational policies, and industry best practices to help steer privacy-related work at the Foundation. We take these inputs, and we design a legal strategy for the Foundation that guides our approach to privacy and related issues. In this particular case, careful consideration of these factors has led us to this effort to mask IPs of non-logged-in editors from exposure to all visitors to the Wikimedia projects. We can’t spell out the precise details of our deliberations, or the internal discussions and analyses that lay behind this decision, for the reasons discussed above regarding legal ethics and privilege.

We want to emphasize that the specifics of how we do this are flexible; we are looking for the best way to achieve this goal in line with supporting community needs. There are several potential options on the table, and we want to make sure that we find the implementation in partnership with you. We realize that you may have more questions, and we want to be clear upfront that in this dialogue we may not be able to answer the ones that have legal aspects. Thank you to everyone who has taken the time to consider this work and provide your opinions, concerns, and ideas.