Talk:Privacy policy/Call for input (2013)

The following discussion is closed. Please do not modify it. Subsequent comments should be made in a new section.

Translation request

We welcome comments in all languages, and we would like to ask anyone willing to help facilitate communication by translating comments (even in summary form). Thank you! --Maggie Dennis (WMF) (talk) 18:43, 19 June 2013 (UTC)[reply]

What info is collected about readers and editors

I would like to see the policy updated to include what specific information is collected about readers and also editors. I believe some of the information may be anonymized before others have access to it, so I would like to see that clarified as well. 64.40.54.139 05:33, 19 June 2013 (UTC)[reply]

Hi! That's a great suggestion. In the new privacy policy, we hope to clarify what kind of information is collected about our users and under what specific circumstances we can share that information (as well as in what form -- anonymized, aggregated, or as collected). Mpaulson (WMF) (talk) 21:53, 25 June 2013 (UTC)[reply]
Likewise, and for searchers, and how long is each category retained? I've been trying to follow the discussion about transitioning readership log retention from 1/1000 to 100%, without much luck. Are the questions about that (e.g., "For how long is the access log kept for readers and is there an access log for all readers or just a 1/1000 sample?") going to be answered? EllenCT (talk) 21:02, 19 June 2013 (UTC)[reply]
Hi EllenCT! How long we keep data is an important, and complicated, question and one that we are already discussing in depth. Different types of data are used in different ways by different teams within the organization. We want to make sure that we are not keeping data any longer than we have to, but we must also balance that concern with the flexibility needed to use that data productively to make the Wikimedia projects better for the community. My hope is that the new privacy policy will embody that balance and Wikimedia ideals while more detailed data retention guidelines and procedures (which we also hope to create during this period) will help clarify the specifics of how long information is retained and in what form. Mpaulson (WMF) (talk) 22:07, 25 June 2013 (UTC)[reply]

I've broken this out into a series of specific questions below. SJ talk  21:52, 20 June 2013 (UTC)[reply]

What is logged, for how long?

  • What logs are kept?
  • How long are logs kept?
  • Which logs are kept for all users, for all readers, for a random subset?

Also:

  • Which logs and data are available to developers? To anyone with a feed?
  • What sorts of secondary, composite or processed logs are produced? (e.g., wikistats)
  • What data is stripped out or anonymized from each (secondary) log?

I know that some of these questions have been mentioned elsewhere (above), but I wanted to collect them all in one place. Some of the above should be specified in the privacy policy. A standard of purging unneeded logs after 1-3 months, possibly after compressing them into anonymized bulk-statistics, should be part of the new policy. SJ talk  21:47, 20 June 2013 (UTC)[reply]
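
To make the "compress into anonymized bulk statistics, then purge" idea concrete, here is a minimal, hypothetical sketch (the CSV log format, file names, and the 90-day window are assumptions for illustration, not current WMF practice):

```python
import csv
from collections import Counter
from datetime import datetime, timedelta, timezone

RETENTION = timedelta(days=90)  # assumed 1-3 month window from the proposal above
now = datetime.now(timezone.utc)

daily_page_counts = Counter()
recent_rows = []

# Hypothetical raw log: one CSV row per request -> ISO 8601 timestamp (with offset), client IP, page title.
with open("raw_requests.csv", newline="") as f:
    for ts, ip, page in csv.reader(f):
        when = datetime.fromisoformat(ts)
        # Aggregate into anonymized bulk statistics (no IPs retained here).
        daily_page_counts[(when.date().isoformat(), page)] += 1
        # Keep the raw row only while it is inside the retention window.
        if now - when <= RETENTION:
            recent_rows.append((ts, ip, page))

# Publishable aggregate, stripped of personally identifying data.
with open("daily_page_counts.csv", "w", newline="") as f:
    csv.writer(f).writerows(
        (day, page, count) for (day, page), count in daily_page_counts.items()
    )

# Rewrite the raw log with only rows still inside the retention window;
# anything older now survives only in the aggregate above.
with open("raw_requests.csv", "w", newline="") as f:
    csv.writer(f).writerows(recent_rows)
```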

Thanks SJ. This list is very helpful for us. As I mentioned above to EllenCT, we hope the new privacy policy will retain the general Wikimedia philosophy of keeping data for as short a time as possible while allowing the data to be used for the purpose it was collected. But we do want to be more transparent about what is kept, for how long, in what form, and who it can be shared with. I think that would be best outlined in data retention guidelines separate from the new privacy policy -- partly because separate guidelines allow for greater specificity, partly because guidelines will allow us to keep such a specific list more up to date, since guidelines can be more frequently and easily updated than an organization's privacy policy, and partly to keep the new privacy policy at a reasonable length. We will be working very closely with Tech to help bring clarity on this subject, but we'd love to hear more from the community about what they would like to see in potential data retention guidelines. Mpaulson (WMF) (talk) 22:26, 25 June 2013 (UTC)[reply]

Specifics

One key thing is just to make sure everything is specific. Saying "a limited period of time" just doesn't cut it. Copyright lasts a limited period of time. Even in the year 384,572,250, the copyright on "Happy Birthday" will still be less than 20 years from expiring. Let's hope that's not also the case for today's Wikipedia records. Wnt (talk) 21:58, 19 June 2013 (UTC)[reply]

Hi Wnt! Thank you for your input. While we hope that the Wikimedia projects are still thriving in the year 384,572,250, our data retention practices will hopefully not mimic the unfortunate copyright on the "Happy Birthday" song. =) Mpaulson (WMF) (talk) 22:35, 25 June 2013 (UTC)[reply]

Borrado de las páginas de usuario

Aunque sea una propuesta totalmente contraria a la política general sobre edición de páginas, me parece deseable que las páginas de usuario (no su discusión) pudieran borrarse de forma absoluta y por el propio usuario sin necesidad siquiera de "llamar la atención" sobre ello pidiéndolo en los tablones públicos.

Muchos usuarios comienzan en Wikipedia aportando datos sensibles sobre sí mismos (lugar de residencia, ideas políticas o religiosas, identidad sexual, etc.) que desearían, mas tarde y por razones que nadie debería juzgar, que fuesen retiradas no sólo de la página visible, sino incluso de historiales que pudieran ser rastreados. Por supuesto, esto puede ser de inmensa importancia si, en algún momento, tales editores comenzasen a trabajar en temas que fuesen susceptibles de algún tipo de revancha en el mundo real (conflictos políticos, corrupción económica, criminalidad, etc), aunque con más frecuencia podría evitar conflictos en el mundo laboral, en el que hoy en día es cada vez más frecuente que las empresas rastreen el historial en Internet de sus candidatos. Un saludo. --Fremen (talk) 09:37, 19 June 2013 (UTC)[reply]


Translation: The Anonymouse (talk) 23:15, 20 June 2013 (UTC)[reply]

Deletion of user pages

Although it is completely against the general policy on editing pages, it seems desirable to me that user pages (not their talk pages) could be completely deleted by the user themselves, without even needing to "call attention" to it by requesting it on public noticeboards.

Many users start out on Wikipedia providing sensitive information about themselves (where they live, political or religious beliefs, sexual identity, etc.). Later, for reasons no one should judge, they may want that information removed not only from the visible page, but also from the page histories where it could still be traced. Of course, this can be immensely important if, at some point, such editors begin working on topics that could attract some kind of real-world revenge (political conflicts, economic corruption, crime, etc.), but more often it could simply avoid problems in the workplace, as it is now increasingly common for companies to search the internet history of their candidates. Best regards. --Fremen (talk) 09:37, 19 June 2013 (UTC)[reply]


If every user could delete their own user page, then many users would move their own talk page there and delete those discussions, and vandals would move other pages onto their user page in order to delete them there. If they moved one page after another there and deleted each one, the version histories of different pages would be merged together, and admins would have a big job getting those pages fixed again. It might not even be possible: if there were too many versions, stewards would have to do it, or it might be so difficult that nobody would fix those things at all. As long as a user page is treated the same way as any other page, and everyone can edit it, move it anywhere, or move other pages onto it, I strongly oppose this. There would first need to be a software change so that only bureaucrats and stewards (or any admin/sysop, which is also fine, since they can delete and restore pages anyway, and users often create articles on their user page which then have to be moved by admins to a user subpage) would be able to move user pages or to move other pages there. But in no way should every user (or any user who was not elected as an admin) be technically able to delete any pages they want.
I have also seen an admin move several of their own user pages that were no longer needed (but to which other users had also contributed) to one place in the user namespace (called "trash" or something like that) and delete them there one after another. I think versions were later restored that were a mix of different version histories, which was really confusing. It shouldn't be done that way. But if admins do that kind of thing, what will normal or new, inexperienced users, or even vandals, do with this possibility?
Any user can e-mail an admin to request deletion of their user page; they don't have to request this on-wiki.
And I hope you only mean the main user page and not user subpages as well (where private data has sometimes also been revealed), because articles are very often moved from the main article namespace to the creator's user namespace if they aren't ready, even though other users have already worked on the article in the main namespace. Such articles should never be deleted by normal users without admin rights; there should always be an admin who takes a look before deleting them. --Geitost diskusjon 21:06, 23 June 2013 (UTC)[reply]
Perhaps someone could implement a filter that stops users from moving pages to their user page? What is the benefit of moving an article to one's user page if it is outside the user namespace? --80.108.59.205 16:56, 25 June 2013 (UTC)[reply]
I don't think there's any benefit to that, except perhaps if someone has moved their user page onto a subpage and later wants to move it back with the version history. But that could also be solved if only admins had the right to move a page onto someone's (main) user page, since they can delete and restore pages. Then you would have to ask an admin to move the page there, which wouldn't be much of a problem.
I have also once seen a user move a new user's article onto that user's main user page because the article wasn't good enough. I then chose to move the article on to one of the user's subpages, where it was deleted a little later. Thinking about it now, I think it is good that this article wasn't deleted on the user's main user page, where the user never placed it himself, possibly to be mixed at some point in the future with other deleted user-page versions. So it's not a good idea for this possibility to go unblocked by the software (MediaWiki) itself; that's not a task for filters, but for the software. There's no need for filter logs; it should simply be forbidden for all normal users who aren't admins.
By the way, the fact that users can move their main user page onto another (existing or non-existing) user's main user page, perhaps because they want to rename themselves (which happens from time to time), would then also be blocked automatically. That would be very useful as well. The same should apply if a user wants to move a user's main talk page into another (existing or non-existing) user's namespace: users shouldn't be able to move pages onto a user's main page or main talk page either, and it makes no difference whether it is their own page or another person's. The latter kind of move, of a user's main talk page, should only be done by bureaucrats when renaming users, and after August that becomes a task for the stewards, although bureaucrats could still do it if there were any need for it after August. I've never seen any useful move of a page onto a user's main (talk) page except for bureaucrats renaming users. --Geitost diskusjon 00:55, 26 June 2013 (UTC)[reply]
Hi All. The ability to delete user pages is certainly a sensitive and controversial topic. While it does touch on privacy concerns, it strikes at the heart of a deeper question -- when should something be struck from the public records of a Wikimedia project? This question is one that belongs with the community. If there was, or ever is, a great call from the community for the ability to delete user pages, we would look into the feasibility of helping answer that call. However, such a change would need substantial community support to be considered. (Side note - time permitting, would someone be kind enough to translate this thread into Spanish for Fremen? Thank you!) Mpaulson (WMF) (talk) 22:49, 25 June 2013 (UTC)[reply]

Traducción: Lguipontes (talk) 17:27, 26 June 2013 (UTC) (eh, yo hablo portugués, disculpa si mi castellano de escuela brasileña es demasiado malo; tengo certeza que hay portuñol como las bananas de Brasil aquí, para dar y vender :3)[reply]

Re: Borrado de las páginas de usuario

Si cada usuario podríamos deletar nuestras páginas, entonces hubería muchos usuarios que mueven sus proprias conversas allí para deletar aquellas discusiones, o vándalos que mueven otras páginas en sus proprias para poder deletarlas. Si [estos vándalos] moviesen después de otra y deletasen todas allí, entonces los históricos de ediciones pudrían ser misturados y los administradores de Wikipédia terían que hacer un trabajo demasiado grande para fijarlas de nuevo. Quizás, eso no sería posible, si entonces allí hubieran muchas versiones y stewards (no sé como se llaman en español :S) terían que hacerlo. O podría ser que eso fuera tan difícil que no sería posible de ninguna manera, entonces nadie fijaria estas páginas. Con tal que una página de usuário es tratada en la misma manera que cualquier otra página y todos podríamos editarla o moverla en cualquer lugar, yo firmemente me opongo a eso. Entonces allí sería la primera necesidad para un canbio de software, teríamos que ter solamente burocratas y stewards (ok, cada admin es también ok, ya que ellos pueden deletar y restorar páginas de cualquer manera, y muchas veces usuarios crían artículos en sus proprias páginas de usuario, los cuales los admin tienen que mover a subpáginas) como capaces de mover páginas de usuario o de mover las paginas además allí. Pero de ninguna manera debería todo usuario (o cualquier usuario que no fuera elegido admin) ser tecnicamente capaz de deletar cualquier páginas quisiéramos.

También he visto un/una admin mover diferentes páginas de usuario de si mism@ que no le necesitaba más (pero donde también otros usuarios contribuyeron) a un lugar en las páginas individuales de usuario (llamado basura/papelera o algo del género) y deletaron ellas allí una después de otra. Creo que después hubieron versiones restoradas las quales había una mistura de diferentes históricos de versiones y aquello era notablemente confuso. En esta manera, nosotros no deberíamos tener eso. Pero si administradores hacen estos tipos de cosa, que harían los usuarios normales o principiantes o mismo vándalos con esa posibilidade?

Cualquier usuario puede se comunicar por e-mail con un admin para un pedido de deletar su página, ell@ no necesita hacer eso abiertamente a la visión de otros wikipedistas. Y tengo esperanza que usted solamente propone [en su sugestión] las páginas principales y no también las otras páginas "userspace" como las conversaciones (donde también hay informaciones privadas a seren reveladas), porque muchas veces artículos son movidos del espacio principal para o userspace del creador, si aún no están prontos, pero diferentes usuarios pueden tener ya contribuido para aquel artículo en la area principal de Wikipédia. Estos artículos no deberían jamás ser deletados por usuarios normales sin derechos de administración, debería siempre haber un admin que le mire antes de deletarlo. --Geitost diskusjon 21:06, 23 de junio de 2013 (UTC)

¿Probablemente alguien podría implementar un filtro que evita que los usuarios "muevan" páginas de la area principal de Wikipédia para sus respectivos userspaces? ¿Cuál es el beneficio de mover un artículo a la página de un usuario si esa está fuera del userspace? --80.108.59.205 16:56, 25 de junio de 2013 (UTC)
No creo que existiría algun beneficio con eso, con la excepción tal vez, de alguién ter movido su página de usuario a una subpágina y entonces después la mueve para el otro lugar con el histórico de versiones. Pero eso podría ser resolvido, con solamente algunos admins tengan el poder de mover una página en la página de usuario principal de alguién, porque ellos pueden deletar y restaurar las páginas. Por tanto uestes terían que preguntar un admin para mover las páginas allí, que no sería problemático.
También he visto una vez un usuario mover un artículo creado por un novato a página de usuario principal del mismo, porque el artículo no era bueno suficientemente. Entonces elegí mover o artículo más, para las subpáginas en el userspace, donde aquello fue poco después deletado. Pensando sobre eso ahora, creo que es bueno que el artículo no fue deletado en la página principal del usuario, donde el usuario nunca le puesto el mismo, y tal vez en algun tiempo en el futuro junto a otras versiones históricas de la página normal de dominio principal. De esta manera, no es una buena idea que esa posibilidad no sea bloqueada por el software (MediaWiki) por si mesmo, esta no es tarea para filtros, pero para el software. No hay necesidad de filtros de registro, a penas debería ser proibido para todos los usuarios normales que no los administradores.
También hay el facto de que usuarios pueden mover sus páginas de usuario principales en otras (no-)existentes páginas del mismo tipo, tal vez porque ellos puedan querer se renomear (una cosa que acontece de tiempos en tiempos), pode ser bloqueado automaticamente así, a propósito. Eso puede ser muy útil. Debe también ser el caso si un usuario pueda querer mover la conversación principal de un usuario en otra página de userspace (sea cual fuera; no importa, si es el caso de su propria página o de otra persona). La secunda opción con una página de conversación movendo debe ser hecha solamente por burocratas con renomeamento de usuarios y - después de agosto - sería una tarea para stewards, pero de cualquier manera, burocratas también podría hacerlo, caso haiga alguna necesidad para eso después de agosto. Nunca he visto alguna movida útil de una página para dentro de una página de conversación o usuario princial con la excepción de burocratas renomeando usuarios. Geitost diskusjon 00:55, 26 de junio de 2013 (UTC)

Hola a todos. La capacidad de eliminar las páginas de usuario es sin duda un tema delicado y controvertido. A pesar de que hace contacto en cuestiones de privacidad, golpea el corazón de una cuestión más profunda – cuando debe suprimirse algo de los registros públicos de un proyecto de Wikimedia? Esta pregunta es una que pertenece a la comunidad. Si hubiera o de cualquier manera hay un grande llamado de la comunidad para la habilidade de deletar páginas de usuario, podríamos mirar en la viabilidad de ayudar a responder a ese llamado. Sin embargo, un cambio de este tipo necesitaría un importante apoyo comunitario para ser considerado. (Nota al margen - si el tiempo lo permite, sería alguien cariñoso el suficiente para traducir este hilo en español para Fremen? Gracias!) Mpaulson (WMF) (conviersa) 22:49, 25 de junio de 2013 (UTC)

Location of draft

Hi. I assume privacy policy (or some other page on Meta-Wiki) will be used to draft a new privacy policy, right? It's important to support transparency in more than name only. Any proposed changes to the privacy policy should be drafted in public. --MZMcBride (talk) 13:32, 19 June 2013 (UTC)[reply]

Originally posted on Talk:Privacy policy, moved here for specific discussion. Jalexander (talk) 19:47, 19 June 2013 (UTC)[reply]
MZ, when we did the Terms of Use refresh, we did it at Terms of use on meta. I would presume that something similar will be done here. Philippe (WMF) (talk) 10:06, 23 June 2013 (UTC)[reply]
Philippe is correct. Once we receive input from the community at this stage, we will craft a new privacy policy draft with that input in mind and present it to the community for an extensive feedback and discussion period. Mpaulson (WMF) (talk) 22:31, 25 June 2013 (UTC)[reply]

Thanks for this

Michelle and the legal team:

Thanks for seeking out community input on how we can collectively improve the privacy policy. I hope the community is productive, helpful and courteous in giving suggestions and feedback.

One thing I'd suggest isn't so much on the privacy policy but in ensuring that the access to non-public data policy is enforced proactively. When I signed up for OTRS, I wasn't asked to submit my name under the non-public data policy: we need to make sure all new OTRS agents go through the process of identifying to the Foundation before getting access to OTRS.

I'd also be interested to know if there are any plans in privacy policy reform to set rules on chapter use of personal information: there's obviously the Toolserver to Tool Labs switchover, but how will the Foundation ensure that chapters comply not just with the law of the country they are operating in, but also a movement-wide set of privacy and personal information rules? —Tom Morris (talk) 19:47, 19 June 2013 (UTC)[reply]

Similarly, EU privacy law is considerably more protective in some regards than US privacy law. We should consider what the appropriate movement-wide rules are, and not simply adopt the minimum level of privacy mandated by the country hosting our servers. SJ talk  21:56, 20 June 2013 (UTC)[reply]
Tom, regarding OTRS - currently, the OTRS team is not required to identify. They are, however, required to be willing to identify on demand. A couple of years ago, we attempted (in a poorly timed effort) to change that policy, and met with near mutiny - many were not, in fact, prepared or willing to identify on demand. However, at that time, we decided to revert to the status quo, carry out a deliberate process of decision making (to include a comprehensive privacy policy and non-public data policy review), and reconsider. And here we are. Philippe (WMF) (talk) 10:02, 23 June 2013 (UTC)[reply]
Hi Tom! Thank you for the well wishes. We hope this consultation period inspires a lot of community input. Philippe has partially addressed your OTRS question, but I wanted to add to it. We will likely be editing the access to non-public data policy (as it does go hand-in-hand with the privacy policy) to provide greater clarity about who the policy applies to and what needs to be done if it applies to you. For those to whom identification applies, we are also hoping to simplify the identification procedure. We will similarly be asking for input and feedback from the community on the revisions to the access to non-public data policy and procedure as well. Ideally, a proposed draft of the updated access to non-public data policy will be ready for community feedback at the same time as the new draft of the privacy policy.
As to your chapters question -- the Foundation's privacy policy would not apply to chapters because chapters are technically separate entities and our privacy policy only applies to how we handle data. Regarding the potential for a movement-wide set of rules, I will pass your suggestion on to other members of the legal team who more directly handle chapter relations.
We really appreciate you taking time out of your day to give us your input on these subjects. Mpaulson (WMF) (talk) 23:31, 25 June 2013 (UTC)[reply]

Principles in responding to government requests for user data

I think the Foundation's privacy policy should strive to score well on the Electronic Frontier Foundation's "Who has got your back?" criteria. That is, a section on responding to government requests for user information should clearly lay out the Foundation's commitment to at least the following:

  • The Foundation should require a warrant supported by probable cause in order to disclose content of communications. (Because the Foundation's projects have so few means of communicating privately, this may have limited applicability.)
  • The Foundation should promise to tell users when the government seeks their data unless prohibited by law. Notice is the only way that users can attempt to mount a legal challenge to such a request.
  • The Foundation should publish statistics on how often they provide user data to the government. As we learned recently, the public cannot raise concerns about government requests for user data if they are not informed (at least in aggregate numbers) of the existence of such requests.
  • The Foundation should adopt and publish policies or guidelines explaining how it responds to data demands from the government, such as guides for law enforcement. The Foundation operates in many jurisdictions. Providing guidelines to law enforcement in their native languages, taking into account the legal systems operating in those places, can help guide law enforcement into making narrower, more reasonable requests, and sets expectations appropriately.
  • The Foundation should promise users that if, in its judgment, a government demand for access to user content is overbroad, then the Foundation will challenge that demand in court. The reality is that the average user does not have the know-how or resources to mount a court challenge to a request for their data. The Foundation is better situated to challenge overbroad requests.
  • The Foundation should adopt a policy that seeks the modernization of electronic privacy laws to defend users in the digital age. Specifically, as to the U.S., the Foundation should join the Digital Due Process Coalition. This might entail some advocacy in legislatures around the world, but the sum of all human knowledge being online helps a lot less if one's government leaves people terrified to use it. Brianwc (talk) 21:17, 19 June 2013 (UTC)[reply]
Hi Brianwc! Thank you for your suggestion! We are big fans of the EFF over here at the WMF legal department and are aware of the "Who has got your back" criteria. We will be taking these criteria into very serious consideration as we draft the new privacy policy. Mpaulson (WMF) (talk) 00:05, 26 June 2013 (UTC)[reply]
Surely national security letters from the US government make all of this moot while the Foundation is bound by secret US laws? Is the only way to run the sites ethically to move the Foundation to the EU? -- Jeandré, 2013-06-28t10:49z
Moving it to Latin America or Asia-Pacific would actually be much safer, according to that recent data leak. The United States is obviously interested in spying on Europe. For example, the worst thing Brazilians do online is creating 2 million webpages (not websites) and making 400,000 yearly content downloads related to neo-Nazism (as SOME people from Kurirama love to rant about how better-off and whiter they are, and people from the Southeast and Center-West blame the fact that they aren't like the "colonizing hillbillies" on minorities and conspiracies, because being stupid to a pathological level is what people in this country do best), which is a crime here but not there. We aren't terrorists; we waited through 21 years of government abuse before making a decent street demonstration; most people discussing things on the internet aren't concerned with serious business; we don't have a decent intelligentsia (or at least not one with global/important plans); and we speak "foreign", so the American government has nothing to do here with individual citizens or organizations, and the stealing of natural resources, the only concern their corporations may have about us, won't end so soon. Same story with about all other neutral Latin American countries. It would be even more politically correct, as most of our energy comes from hydroelectric generation (though it would consume more air conditioning, and I tell you, electricity bills here are expensive). Lguipontes (talk) 22:33, 28 June 2013 (UTC)[reply]

Wikimedia UK's Website Privacy Policy

Hi there -

Just thought it might be helpful to highlight the Wikimedia UK Website Privacy Policy that was recently approved by the Wikimedia UK Board of Trustees. Currently it sits alongside a Donor Privacy policy and Data Protection Policy and readers will note it is deliberately aimed at a broad audience by being very simple rather than technical or detailed.

I am hoping that the UK chapter will, later in the year, review the separate policies and bring them together alongside a complaints process which will allow the chapter to become a member of the UK Fundraising Standards Board. From my experience working with legal counsel to draft the Website Privacy Policy, I would say the two key lessons are:

  • Make the policy as simple as possible (to encourage people to read it, and in terms of our movement, to make translation easier)
  • Manage expectations around where the law requires policies to commit to certain things - when drafting ours I was very clear all along that I wanted wide input, but that ultimately certain elements were bound by EU and UK privacy laws.

Anyone who has questions more broadly or is interested on working on the UK's policy framework here please ping me on my talk page separately! :-) Katherine Bavage (WMUK) (talk) 09:16, 20 June 2013 (UTC)[reply]

Hi Katherine Bavage (WMUK)! Thank you for your helpful suggestions and congratulations on the recently approved WMUK privacy policy! One of our key goals in drafting a new privacy policy is to produce an easy-to-read document. We believe that everyone should be able to understand how their data is collected and used, so we are going to avoid legalese as much as possible. As to your second suggestion, I certainly agree. It is another one of our goals to be transparent throughout the privacy discussions with the community. This transparency includes being open about what we can and cannot do under applicable law. Mpaulson (WMF) (talk) 00:13, 26 June 2013 (UTC)[reply]

Dutch Law

As far as I know, the way Wikimedia handles IP addresses is at odds with Dutch law (itself inspired by a European directive). As I understand it, to store IP addresses you may need to be registered, and to publicly show them you need a) permission from the user in question and b) an explicit statement of your internal policy on this. And you cannot keep these addresses longer than six months? - Brya (talk) 05:02, 21 June 2013 (UTC)[reply]

Hi Brya! Thank you for your question. The Wikimedia Foundation, and its servers, are based in the United States, and therefore are subject to U.S. privacy laws. However, we are grateful to have a global community of Wikimedia users and are therefore sensitive to the privacy concerns of Wikimedia users everywhere. This is part of the reason we are asking for community input about what should and shouldn't be in our new privacy policy, so that we can take into consideration the priorities and concerns of users, no matter which country they reside in. Mpaulson (WMF) (talk) 00:28, 26 June 2013 (UTC)[reply]

Checkuser

Auf welcher Rechtsgrundlage werden CU durchgeführt? --Liesbeth Lasst (talk) 12:31, 21 June 2013 (UTC)[reply]
Google translate renders the question as: "On what legal basis are checkusers performed?"

The current Terms of Use allow CheckUser checks to be performed in order to maintain the security of the site, in Section 10, which you can read at https://wikimediafoundation.org/wiki/Terms_of_Use#10._Management_of_Websites. As for the legal basis, I'm not a lawyer, so I leave that to them. But there's the policy basis. Philippe (WMF) (talk) 09:58, 23 June 2013 (UTC)[reply]
Hi Liesbeth Lasst! Thank you for your question. Philippe is correct in pointing to the Terms of Use for the policy basis. Is that what you were referring to? If not, could you clarify what you mean by "legal basis"? Mpaulson (WMF) (talk) 00:30, 26 June 2013 (UTC)[reply]

Data retention term

Our current privacy policy does not clearly state the term of data retention. Generally, the Foundation collects minimal data such as IP address and browser information, but sometimes the Foundation collects significant private information through the identification required for appointment of checkusers, oversighters, and stewards. The data retention period is not clearly stated. (Most of the important things are unanswered :( ) I wonder how long the information will be retained, and when it will be destroyed. The Privacy Protection Law of South Korea states that processors of private information must notify users of the purpose of data collection, what is collected, and the data retention period. The law also states that private data which becomes unnecessary, or whose retention period has expired, is to be destroyed immediately. Maybe other countries have similar laws. I think the new privacy policy must include the period of data retention and the conditions for destruction of data. – Kwj2772 (msg) 02:27, 22 June 2013 (UTC)[reply]

The current process for identifying those functionaries does not include data retention. That is, we don't keep ANY of that data, once they've been added to the identification noticeboard. Whether or not you agree with that policy (and I personally happen to think it's flawed), it's the policy as I inherited it, and the General Counsel approved it. With that said, under the new privacy policy, we simply don't know what the retention term should be. What do you think it should be, Kwj? Philippe (WMF) (talk) 09:53, 23 June 2013 (UTC)[reply]
Hi Kwj2772! I'd like to refer you to a similar question and answer concerning data retention above. It explains what we think might help alleviate this lack of clarity in our general data retention practices. If you have additional ideas about what should be included in our future data retention guidelines, we would love to hear from you. As to the process for identifying particular functionaries, this answer might address your concerns. If we do retain functionary identification in the future, the length of retention will be explained in the data retention guidelines. Mpaulson (WMF) (talk) 00:44, 26 June 2013 (UTC)[reply]
Related: bugzilla:37626. --MZMcBride (talk) 15:22, 24 June 2013 (UTC)[reply]

Hi!

A simple thing, clearly indicate:

  • What data is collected?
  • How is it collected?
  • How long is it kept?
  • What cookies are placed?
  • What data do they affect?
  • How long are they active?

Best regards --Hamelin [ de Guettelet ]01:16, 1 July 2013 (UTC)[reply]

Is there a gravy train for this one?

Is this privacy policy enforced by the Ombudsman Commission, the same Ombudsman Commission where people get splashed with cash to fly on jet airliners all around the world to discuss what a great job they are (not) doing, while the job, which is not being done, is not getting done according to every person I have seen who has sent a complaint, and the generic responses of "email? what email? are there emails? what planet is this?" are the only visible work that the committee responsible for enforcement is doing?

The O.C. is responsible for investigating breaches of the privacy policy, and yet the response I got was that they can't find my bot or my emails. I do want to know that they are sucking up the cash from donations; I want to be comfortable in knowing that the privacy policy will have a purpose, even if it is turning donations into AVGAS and giving the hospitality industry a boost, like the (was it $10,000 or $14,000?) junket for 6 people to Hong Kong. I have questions. Penyulap (talk) 07:27, 22 June 2013 (UTC)[reply]

The Hong-Kong budget for the Affiliations Committee was US$40,000, and they plan to spend it on flying 9 people there. (Just sayin'.) odder (talk) 09:42, 22 June 2013 (UTC)[reply]
Setting aside the AffCom as not germane to this discussion (because it's not related to the privacy policy), regarding the ombudsmen commission - I authorized that meeting, and it was cleared through the General Counsel. Penyulap, I don't understand your objection - the main thrust of that meeting was to fix their processes and assure that cases were being handled in a timely and expeditious manner, which seems to be your main gripe with them. Philippe (WMF) (talk) 09:51, 23 June 2013 (UTC)[reply]
I agree that there are many Wikimedia boondoggles. I'm not sure a single meeting of the Ombudsman Commission qualifies (though it would be great if Philippe and others provided a public rationale explaining why wikis, IRC, Google Hangout, Skype, Etherpad, mailing lists, private e-mail, and every other technology on the Internet were insufficient).
I agree that there has been a recurring criticism that the Commission is slow and ineffective. This is probably outside the scope of this page, however, and should be discussed under a separate RFC. --MZMcBride (talk) 16:09, 23 June 2013 (UTC)[reply]
Slow? Are you sure? Have you heard rumours that they are actually doing something, slowly? As far as all the evidence I have ever seen shows (and I've researched a lot), they do precisely nothing whatsoever except spend money. So "nothing" would be a more appropriate word, unless you mean spending tens of thousands of dollars, where "slowly" wouldn't really be the word either. Thanks odder for correcting me; $40,000 is such an incomprehensible amount to so many people that my memory obviously underestimated it.
My question still hasn't been answered, so I repeat: will this policy be the excuse needed to spend incomprehensible amounts of donated funds to no visible effect? Penyulap (talk) 16:47, 5 July 2013 (UTC)[reply]

Generation of editor profiles

The call for input regarding the Wikimedia privacy policy happens to coincide with the migration of services from the Toolserver to Wikimedia Labs. The privacy policy of Wikimedia Labs just refers to the WMF privacy policy under consideration and does not include the extended privacy policy of the Toolserver. Let me quote from this policy:

Tools that allow profiling of individual user's activity (beyond what can easily be achieved directly on the public wiki sites) must only be applied with the respective user's consent (opt-in).

The relaxation of this point led to this RFC at Meta, where we find a proposal to enable more profiled statistics and graphs for X!'s edit counter, which before would have been in conflict with the Toolserver's policy. From the RFC:

This opt-in was set up because of a law in Germany, where the toolserver is located. Since we are migrating everything from the toolserver to Wikimedia's labs (in the U.S.), this law isn't relevant anymore.

I am very well aware that these detailed editor profiles can be generated from public data. But it still matters whether the WMF allows its own resources to be used for the generation of editor profiles, even if a considerable fraction of editors feels uncomfortable with that (see the votes in the RFC). It is also well known that external tools exist, like wikichecker (currently for selected projects only). But independent of the legal environment and what third parties can do, we should restrict ourselves within the WMF projects to what is in the best interest of the mission and the editors who contribute to achieving it. And with regard to privacy we should take the approach of avoiding editor surveillance as far as possible, i.e. transparency is important but editor profiling is not. Hence, I would like to see this opt-in model for generating profiling reports on editors enforced in the WMF privacy policy. In your call for input you refer to PRISM, and I would like to quote from the laudable stand of the WMF, section Why we care:

Freedom of speech and access to information are core Wikimedia values. These values can be compromised by surveillance: editors and readers understandably are less willing to write and inform themselves as honestly and freely. Put simply, "rights of privacy are necessary for intellectual freedom."

This is exactly the point: profiles of editors that are easily accessible and which reveal working periods do not serve the needs of contributing to WMF projects, but may have the effect that editors are less likely to contribute. --AFBorchert (talk) 11:22, 23 June 2013 (UTC)[reply]

  • I strongly support this statement. WMF should not offer software that allows the systematic and automatic generation of editor profiles without explicit consent of each person. --Martina Nolte (talk) 12:37, 23 June 2013 (UTC)/12:52, 23 June 2013 (UTC)[reply]
  • Support. And now we wait for the first comment with the accusation "Am deutschen Wesen soll die Welt genesen" ("the German spirit shall heal the world"), or for the inevitable "canvassing" thread. -- smial (talk) 13:08, 23 June 2013 (UTC)[reply]
  • +1. Indeed, using Wikimedia user data for generating profiles with tools that are not implemented in MediaWiki already should not be allowed. Opt-in solutions are a standard within the European Union.--Aschmidt (talk) 13:29, 23 June 2013 (UTC)[reply]
  • Related: bugzilla:48667. --MZMcBride (talk) 16:03, 23 June 2013 (UTC)[reply]
  • +1 Support --Geitost diskusjon 20:33, 23 June 2013 (UTC) As a precaution, I hereby object to the provision of aggregated user data about me as soon as it goes beyond the existing edit counters and creates profiles, unless I have explicitly consented to the publication of the aggregated data (opt-in). This constitutes a precautionary opt-out for such data and should accordingly be entered at any time into any such data collection that lacks an opt-in, should one ever exist.[reply]
    • The listing of total contributions, including and excluding deleted contributions, split by namespace (and likewise for log entries), should remain the only data overviews (besides the normal contribution lists) that are made freely available with WMF support for each individual user. But no daily overviews by time of day, weekly overviews by day, or monthly overviews without opt-in, and no overviews of the most-edited pages per namespace or the like without opt-in. --Geitost diskusjon 20:33, 23 June 2013 (UTC)[reply]
  • Support NNW (talk) 08:35, 24 June 2013 (UTC)[reply]
  • Support The rules at the Toolserver were there for a reason. Because the US is a developing country in terms of data privacy, we need at least a WMF policy about it. --DaB. (talk) 00:36, 26 June 2013 (UTC)[reply]
  • Support --Kellerkind (talk) 08:33, 26 June 2013 (UTC)[reply]
  • Support — Racconish Tk 21:20, 30 June 2013 (UTC)[reply]
  • Support Per Martina Nolte, Aschmidt and DaB., above. Keith Roth (talk) 12:14, 3 July 2013 (UTC)[reply]
  • Support The reasons have been stated above. --Peter Putzer (talk) 21:54, 20 July 2013 (UTC) [reply]
  • Comment: You do realize that it is literally impossible to prevent this kind of aggregation of editing statistics? This isn't just something the WMF or people with Toolserver database access can do. Anyone can aggregate these statistics about an individual account because the data is public, just like page histories are public. It's something that could be aggregated from the live site, the monthly XML dumps of the wikis, and other methods. If you don't want people knowing what you edit, then don't edit. There is no way to make your edit history private. Steven Walling (WMF) • talk 23:00, 20 July 2013 (UTC)[reply]
Just because someone else can do it, there is no need for Wikimedia to provide aggregated data. NNW (talk) 13:57, 21 July 2013 (UTC)[reply]
Steven, we know this very well. But it still makes a difference if these datamining tools are hosted on WMF resources and linked to from various pages (contributions page etc). In addition, it makes a difference according to EU legislation. --AFBorchert (talk) 13:25, 22 July 2013 (UTC)[reply]
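
One way the opt-in requirement discussed in this thread could work in practice is for a profiling tool to refuse to build a report unless the user has created an opt-in marker page. Here is a minimal sketch (the wiki URL and marker page name are placeholders; this is not how any existing tool is necessarily implemented):

```python
import requests

API = "https://en.wikipedia.org/w/api.php"   # example wiki
OPT_IN_SUBPAGE = "ProfilingOptIn"            # hypothetical marker subpage name

def has_opted_in(username: str) -> bool:
    """Return True only if the user has created the opt-in marker subpage."""
    resp = requests.get(API, params={
        "action": "query",
        "titles": f"User:{username}/{OPT_IN_SUBPAGE}",
        "format": "json",
    })
    resp.raise_for_status()
    pages = resp.json()["query"]["pages"]
    # The API reports a "missing" flag for pages that do not exist.
    return all("missing" not in page for page in pages.values())

def build_profile(username: str):
    if not has_opted_in(username):
        raise PermissionError(f"{username} has not opted in to profiling reports")
    # ... only now aggregate per-editor statistics ...
```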

"Self-CheckUser"[edit]

Tangentially related to the privacy policy, there may have been calls to follow the lead of other large Web services and provide an individual with his or her own stored private information: bugzilla:27242. --MZMcBride (talk) 16:11, 23 June 2013 (UTC)[reply]

Thank you for your suggestion, MZ! We will take that into consideration. Mpaulson (WMF) (talk) 00:50, 26 June 2013 (UTC)[reply]

Varia

Thanks for this call, an excellent initiative which addresses a real issue. At the same time, it would - in my opinion - defeat the purpose of this constructive move to take for granted some assumptions, particularly the ideas that the only applicable law is US law and that IP addresses do not relate to privacy concerns. I am not an expert in the legal aspects of transborder data flows, but my layman's understanding is that many national laws are intended to protect their residents in these matters. I also believe our community wishes to firmly establish high standards of privacy and not to benefit from the shelter of an American law that would not offer users the level of protection they are entitled to expect from the reasonable application of their national law. I understand this matter has been addressed in the United States by the Safe Harbor Policy and therefore suggest we could at least draw inspiration from its principles. Concerning IP addresses, I wish to point out two things: (1) a majority of European countries consider IP addresses private, either by law or by jurisprudence (see this map); (2) the Foundation itself, in its privacy policy, recognizes here that IP addresses are "personally identifiable information" which may be used to identify contributors in order to "protect the rights, property or safety of the Wikimedia Foundation, its users or the public". For these reasons, I think (a) the data retention policy should clarify what data is kept, for how long, and for what purpose (see (d) below); (b) the privacy policy should clarify how contributors can access data concerning them and correct it if need be; (c) an entity should be identified whom the user can contact in case of problems; and (d) the "catch-all" character of the notion of "safety" in the 6th point of the privacy policy should be worded in a more specific manner. Finally, I suggest (e) the right to vanish should be given to each user, with reasonable implications in terms of data retention. Cheers, — Racconish Tk 21:18, 30 June 2013 (UTC)[reply]

CheckUser policy

Hi,

There is something that should be cleared up on the CheckUser policy and, therefore, on the privacy policy.

CUs normally never reveal the IP used by an editor. However, it is unclear whether they may or may not reveal the link between an IP having made abusive edits and an account.

Wikis have different views on this: metawiki and stewards never reveal IPs; enwiki reveals the link implicitly (by choosing to block or not to block an IP involved in an RCU); frwiki reveals the link only with IPs that have been mentioned in an RCU; nlwiki and svwiki seem to work this way too (more or less)...

It would be great if the "Policy on Release of Data" could be updated so that CheckUsers would not have to make assumptions about the policy.

Elfix 07:24, 1 July 2013 (UTC)[reply]

Hi, You raise an important issue. I agree, and by the way, I am very much against this policy, whether it allows implicit or, a fortiori, explicit revealing of this link. I don't know how to do without it, but it would be much better in terms of privacy. Anyway, you are right that the CheckUser policy should be updated and made clear. Keith Roth (talk) 12:22, 3 July 2013 (UTC)[reply]
Hi Elfix and Keith Roth! We are actually going to be revising the Access to Non-Public Data policy as well, in hopes that it will provide a little more clarity about when non-public data should be accessed and under what limited circumstances it can be shared. The new version of the policy will be released for community feedback alongside the privacy policy, so I hope that both of you will review it when you get the chance and help us make it better and clearer for everyone who uses it. --Mpaulson (WMF) (talk) 00:13, 19 July 2013 (UTC)[reply]

Cookies lifetime

Per Privacy policy/en#Cookies the lifetime for the login cookie is 30 days: If one saves a user name or password in one's browser, that information will be saved for up to 30 days.

Since August 2011 the default value in MediaWiki has been 180 days; see mw:Manual:$wgCookieExpiration. The change was made with rev:94430: "Bumped $wgCookieExpiration to 180 days, we were one of the sites with shortest cookie lifetime for too long. This was discussed at lengths about a year ago on wikitech-l, however nobody cared to implement the suggested alternative solutions. Probably because they were overcomplicated and solved only parts of the problem."

I suggest to change the privacy policy accordingly to the MediaWiki default value of 180 days. Raymond (talk) 20:35, 3 July 2013 (UTC)[reply]

This year we noted the discrepancy in engineering and reset $wgCookieExpiration back to 30 days, and I've updated the MediaWiki manual page you linked to note this. You can always check these settings in the public CommonSettings.php. FWIW, I regularly hear users complain about how quickly they get logged out, and have to refer them to the privacy policy. Steven Walling (WMF) • talk 01:35, 4 July 2013 (UTC)[reply]

Some questions

Q1: As far as I know, we use SecurePoll in Board elections. I think voters would wonder what information is transmitted by the extension. Does the extension process the list of voters and the substance of the vote (i.e. "Support" or "Oppose") independently, so that no one could know how anyone voted? I hope the answer is "yes".

Unfortunately the answer to that is no. Votes are encrypted in the database (with no one but an outside party holding the decryption key during the vote, and a very limited set of people afterwards), but IF you have access both to the full database AND to the decryption key, you could link everything with some effort. You would need both, and could not do it accidentally, but you could do it with the current system. To be honest, we're not very happy with it (for many reasons), so I think looking at other systems is certainly a possibility. The reason for the linking, in the end, is that we need to know which votes to strike, and so the voter information (which is reviewed by the election committee to look for fraud or other concerns, and which does not link to any specific vote through the interface) does have some behind-the-scenes database connections to the separate table holding the encrypted votes. If we keep the requirement to be able to strike votes, I don't know of a reasonable way not to have that linking somewhere (which inevitably means a few specially trusted engineers have access), but others know that work much better than I do. Jalexander (talk) 00:53, 19 July 2013 (UTC)[reply]
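
For readers unfamiliar with the setup described above, the general idea of encrypting each ballot to a key held by an outside party looks roughly like this (an illustrative sketch using generic RSA-OAEP public-key encryption, not SecurePoll's actual code):

```python
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import padding, rsa

# The outside party generates the key pair and keeps the private half offline;
# the wiki only ever holds the public key.
private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
public_key = private_key.public_key()

oaep = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                    algorithm=hashes.SHA256(), label=None)

# What the votes table stores: ciphertext only.
ballot = b"Support candidate A; Oppose candidate B"
stored_ciphertext = public_key.encrypt(ballot, oaep)

# Striking a vote needs only the voter -> row link, never the plaintext.
# Reading the plaintext needs BOTH the database row AND the private key:
plaintext = private_key.decrypt(stored_ciphertext, oaep)
```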

Q2: The Foundation processes sensitive private information in the identification required of candidates for checkuser, oversighter, and steward roles. Does the Foundation have any plan to provide a PGP public key to encrypt the data? This would reduce the risk in case email servers are compromised, email service providers participate in the PRISM program, or email is unintentionally intercepted by someone else. Best regards. – Kwj2772 (msg) 19:29, 6 July 2013 (UTC)[reply]

PGP is currently available on request. I suppose we could expose that more clearly. We're also in the process of revamping how that whole process is handled, such that it will no longer involve email. Philippe (WMF) (talk) 01:03, 19 July 2013 (UTC)[reply]
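
If the PGP option were advertised more prominently, the candidate-side workflow could look something like this (a sketch using the python-gnupg wrapper; the key file name and recipient address are placeholders, not real WMF details):

```python
import gnupg

gpg = gnupg.GPG()

# Import the Foundation's published public key (placeholder file name).
with open("wmf-identification-pubkey.asc") as key_file:
    gpg.import_keys(key_file.read())

# Encrypt the identification document so only the private-key holder can read it.
with open("passport-scan.pdf", "rb") as doc:
    result = gpg.encrypt_file(
        doc,
        recipients=["identification@example.org"],  # placeholder recipient
        output="passport-scan.pdf.gpg",
        always_trust=True,  # the freshly imported key is not in our web of trust
    )

print(result.ok, result.status)
```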

Wording change to "User contributions"[edit]

I don't think this section of the policy clearly conveys that your page editing history is public information. Quoting:

User contributions are also aggregated and publicly available. User contributions are aggregated according to their registration and login status. Data on user contributions, such as the times at which users edited and the number of edits they have made, are publicly available via user contributions lists, and in aggregated forms published by other users.

The language makes it sound like only mundane data is collected. But the most salient fact is that there is a permanent, public record of what pages you edit. This record can reveal a focus of interests, implying likely facts about your politics, sexuality, passion for Justin Bieber, or other things you might not want in the public record.

In addition, the repeated use of the word 'aggregated' could cause people to misread and not realize contributions are also connected to individuals. This is especially relevant to users who choose to reveal their real-life identities: they need to understand it's not just a matter of having their name on their user page, but that everything they do on-wiki will be traceable to their identity.

Let's remember that things like "page history" and "contribution history" are abstract ideas that take some time for new users to conceptualize. I pick this stuff up pretty fast but I certainly made many edits before understanding what was happening.

Suggestion: in the first sentence change 'aggregated' to 'logged'. In the third sentence change it to "the times at which users edited a particular page". Fletcher (talk) 02:40, 12 July 2013 (UTC)[reply]
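
Fletcher's point is easy to verify: the full record of which pages an account has edited, and when, is retrievable by anyone through the public API. A minimal sketch ("ExampleUser" is a placeholder account name):

```python
from collections import Counter

import requests

API = "https://en.wikipedia.org/w/api.php"

resp = requests.get(API, params={
    "action": "query",
    "list": "usercontribs",
    "ucuser": "ExampleUser",       # placeholder account name
    "ucprop": "title|timestamp",
    "uclimit": "500",
    "format": "json",
})
resp.raise_for_status()
contribs = resp.json()["query"]["usercontribs"]

# A permanent, public record of what pages the account edits, and when.
most_edited = Counter(edit["title"] for edit in contribs)
active_hours = Counter(edit["timestamp"][11:13] for edit in contribs)  # UTC hour of day
print(most_edited.most_common(10))
print(sorted(active_hours.items()))
```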

Technical and legal coercion aspects

(It should be "Technical and oversight" but anyway...)

While we require policy about data held and its use, any privacy policy is useless if it can be overridden by legal coercion, or if it cannot protect our users from silent breach and tampering. Wikimedians in 200 countries worldwide do not generally consent to undocumented privacy breach, nor should they have to accept laws which are not theirs forcing it on them as a condition of WMF services. So our privacy discussion ought to consider what features allow secret privacy breach to occur, and what policies might be indicated as a result. We need policies that make the privacy we stipulate, not only achievable, but robust against systems and hardware tampering attempts, legal coercion, and legal mandatory silence. Otherwise privacy policy is meaningless.

Summary of legal/technical landscape: Recent events show that laws may exist that coerce (any) organization to 1/ secretly change or allow customizing of their hardware and software to allow certain kinds of privacy breach, with 2/ coerced mandatory silence over such changes. A simple solution would be a WMF policy of mandatory swearing under penalty of perjury, by all senior WMF staff, that no such activity has occurred or is occurring, but 3/ (as shown by other laws such as SOPA) it's quite possible for a law or court to rule that such staff must swear untruthfully and will have legal immunity if they do, so perjury law isn't likely to protect us. Also 4/ data centers overseas could be affected by coercive laws without US staff knowledge, and 5/ there is considerable interest in "hacking" as a way to gain information covertly (recent news indicates that both the Chinese and US governments are willing to hack into reputable private organizations; others may as well). Finally, 6/ the risk of silent interception (e.g. tapping telecoms cables) wouldn't be known and is beyond WMF's ability to prevent.

We state to people in 200+ countries that we honor their privacy. Our projects' best protection to meet that commitment is transparency. Ensuring that our systems cannot be modified, and that no scope for unofficial data extraction routes can be added, without triggering wide knowledge is our best defense. Any privacy policy we have is useless if technical or board staff can be silently forced to ignore it. So in any policy discussion, I hope we will have measures targeting these risks, so that it is almost impossible for systems to be physically (in hardware) or logically (in software) modified to allow privacy breach by anyone, under coercion or otherwise, without triggering widespread public knowledge.

I'm not technical enough to know what's "enough", but I can list the kinds of ideas I would like to see at a technical and policy level. As stated they are extreme, but it's hard to know what is needed. Perhaps while these are extreme someone else can use them to spark discussion of less extreme but equally effective policies:

  1. Enforced systems transparency - Some kind of systems approach whereby any modification to our running systems is impossible without causing a log entry or difference to be visible, and where it's designed so that "silent tampering" with the logging or change-management system would not be possible. Alternatively, some kind of hypervisor that runs on all systems, with an in-house module that monitors and integrity-checks any code running, and which cannot be silently tampered with or bypassed.
  2. Public eyeballs on core systems - Some kind of user group whereby anyone worldwide who wants to can monitor and assure themselves that the above is operating correctly. This would be radically transparent: effectively it means allowing anyone worldwide who wishes it to have total read-only oversight of core systems and running code (but not data pages or data files).
  3. Segregation of private data from everything else onto separate systems and servers - Redesign of code so that most systems do not handle actual private data - they handle tokens, or the like. The private data is then held separately and only stored and processed on a very few systems, which are hardened so that manipulation is difficult and private data can only leave them in a logged manner; remotely managed by WMF but with oversight of any remote management activity, and located/replicated in countries that do not allow silent legal coercion. Examples -
    • Perhaps core systems don't hold and cannot send out private data, and this can only be done by custom servers upon validation of a legitimate request by multiple replicated systems in two or more countries (or a consensus of replicated systems). This would mean that a "private data handling server" won't allow access to private data unless two or more servers in different countries validate (or agree on) the legitimacy of the request (it is hard to tamper with servers outside any given country), and any release is then irrevocably logged before being sent directly to the requestor.
    • Perhaps all Squids (WMF proxy servers), whose task is fairly simple and uniform, might be modified to strip out any private data and 'do something else with it' so that it cannot subsequently be leaked (see the sketch after this list).
  4. Tampering while offline or during change - What policy would mitigate tampering or physical privacy-breaching approaches at our datacenters? Should it be policy that any time a core system is taken down or modified (in any country), at least two technical volunteers from a different country must be physically present and perform any core code change, and that any changed systems code is integrity-checked against an "official" version of the code that is internationally available? Should technical staff in residence where a datacenter exists always include volunteers from an overseas country, some of whom also act as privacy representatives for an annual term and must be present for any hardware or systems changes or offline actions? (The rationale being that they are productively active in a technical capacity, but will not be bound to silence by that country's law once back overseas after their year.)
  5. Tamper-resistant hardware/firmware - What mitigation exists against bringing in tampered hardware, or hardware with tampered on-board modules, for example? Should WMF have a policy of only using TPM hardware for its systems, enforcing only the (known) approved firmware, OS and software? Can we mitigate the scope (under coercion or otherwise) for installing servers with modified on-board cards or firmware that bypass systems controls? (A logical, predictable, and not too difficult 'next step' in how a privacy breach might occur.)
  6. Move to enforced HTTPS (possibly with 'allow HTTP' as a user preference) - Almost no browsers, ISPs or systems today lack support for encrypted connections. Has the time come to enforce HTTPS as our sole protocol, or to have "allow HTTP" as a user preference that is disabled by default? That would be an immensely powerful statement and technically quite feasible; it's also the only protection we can offer our users against optical/data cable tapping.
    • (On the same point, does SSL technically allow us to notify users if SSL privacy is dubious, by checking the certificate chain on our end, since users won't otherwise know if MITM interception may be occurring? Or could we provide transparent SSL proxy servers as entry points across a number of countries, whose certificate chains are more directly verifiable?)
  7. Staff terms of service/resignation policy? - Assuming that tampering would involve the consent (however reluctant) of senior/board staff, should WMF adopt a policy that any member of staff should resign if their role would require them to implement such unauthorized systems? It makes it a bit harder to tamper with systems if anyone asked to assist would, by agreement, resign or request a role change/removal of authority.
  8. Privacy committee to focus on this as a living issue and improve approaches over time - Can we have an independent privacy committee, not drawn from one country or continent, in which persons in countries with known government coercion capacity do not have "the final say" (making it harder to silently coerce decisions), and whose remit is to proactively consider how to improve the tamper-resistant and privacy-breach-resistant nature of all WMF systems, and the transparent/verifiable nature of its core systems and software? (We have no commercial IP secrets, so enforcing systems transparency and tamper resistance at the heart is a good idea.)
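
As a rough illustration of the segregation idea in point 3 (and the Squid scrubbing sub-point referenced above), here is a minimal sketch of HMAC-based tokenization: a proxy replaces a reader's IP address with a stable, non-reversible token before anything is logged, so downstream systems can still count distinct visitors but cannot recover the address without a key held only on the hardened private-data servers. The key value, log format and helper names are assumptions for the sake of the example.

    import hmac
    import hashlib

    # Hypothetical secret held only on the hardened "private data" servers;
    # front-end caches and analytics systems would never see it.
    TOKEN_KEY = b"example-secret-not-a-real-key"

    def tokenize_ip(ip_address):
        """Replace an IP address with a stable, non-reversible token.
        Downstream logs can still count distinct visitors, but cannot
        recover the original address without the key."""
        return hmac.new(TOKEN_KEY, ip_address.encode(), hashlib.sha256).hexdigest()[:16]

    def scrub_log_line(line):
        """Toy proxy-side scrub: swap the leading IP field of a request
        log line for its token before the line leaves the machine."""
        ip, _, rest = line.partition(" ")
        return tokenize_ip(ip) + " " + rest

    # Example: two requests from the same address map to the same token,
    # so aggregate analysis still works while the raw IP is never written out.
    print(scrub_log_line("198.51.100.7 GET /wiki/Main_Page"))
    print(scrub_log_line("198.51.100.7 GET /wiki/Special:Random"))

Rotating the key periodically would further limit how long even the tokens remain linkable to one another.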

These "brainstormed" ideas are proposed in the knowledge that, while we cannot solve the problem absolutely, a discussion of privacy policy and recent revelations should take seriously the issues above. I hope others more technical will consider the attack vectors we face, and decide the key policies needed to ensure that WMF's systems are more irrevocably hardened, accountable and verifiable/transparent to worldwide users, so that privacy breach enabling changes are difficult to bring online (in any location) without inevitable widespread knowledge.

Last, if we are seen to put in the work to do this (and we communicate the project effectively in public media), perhaps it will have a wider effect. It may encourage change-oriented debate and be more persuasive that this is important and that action is possible. If our example prompts wider awareness and change, then it will bear fruit beyond just our sites and across the global web, in ways that would be unlikely via mere protest. FT2 (Talk | email) 03:38, 17 July 2013 (UTC)[reply]

Longer cookies, longer log-in times[edit]

I want to have cookies that last for more than 30 days, because I want to stay logged in for longer than 30 days. WhatamIdoing (talk) 22:26, 19 July 2013 (UTC)[reply]

Out of curiosity, why is the Wikimedia expiration date (30 days) different from the MediaWiki default (180 days)? --Michaeldsuarez (talk) 23:21, 21 July 2013 (UTC)[reply]
The answer to this question is available on this very page. odder (talk) 23:23, 21 July 2013 (UTC)[reply]
Oh, I see now. Thanks. --Michaeldsuarez (talk) 23:30, 21 July 2013 (UTC)[reply]

The above discussion is preserved as an archive. Please do not modify it. Subsequent comments should be made in a new section.

Delay for drafts[edit]

Thank you all, we really appreciate all of the input we received from you during the initial consultation period! We have been working hard to craft a new privacy policy draft that reflects your thoughtful feedback. Unfortunately, it is taking us a little longer than anticipated. We hope to present the Wikimedia community with a draft in early September, and we still plan on having a community feedback period of at least four months. Thank you for your patience. Jalexander (talk) 03:39, 1 August 2013 (UTC)[reply]

@Jalexander: - it's more than a little past September. Any comment? Wnt (talk) 20:35, 20 March 2014 (UTC)[reply]
@Wnt: - hmm, I'm not sure what you mean? We just finished a 5-month conversation at Privacy policy / Talk:Privacy policy (along with the other draft policies linked from there) with all project banners etc. in February (with over 500 editors participating). They aren't approved by the board yet, but the discussion was started in early September (September 5th) like I said above (and then extended from the original January end date because there was still discussion going on). Jalexander--WMF 01:02, 21 March 2014 (UTC)[reply]
Thanks - I haven't been on Meta much and I misunderstood the order of events. Wnt (talk) 10:52, 22 March 2014 (UTC)[reply]