Talk:Privacy policy

From Meta, a Wikimedia project coordination wiki


Allow people contributing pictures to conceal camera make and model for privacy reasons[edit]

When I uploaded some photos taken with my smartphone, I didn't realise that the Wikimedia website would display all the EXIF metadata from the camera. Could you please add a privacy feature to the user account to hide camera make and model information for my contributions? Adrian816 (talk) 14:23, 27 February 2018 (UTC)

I think that's a technical question about removal of the EXIF metadata from your existing uploads, and about removing EXIF during your future uploads. Perhaps ask Commons:Village pump? --Gryllida 22:30, 27 February 2018 (UTC)
If your photo is displayed somewhere and you download it from there, the downloaded file has no EXIF metadata. The problem though is that without the metadata, there is no evidence that you're the photographer. Guido den Broeder (talk) 23:47, 1 March 2018 (UTC)
But metadata can be edited quite easily, so it is not evidence in the general case. --Kaganer (talk) 14:49, 10 April 2018 (UTC)
In my view, showing the metadata of a file on Commons improves the transparency and openness of the project. But many people, like Adrian, may not be aware of this. We could add a notice or banner on the Upload Form so that people are not left uninformed about this technical characteristic. ProtoplasmaKid (WM-MX) (talk) 20:17, 21 May 2018 (UTC)
ProtoplasmaKid, Good idea. Rutheni (talk) 10:15, 24 May 2018 (UTC)
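For background on what "stripping EXIF" means concretely: in a JPEG file, EXIF data lives in an APP1 marker segment, so removing it amounts to dropping that segment while copying the rest of the file unchanged. A minimal, standard-library-only Python sketch of the idea (the function name is illustrative; real tools such as exiftool handle many more cases, including multiple metadata segments and other file formats):

```python
def strip_exif_jpeg(data: bytes) -> bytes:
    """Return a copy of a JPEG byte string with APP1 (EXIF/XMP) segments removed.

    JPEG files are a sequence of marker segments: 0xFF, a marker byte,
    a 2-byte big-endian length (which includes the length bytes themselves),
    and the payload. EXIF metadata is carried in APP1 (marker 0xE1).
    """
    assert data[:2] == b"\xff\xd8", "not a JPEG (missing SOI marker)"
    out = bytearray(b"\xff\xd8")
    i = 2
    while i < len(data):
        if data[i] != 0xFF:
            break  # malformed stream; stop rather than guess
        marker = data[i + 1]
        if marker == 0xDA:
            # Start of Scan: entropy-coded image data follows, copy verbatim.
            out += data[i:]
            break
        length = int.from_bytes(data[i + 2:i + 4], "big")
        segment = data[i:i + 2 + length]
        if marker != 0xE1:
            # Keep every segment except APP1; pixel data is untouched.
            out += segment
        i += 2 + length
    return bytes(out)
```

Note that dropping every APP1 segment also discards XMP metadata, which shares that marker; a more careful tool would inspect the payload's signature before deciding.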


I suggest that Commons help me. —The preceding unsigned comment was added by 168.194.161.181 (talk) 16:27, 7 August 2018 (UTC)

Enforcement[edit]

Policy is always nice, but without enforcement it is meaningless. There have been multiple breaches of privacy of more than one individual, and by the very people who are supposed to be enforcing it. In some cases, volunteers entrusted with PII have even released this information intentionally. There is still no standard for vetting and training oversighters, nor any reporting mechanism for breaches. The NDAs signed by oversighters are meaningless, because they are volunteers, not employees or contractors. In spite of multiple failures of the system, there has been no public investigation or examination of the system itself, or even an acknowledgment of the problem. Perhaps it is time for an outside evaluation? —Neotarf (talk) 23:13, 24 May 2018 (UTC)

Hi Neotarf. Concerns about users with Oversight or Checkuser Permissions should be raised with the local Arbitration Committee or with the Ombuds Committee, and the Foundation will work closely with them to resolve any issues. TSebro (WMF) (talk) 04:59, 5 June 2018 (UTC)
So, TSebro (WMF), concerns about the arbitration committee or the ombuds committee should be taken to the arbitration committee or the ombuds committee, and everything will be just fine? The problem with that is that the WMF doesn't *know* whether everything will be just fine, because they aren't tracking it. It may be that instead of "working closely with the Foundation", the committee members will instead choose to block the user so they can't object, block their talk page so they cannot ping someone to unblock them, publish further dox about the individual, remove their email access so they cannot contact someone else to remove the dox, and send them bounce notifications if they try to email the committee directly. I would also note that any private information someone sends the arbitration committee will then end up being stored in the personal email accounts of the individual arbitrators, even after they are no longer participating in Wikipedia. And the arbcom mailing list has historically been very leaky. Also, if the user is in a region with heavy internet censorship or surveillance, or a repressive government, expecting them to send more and more emails to try to get something removed may actually be dangerous advice and bring additional unwanted attention. It would be better just to have arbitrators who didn't publish personal information in the first place. Neotarf (talk) 01:30, 15 June 2018 (UTC)

Additional privacy and safety issues[edit]

I hope this is not too far off topic, but I don't see a better place to post these concerns.

My first concern is with potential medical and child-safety situations and their reporting and follow-up. I was 17 years old the first time I had to handle a suicide attempt; within minutes I was able to get professional medical attention to the situation, and I was kept informed throughout the night. The individual survived and later sent me a thank-you note. This was in the context of a small non-profit organization with almost no resources. In contrast, in the situations I have been involved with in the Wikimedia organization, no one knows what to do, how to report, or how to tell whether the situation has been turned over to an appropriate professional. It is known that some people do carry a special phone at times, but they may not be fluent in English, or they may simply cancel calls without evaluating them. They may not have enough medical skill to make an evaluation, or they may have a personal friendship or history of conflict with the individual in question that makes it awkward to report to them, or to discuss any relevant medical history and have it taken seriously. In addition, the nature of Wikipedia's pseudo-court/police model of governance makes it less likely that any situation will make it out of the talk pages and get appropriate attention. Administrators and arbitrators are more likely to try to sanction anyone who reports something than to turn it over to someone who knows how to handle it. In the real world you do not want security handling these situations; they simply escalate and get out of hand. You need someone with management skills. In the Wikimedia context, the ad hoc solutions that individuals try to put together when faced with a situation are simply not working.

The other concern is with PII (Personally Identifying Information) and the refusal of volunteer oversighters and arbitrators to remove it. In December 2014, while serving as an arbitrator with checkuser privileges, user:AGK sent an email refusing to remove Personally Identifying Information, as defined by the Wikimedia Privacy Policy. This year he is a candidate in the 2018 arbitration committee election, and I have asked him about this on his talk page. His answer seems to indicate that arbitrators believe they have the authority to suspend the privacy policy retroactively for an individual by banning them, and that at that point they do not have to answer for their actions. To me this underscores the necessity of using paid staff who have signed a meaningful NDA for these positions. Trying to get something revdeleted or oversighted is very, very difficult under any circumstances unless you know someone personally. Once you have been named to an arbitration case, you automatically become high profile, whether you want to be or not, and it is almost certain you will be stalked on Wikipedia and beyond. This makes it even more urgent that any information, accidental or otherwise, be removed as quickly as possible. Even if you do know someone, and they speak fluent English, it can take dozens of emails to get them to understand and agree. The format of the policy page itself is not much help, since it does not include a TOC for navigation or paragraph numbering.

I don't want to just point out issues without suggesting solutions, so I am proposing a button for flagging potential safety situations. Every social-media site from YouTube to Facebook to Twitter has a button like this; it is extraordinary that Wikipedia does not. Since administrators and arbitrators are usually the first to be made aware of a situation, they should be trained on how to respond to dox and where to request a medical or child-safety evaluation. A self-paced test would take less than 10 minutes, and they could be asked to complete it when they first pass an RFA and every year thereafter, in order to stay current on the latest safety procedures. The proposal is on the community wishlist talk page; I'm sure it can be improved on. Pinging TSebro (WMF) and Mdennis (WMF). Also, Doc James might know where to refer any medical-related questions.

Thank you for your attention to this. —Neotarf (talk) 22:31, 29 November 2018 (UTC)

UI after ‘describe the edit you made’[edit]

Many people here bring up good ideas about hiding camera metadata, IP addresses, and so forth. My thought is that a UI window could be added after the 'describe the edit you made' window that would give the user privacy options. The purpose of this would be 1) to let new or uninformed users make their edits without too much effort up front, thereby not 'scaring them away', and 2) to offer transparent options to publish or delete, with an explanation of what publishing metadata or IP info (or other similar info) can mean for the user. Victorgrigas (talk) 13:53, 28 May 2018 (UTC)

Hi Victorgrigas. Thanks for the suggestion; we've passed it along to our technical teams for consideration. So that I understand your suggestion correctly: are you suggesting the creation of an extra dialogue box that explains the privacy implications of their submission? Some Wikipedias already have a warning for people editing while not logged in, encouraging them to create an account to hide their IP address. We can investigate what kind of privacy information users want to see. TSebro (WMF) (talk) 05:03, 5 June 2018 (UTC)
Yes, that's more or less what I'm suggesting: basically, add a way AFTER someone has pressed 'edit' and made changes, but BEFORE they press save and publish, to inform them about the privacy options and the implications of saving and publishing. Victorgrigas (talk) 14:00, 5 June 2018 (UTC)

Error in translation: tags T:109/T:110, English text is never translated on the local pages[edit]

Hi all, wrong tagging detected in the original English page around tags T:109/T:110: the text "More on Locally Stored Data" remains in English. Why?

Thank you

Viewing personal information[edit]

I feel the Privacy Policy exists to protect the privacy of our personal and business information. Given the lack of concern, and of properly run areas that provide privacy the way our privacy policies describe it, we need to implement the known solutions so that people can feel safer no matter where they are or go.

Protected edit request on 16 November 2018[edit]

Two elements from this page are untranslatable:

  • {{Hidden |header=Publicly Visible Information |content=<translate><!--T:94-->
  • {{Hidden |header=More on Usernames |content=<translate><!--T:250-->

Please add translate markup on both:

  • {{Hidden |header=<translate>Publicly Visible Information</translate> |content=<translate><!--T:94-->
  • {{Hidden |header=<translate>More on Usernames</translate> |content=<translate><!--T:250-->

Trizek (WMF) (talk) 09:41, 16 November 2018 (UTC)

Done and pushed to translation. @Trizek (WMF): thanks for the corrections  — billinghurst sDrewth 09:54, 16 November 2018 (UTC)
Thank you billinghurst! Trizek (WMF) (talk) 10:18, 16 November 2018 (UTC)

Edit request - several untranslatable phrases[edit]

I found two spots that are untranslatable. Could you fix them?

1.

{{Hidden
|header=More on Locally Stored Data
|content=

2.

{{Privacypolicy/Right
| content =
<!--T:30-->

is missing the "arrow text" argument.

Thank you. Tân (talk) 19:32, 15 January 2019 (UTC)

@Vinhtantran: Done, though I do have a favour to ask. It seems silly to me that we have repeated "back to top" labels to be translated; it would seem sensible to convert that to a translatable template that we insert and re-use. I suggest we create {{back to top}}.  — billinghurst sDrewth 21:23, 15 January 2019 (UTC)
Oh, it is already there. That is a further task to undertake, then, though I cannot do it now.  — billinghurst sDrewth 21:24, 15 January 2019 (UTC)