
H10: Image-based problems

In some cases, users can be the victims of harassment involving images. This can come in many forms, but all have the potential to be upsetting or intimidating. Some examples of how images might be used to harass a user include:

  • Photographing an attendee of a Wikimedia event without their consent and posting the resulting photographs to Wikimedia Commons or a sister project;
  • Sending images of pornography, violence, racism, or other shocking content to a user;
  • Editing a user's photograph, for example to combine it with a shocking or controversial image;
  • Using a user's image to illustrate an article with negative connotations (for example, replacing the lead image of the "pedophilia" article with the user's picture).

How and when to get an image deleted

Getting an image deleted can be a complicated process. The most important question to consider is whether the image actually breaks any rules. Sometimes the problem is not the image itself but an attacker using it maliciously. Commons hosts some images that are shocking or that might cause distress to users; this by itself is not normally a reason to have them deleted.

A common form of image-based harassment is the posting of images taken of volunteers without their consent. At events, precautions are usually taken to accommodate those who do not wish to have their images taken, usually with stickers or lanyards. In some circumstances, these may be ignored – maliciously or otherwise – by photographers or camera operators at events. It can be distressing to have a photograph linked to a username, especially if the link wasn't apparent before. Such a photograph is also treated as a form of personally identifying information (see H8) on the projects.

Note that images used to harass users are often posted "off-wiki". Learn more about dealing with harassment taking place on external websites.

Images on Wikimedia Commons

Most images found on our projects are hosted on Wikimedia Commons. You can tell whether an image is hosted there, rather than on a local Wikimedia project, by the "View on Commons" tab along the top of the file page. There will also be a note just below the image stating that the file is accessible on Commons.
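If you need to check many files, the hosting location can also be read from the standard MediaWiki Action API: a query with prop=imageinfo returns an "imagerepository" field that is "shared" for Commons-hosted files and "local" for local uploads. The Python sketch below is only illustrative – the endpoint URL, file title, and User-Agent string are placeholders, and it assumes the third-party "requests" library.

import requests

def file_repository(api_url, file_title):
    """Return 'shared' (hosted on Commons), 'local', or '' (no such file)."""
    response = requests.get(
        api_url,
        params={
            "action": "query",
            "titles": file_title,
            "prop": "imageinfo",
            "format": "json",
        },
        # Wikimedia asks API clients to send a descriptive User-Agent.
        headers={"User-Agent": "hosting-check-example/0.1 (illustrative)"},
        timeout=10,
    )
    response.raise_for_status()
    pages = response.json()["query"]["pages"]
    page = next(iter(pages.values()))  # one title queried, one page returned
    return page.get("imagerepository", "")

# Example (hypothetical file title):
print(file_repository("https://en.wikipedia.org/w/api.php", "File:Example.jpg"))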

Wikimedia Commons has a guideline around images of identifiable people which can be useful in situations like this. It states: "The subject's consent is usually needed for publishing a photograph of an identifiable individual taken in a private place, and Commons expects this even if local laws do not require it."

Wikimedia events are usually considered private places. This can be less clear-cut if the event is held in a normally-public place such as a library or a university. A well-run event will either provide something to sign if you are okay with being photographed or, more commonly, stickers or lanyards to indicate that you are not.

Even in public places, country-specific rules on consent exist. These rules can be complicated, and not all are legally binding. Check whether the country in which the offending photograph was taken has such a rule.

"Selfies" or other images of editors which are uploaded by themselves are fairly common on Wikimedia Commons. Of course, while these are usually fine, users should be aware that images of themselves have the potential to be abused for harassment in the future.

Images on local projects

Images hosted on local projects are handled slightly differently, and policies tend to vary by project. On most, the unauthorized posting of someone's photograph counts as the release of personally identifying information.

Images involving minors/child pornography

Where there is a suspicion that an image might contain child abuse or child pornography, please immediately report it to the Wikimedia Foundation through legal-reports@wikimedia.org to alert the Trust and Safety team. Include a URL so that the image can be reviewed quickly. Situations like this are rare, but it is important that the material is reviewed promptly – having all relevant information available at first contact speeds things up substantially. The Trust and Safety team at the Wikimedia Foundation is tasked with reporting and otherwise handling such images.

Even if an image of a minor involves no obvious abuse or suggestion of pornography, it can still be upsetting. There is currently no hard rule on the treatment of images of younger editors, whether uploaded by themselves or by others. Generally it is best to use your common sense: would this image be harmful if it remained? Is this image within the scope of Wikimedia Commons?

Test yourself!
#3: Image-related abuse
This module will periodically present you with multiple-choice questions you can use to test your knowledge of the module you are studying. While more than one of the suggested answers may seem correct, remember that you should try to pick the most correct of the options.
An editor who edits under a pseudonym, User:K, routinely edits controversial articles on the Chinese Wikipedia. They primarily edit about Hong Kong's past as a British colony, and do so in an arguably pro-British style. They are routinely involved in brief edit wars and heated discussions, but their conduct has never been deemed serious enough to block or ban them. However, User:K discovers that their photograph has been stolen from Tencent QQ and used to illustrate parody articles on a pro-China Wikipedia fork. It is not immediately clear who has done this, but User:K suspects it is the fault of pro-China User:L, with whom they have debated many times in the past.
Which option would you choose?
  1. Advise User:K to create an account on the Wikipedia fork and remove the image from the offending article themselves. Provide User:K with guidance on security and privacy on external websites to prevent this from happening again in the future.
  2. Advise User:K to contact the Wikipedia fork to request the removal of their image directly, and recommend they look into legal action against the offending site. Provide User:K with guidance on security and privacy on external websites to prevent this from happening again in the future.
  3. Block User:L immediately, as it's obvious they are the one involved with posting this image on external websites. Provide User:K with guidance on security and privacy on external websites to prevent this from happening again in the future.
  4. Provide User:K with guidance on security and privacy on external websites to prevent this from happening again in the future, but otherwise take no action.

Ready to see the correct answer?
The correct answer is: This situation is difficult, though the most effective of these actions is option 2. This gives the website (which may not otherwise be aware that it is hosting the image) a chance to remove it or to open discussions that begin that process. There may also be local laws protecting the privacy of User:K that the sharing of this image breaks.

There is not much to be gained from option 3: while it is possible that User:L is involved, there is no real way of knowing whether that is true, and blocking without process in this instance makes little sense. Option 1 might be useful if User:K is willing to get involved with such a community, though it seems likely to make the situation worse. Option 4, to do nothing at all, may not be as bad as it sounds – there is every chance the fork isn't widely read, and intervening may just make things worse further down the line.
