Similar to other online communities, Wikipedia is not immune to harassment. Such toxic behavior can harm both the community and its users: surveys indicate that 54% of harassment victims reported decreased participation in the projects they were working on. Research in psychology suggests that exposure to harassment brings negative emotions into a community. This project aims to find out whether this holds when the effects of harassment are examined at a larger scale.
We want to answer the following questions:
- Do users leave Wikipedia after the attack?
- Do users get more aggressive after the attack?
- What makes editors more susceptible to being harassed?
- Modeling behavior and traits (observed before the harassment takes place) that are likely to result in harassment.
Following the Detox research project, we use automatically labeled attacks to divide users into victims and non-victims. We define an attack incident for a victim as the first ever attack on her user talk page, and an attack incident for a non-victim as a normal non-bot post on her user talk page. We match the two groups on the following confounding factors, describe their behaviors in a time period before the incident, and compare the effects in a time period of the same length after the incident. At the current stage, we use hard matching on all categories.
- Activity The activity of a user is the number of posts she made on all talk pages.
- Administrator Status The Wikipedia Talk Corpus labels each post with whether it was posted by an admin. Using that information, we define a user's administrator status in a time period as the average admin status over all her posts in that period.
- Involvement in Controversial Pages Involvement in controversial pages is measured by the number of posts appearing on pages containing a controversial template.
- Aggressiveness The aggressiveness of a post is defined by the labeled probability of being a personal attack.
- Number of Attacks The number of posts classified as personal attacks.
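The hard-matching step on these factors can be sketched as follows. This is a minimal illustration with hypothetical feature names and toy data, not the project's actual pipeline or schema:

```python
from collections import defaultdict

def hard_match(victims, non_victims, keys):
    """Pair each victim with a non-victim whose pre-incident
    features agree exactly on every matching key."""
    # Index non-victims by their tuple of matching-feature values.
    pool = defaultdict(list)
    for user in non_victims:
        pool[tuple(user[k] for k in keys)].append(user)

    pairs = []
    for victim in victims:
        bucket = pool[tuple(victim[k] for k in keys)]
        if bucket:  # victims with no exact match are dropped
            pairs.append((victim, bucket.pop()))
    return pairs

# Hypothetical pre-incident feature dictionaries.
keys = ["activity", "admin", "controversial", "n_attacks"]
victims = [{"id": 1, "activity": 5, "admin": 0, "controversial": 1, "n_attacks": 0}]
non_victims = [{"id": 9, "activity": 5, "admin": 0, "controversial": 1, "n_attacks": 0}]
print(len(hard_match(victims, non_victims, keys)))  # 1 matched pair
```

Because every key must agree exactly, hard matching trades sample size for comparability: victims without an exact counterpart are excluded from the analysis.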
We plan to collaborate with Jigsaw on manually verifying a subset of the automatically-obtained labels of harassment.
Answering the Research Questions
- Do users leave Wikipedia after the attack? For this question, we define the departure date of a user as the timestamp of her last post in our dataset if it is 6 months or more before the data end date; otherwise, we consider her as never leaving. We compare the number of users leaving after a certain time period to do the retention analysis.
- Do users get more aggressive after the attack? For this question, we compare the average aggressiveness of users after the attack incident.
- What makes editors more susceptible to being harassed? For this, the prediction task is whether a user will be harassed or not.
- Modeling behavior and traits (observed before the harassment takes place) that are likely to result in harassment. For this, we plan to build computational tools for tracking specific conversational and behavioral dynamics.
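The departure and retention definitions above can be sketched as follows. The corpus end date and the 183-day cutoff are illustrative assumptions, not the project's actual values:

```python
from datetime import datetime, timedelta

DATA_END = datetime(2015, 12, 31)    # hypothetical corpus end date
DEPARTURE_GAP = timedelta(days=183)  # ~6 months of inactivity

def departure_date(last_post):
    """Return the departure timestamp, or None if the user never
    left (her last post is within 6 months of the data end)."""
    if DATA_END - last_post >= DEPARTURE_GAP:
        return last_post
    return None

def retention(last_posts, incident_times, horizon):
    """Fraction of users still active `horizon` after their incident."""
    stayed = 0
    for last, incident in zip(last_posts, incident_times):
        left = departure_date(last)
        if left is None or left > incident + horizon:
            stayed += 1
    return stayed / len(last_posts)
```

Comparing `retention` curves for matched victims and non-victims at several horizons then gives the retention analysis described above.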
We currently divide users into active and non-active groups based on the number of talk-page posts they made in the month of the attack, obtaining 2768 pairs in the non-active group and 1409 pairs in the active group. Initial results show that the two groups react differently after the attack. In general, non-active victims tend to stay in the short term after the incident, but more of them leave the community in the long term, while among active users more victims stay than in the matched non-victim group at every horizon. Active victims also show larger changes in the aggressiveness of their posts.
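The activity split can be sketched as below. The threshold and the data structures are illustrative assumptions; the project's actual cutoff is not stated here:

```python
def split_by_activity(pairs, posts_in_attack_month, threshold=3):
    """Divide matched (victim, non-victim) pairs into active and
    non-active groups by the victim's talk-page post count in the
    month of the attack. `threshold` is an illustrative cutoff."""
    active, non_active = [], []
    for victim, non_victim in pairs:
        if posts_in_attack_month[victim] >= threshold:
            active.append((victim, non_victim))
        else:
            non_active.append((victim, non_victim))
    return active, non_active

# Toy usage with hypothetical user IDs and post counts.
pairs = [("u1", "u9"), ("u2", "u8")]
counts = {"u1": 5, "u2": 1}
active, non_active = split_by_activity(pairs, counts)
```

Splitting at the pair level keeps each victim together with her matched non-victim, so the two activity groups remain balanced on the confounding factors.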
- Support and Safety Team. "Harassment survey", Wikimedia Foundation, 2015.
- Kayany, J. M. "Contexts of uninhibited online behavior: Flaming in social newsgroups on Usenet". Journal of the American Society for Information Science, 1998.
- Cheng, J., Bernstein, M., Danescu-Niculescu-Mizil, C., & Leskovec, J. "Anyone Can Become a Troll: Causes of Trolling Behavior in Online Discussions", CSCW, 2017.
- Wulczyn, E., Thain, N., & Dixon, L. "Ex Machina: Personal Attacks Seen at Scale.", WWW, 2017.