Research:Civil Behavior Interviews

Created
20:33, 12 April 2018 (UTC)
Duration:  2018-04 – 2018-10

This page is an incomplete draft of a research project.
Information is incomplete and is likely to change substantially before the project starts.



This work aims to give Wikipedians a deeper understanding of how the truth is discussed and agreed upon by editors, viewed through a lens of civility. It treats civility as a collaborative process that occurs on Wikipedia Talk pages, and uses this view to more fully understand what incivility looks like and how it can be detected in discussions among Wikipedians. Incivility in discussions like those taking place on Wikipedia should be addressed because it can shape how participants arrive at an understanding of the truth, which may have consequences for what information ends up on the resulting Wikipedia pages (Papacharissi, 2004).

Civility can be difficult to measure and discuss, partly because the term is used so broadly. One definition used by researchers (Papacharissi, 2004) focuses on the collaboration needed to accomplish common goals, rather than on mere politeness. In this view, civil discussions are those that promote respect for all participants and enhance understanding of the issue being discussed, regardless of whether minds are changed (Papacharissi, 2004). Civil conversations may be heated but still respect the identity of the individuals involved (Papacharissi, 2004). Wikipedia is set up to encourage exactly this kind of heated but civil debate: Wikipedians must collaborate to determine what constitutes truth when writing an entry, and must also ensure that the entry is written neutrally. If these goals are accomplished in a way that allows all discussants to contribute and be heard, then civility is present. However, well-informed civil discourse is not the only kind of discourse that occurs online or among Wikipedia editors, and online incivility often derails these necessary conversations (Coe, Kenski & Rains, 2014; Yasseri, Sumi, Rung, Kornai & Kertesz, 2012). By definition, the presence of incivility makes discussions less productive (Papacharissi, 2004). It may also affect how a neutral point of view is arrived at by editors, because incivility can privilege some demographics and exclude others from the discussion, skewing which viewpoints are available for consideration (Mouffe, 1999; Papacharissi, 2004). This skewing of viewpoints may in turn influence how new users are attracted and retained (MacAulay & Visser, 2016; Menking & Erickson, 2015).

Adding a system that automatically flags certain behaviors as civil or uncivil may help users deal with incivility, but the current definition of incivility, while helpful, is broad and difficult to translate into such a detection tool. Previous works that address this problem often operationalize incivility as profanity, slurs, and name-calling (Kwon & Gruzd, 2017; Santana, 2013; Vargo & Hopp, 2014). These operationalized definitions are far narrower than Papacharissi's (2004) definition, and may miss more subtle methods of intentionally derailing a conversation (Bishop, 2014; Gervais, 2015; Muddiman & Stroud, 2017). This is particularly true in an environment such as Wikipedia, where edits and reverts can be used productively or unproductively, and the online environment in general lends itself to forms of trolling and derailing that may be apparent to users but easily missed by automated systems (Bishop, 2014; Gervais, 2015; Muddiman & Stroud, 2017). As a result, this work uses the broader definition of incivility and begins by asking editors about their experiences on Wikipedia Talk pages.
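To make the contrast concrete, the minimal sketch below shows what a detector built on that narrow operationalization might look like. It is a hypothetical Python example; the word list and function name are invented for illustration and are not drawn from any of the cited works. It flags listed profanity and name-calling, but has no way to notice subtler derailing moves such as bad-faith policy accusations or sarcasm.

    # Illustrative sketch only: a keyword-based incivility flagger of the kind
    # implied by the narrow operationalizations above. The word list and the
    # function name are hypothetical, not taken from any cited work.

    PROFANITY_AND_NAME_CALLING = {"idiot", "stupid", "moron"}  # placeholder terms

    def is_uncivil_narrow(comment: str) -> bool:
        """Flag a talk-page comment as uncivil if it contains a listed term."""
        tokens = {word.strip(".,!?;:\"'()").lower() for word in comment.split()}
        return bool(tokens & PROFANITY_AND_NAME_CALLING)

    # Overt name-calling is caught:
    print(is_uncivil_narrow("Only an idiot would revert this."))  # True

    # A subtler derailing move passes unflagged, even though it may be uncivil
    # under Papacharissi's broader, collaboration-centered definition:
    print(is_uncivil_narrow("Per NPOV, your edit is pure POV pushing and will "
                            "keep being reverted."))  # False

Even with a much longer word list, an approach like this cannot capture the policy-based and identity-based forms of incivility described later on this page, which is part of why this project instead starts from editors' own accounts of their experiences.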

Ultimately, this work seeks to more fully understand incivility and civility as they occur on Wikipedia Talk pages.

Research Team:

This research team is led by

  • E. Whittaker - a PhD student at the University of Michigan
  • Aaron Halfaker - a Principal Research Scientist at Wikimedia

Methods


Interview Study

It is important to understand what Wikipedia editors view as a good discussion. To that end, semi-structured interviews were used to gain a deeper understanding of which interactions Wikipedia editors value and how they think about those interactions in relation to each other.

Participant Sample and Measures

In order to achieve data saturation, a sample of 15 editors was interviewed. Ideally, editors would have been randomly selected for interviews, but convenience sampling was necessary instead. Because semi-structured interviews are not standardized measures, an interview protocol was developed and is available below. It includes a cognitive walkthrough of two talk pages.

Here is the Interview Protocol

Data Analysis

The interview responses were then examined for themes and patterns. A modified grounded theory approach was used to analyze the interview data (Deterding & Waters, 2018; Glaser & Strauss, 1967; Glaser, 1992; Strauss, 1987; Strauss & Corbin, 1990). The modification was based on recommendations in Deterding and Waters' (2018) paper on working with interview data in qualitative analysis software.


Timeline

Main deliverables:

  • Interview data

Interview Study:

Create interview protocol

Pre-test interview protocol

Recruitment

Data analysis


Policy, Ethics and Human Subjects Research

IRB forms will be posted here as they are acquired, and before any research is conducted.

Results

Interviews have been conducted and the transcripts analyzed. We are now in the process of publishing this work academically and have submitted our findings to that end. These findings have not yet been accepted for publication, but we would like to share them with the community as part of our publication process. Because our findings are quite rich, we have split them into two papers so that each has adequate space to discuss the implications of its particular findings, and the findings are presented the same way below. The papers (which will be linked here upon acceptance and publication) include quotes, but quotes have been omitted from this writeup.

A broad overview of incivility on Wikipedia Talk Pages:

Civility is understood as Collaboration on Wikipedia

Participants were very aware that editing Wikipedia has a specific goal, namely the creation of articles, and this was reflected in their understanding of civility on Wikipedia. The overwhelming majority of participants indicated that they understood civility on Wikipedia Talk Pages as a process of production, and incivility as the disruption of that production. Participants indicated that disagreements, even heated disagreements, are an accepted part of Wikipedia's culture and necessary for Wikipedia to function as intended.

Paper 1: What derailed the collaborative process of editing?

Participants indicated that a number of behaviors would disrupt editing.

Policy abuse. One tactic that our participants described as extremely common can be thought of as policy abuse. This involves editors abusing or manipulating Wikipedia's policies in order to stop discussion and drive an editor away from editing in a particular space (or in general). This kind of abuse may take a number of forms. It may involve experienced editors using their knowledge of Wikipedia's policies and processes to be cruel while avoiding censure by administrators. It may also take the form of accusing someone of point-of-view pushing (POV pushing) or of violating the NPOV policy. This particular accusation can be difficult to disprove, and is therefore easily used to remove content that the accusing editor does not like (although, of course, the accusation is sometimes correct and necessary to prevent biased editing).

Identity-based incivility. Identity-based incivility was described by participants as involving attacks on a person's identity in order to discredit them. It may also involve making claims of authority over other users in order to have more input. This form of incivility is often subtle.

Technology abuse. Participants also indicated that technological features of Wikipedia are exploited by editors behaving uncivilly. The markup language employed by Wikipedia can be difficult to learn, and more experienced editors appear to take advantage of this. Edit summaries, which are often difficult to view or follow over time, are likewise taken advantage of by experienced editors who wish to make uncivil comments while avoiding being reprimanded for them.

Content-based incivility. This final theme is what most people think of when they think of incivility. Content-based incivility involves a message or interaction whose content is uncivil but that does not involve any appeals to identity; examples include profanity, sarcasm, and forceful language. It should be noted, however, that participants indicated that while profanity can be uncivil, it is not necessarily so: profanity can also be used in a civil way.

What does this all mean?

Comfortingly, our participants' understanding of civility fits well with previous work on civility and deliberation (Bishop, 2014; Gervais, 2015; Muddiman & Stroud, 2017; Papacharissi, 2004). However, our participants' insights add some nuance to our understanding of what kinds of interactions disrupt online collaboration. Most notable among our results is the finding that the policies and technical affordances of Wikipedia are manipulated by editors in uncivil ways. Most work on incivility focuses on the language of incivility, but this work indicates that additional attention should be given to the policies and affordances of the platforms being examined. What the language of incivility is continues to be a difficult question to answer, and for that reason I would generally recommend against building any kind of automatic detection tool to deal with incivility (there are many caveats to this, but broadly speaking it is not the best idea).

Paper 2: Gender and Policy

As discussed above, incivility on the Talk Pages sometimes involves manipulating policy in order to claim authority and drive people away from editing. This had some gendered consequences.

The NPOV policy, in particular, was used in a gendered way by some editors. The policy is intended to prevent biased articles from being written, but some participants reported that it was sometimes used to prevent women from writing articles, or to prevent articles from being written about women. According to our participants, many editors view men's point of view as inherently neutral while viewing women's point of view as inherently biased. Standards for women editors are therefore higher, and women may be discouraged from editing certain topics.

What does this all mean? Policies created to prevent bias are instead being used to uphold cultural biases and unintentionally value one point of view over another. These policies are (mis)used by editors to silence others, particularly women, or to prevent speech about women.

References