Research:Topical coverage of Edit Wars

From Meta, a Wikimedia project coordination wiki
23:06, 16 September 2017 (UTC)
Duration:  2017-September — 2017-December

This page documents a research project in progress.
Information may be incomplete and change as the project progresses.

This project is run by the Research Team as part of the Community health initiative.

This project focuses on the usage of toxic language [1], including specific behaviors such as wikihounding [2]. Existing work on this problem focuses mainly on the content of user interactions. Our research adopts a complementary, language-agnostic approach, which focuses on edit wars and attempts to differentiate between topic-centered and person-centered conflicts.


Edit wars in Wikipedia have been widely studied. An edit war is usually considered to be the consequence of differing opinions about a specific topic between two or more users. For example, users with different political views might have different opinions on many articles related to politics, and these differences can escalate into a multi-article edit war. Such actions can be considered toxic, but not necessarily stalking behavior. However, if edit wars start happening across multiple topics, this can be an indicator of a person-centered attack (instead of a topic-centered one), which might be categorized as wikihounding.

Taking advantage of the fact that edit wars can be detected with a content-agnostic approach (without analyzing the text), we propose to study the topical span of those wars, characterizing usual and unusual (potentially toxic) behaviors.

The main tasks to develop such a model are:

  • Define and implement a robust topic model.
    • Define a distance metric for topics (e.g., Geography is N steps away from Politics, and M steps away from Sports).
  • Generate a representative dataset of edit wars in Wikipedia.
  • Detect pairs or groups of users involved in more than X controversies (defining X is part of the study).
  • Apply an outlier detection mechanism to find potential cases of harassment.


  • Define a topic model that allows measuring the topic distance between Wikipedia pages.
  • Characterize user behavior according to the topics that they cover (edit on) and the number of reverts that they make.
  • Compute the probability that a pair of users co-revise a page, and the probability that this co-revision is a revert.
  • Based on the aforementioned co-revision probability, identify anomalous behaviors that are potentially related to stalking or wikihounding.

Topic Model[edit]

Pages to Topic[edit]

Topic Distance[edit]

  • Wikiprojects can be represented as a graph (we use this library).
  • Then, given that each page can belong to more than one Wikiproject, we define the distance between two pages as the minimum shortest-path length [3] among all pairs of their Wikiproject nodes on the Wikiprojects graph.
    • Example: Given pages X and Y, with X in Wikiprojects a and b, and Y in Wikiprojects c and d, we compute the length of the shortest path between (a,c), (a,d), (b,c) and (b,d), and return the minimum among these results. In Python:
    # Assumptions: the Wikiprojects graph is a networkx graph (the "this
    # library" link above is missing, so networkx is assumed here), and
    # pagesToWikiprojects is a global dict mapping a page to its Wikiprojects.
    from itertools import product
    import networkx as nx

    def distancePages(Graph, page1, page2):
        """Graph: the Wikiprojects graph.
        Returns -2 on error (page without a Wikiproject, or no path),
                -1 if the two pages are the same,
                otherwise the minimum shortest-path length."""
        global pagesToWikiprojects
        if page1 == page2:
            return -1
        pages1Projects = pagesToWikiprojects.get(page1, [])
        pages2Projects = pagesToWikiprojects.get(page2, [])
        if not pages1Projects or not pages2Projects:
            return -2
        results = []
        for x, y in product(pages1Projects, pages2Projects):
            try:
                results.append(nx.shortest_path_length(Graph, x, y))
            except nx.NetworkXNoPath:
                continue
        if not results:
            return -2
        return min(results)

User Behavior[edit]

Topical Coverage[edit]

  • Topical stability (us): For each user U, we obtain the topical distance between each revision and the next one. For example, given a user U doing three revisions, the first in the topic 'Sports', the second in the same topic, and the third in the topic 'Biology' (at distance 4 from Sports), the probability of user U editing at topical distance 0 is 2/3, at distance 4 is 1/3, and 0 for all other distances. This metric gives an idea of a user's stability in terms of topics.
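As a sketch, the topical-stability distribution can be computed from a user's list of per-revision distances. The helper name and the convention of counting the first revision as distance 0 (which reproduces the 2/3 and 1/3 in the example above) are our assumptions, not part of the study's code:

```python
from collections import Counter

def topical_stability(distances):
    """Empirical distribution of topical distances over a user's revisions:
    P(user edits at topical distance d), for each observed d."""
    counts = Counter(distances)
    total = len(distances)
    return {d: c / total for d, c in counts.items()}

# Example from the text: Sports, Sports, Biology (Biology at distance 4
# from Sports), with the first revision counted as distance 0.
print(topical_stability([0, 0, 4]))
```

This yields probability 2/3 for distance 0 and 1/3 for distance 4, matching the example.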

Reverting Behavior[edit]

  • For each user we compute the proportion of reverts (within the dataset) relative to their total number of revisions. Since our dataset only contains revisions by users with 10 or more revisions, note that we only consider reverts among these users.
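A minimal sketch of this computation, assuming per-user revision and revert counts have already been extracted from the dataset (function and variable names are illustrative):

```python
def revert_proportions(revision_counts, revert_counts, min_revisions=10):
    """Proportion of each user's revisions that are reverts, restricted
    to users with at least `min_revisions` revisions in the dataset."""
    return {user: revert_counts.get(user, 0) / total
            for user, total in revision_counts.items()
            if total >= min_revisions}

revisions = {'alice': 40, 'bob': 5, 'carol': 10}
reverts = {'alice': 8, 'carol': 1}
# bob is dropped: fewer than 10 revisions in the dataset.
print(revert_proportions(revisions, reverts))
```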

Wikiprojects Graph[edit]

Terminology / Conventions[edit]

  • We use Wikiprojects as a proxy for topics.


Reverting Behavior[edit]

Not surprisingly, we found a strong correlation between the number of revisions and the number of reverts made, suggesting that reverting others is part of the work of active users. A detailed analysis of reverting behavior can be found here: [2]

Characterization of user topic-focus[edit]

Following our definition of topical stability, we see that 83.95% of 'next' revisions happen on the same page, and 99.25% on the same topic. Moreover, 52.88% of users never jump out of the same topic; however, 41.88% jump more than 4 steps at least once.

Characterization Topic focus of Wikipedia Editors

For more details check: [3]

Characterization of Topical Distance in Multipage Edit Wars[edit]

In order to get a notion of how frequent reverts and edit wars are across multiple topics, we consider all pairs of users (U, V) where U has reverted V more than 2 times, and compute the topical distance between all pairs of reverted pages. We then compute the mode (most frequent value) for each pair of users, and report the frequency of those values. As expected, most wars (71%) focus on one page, and 22% on the same topic but a different page. The remaining 7% are cross-topic reverts, reinforcing our intuition that cross-topic edit wars are rare.
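The per-pair computation can be sketched as follows. Here `topical_distance` stands in for a function like distancePages above, and the function name is illustrative:

```python
from collections import Counter
from itertools import combinations

def war_distance_mode(reverted_pages, topical_distance):
    """Mode of the topical distances among all pairs of pages on which
    user U reverted user V. `reverted_pages` has one entry per revert,
    so repeated pages yield distance -1 (same page) pairs."""
    dists = [topical_distance(p, q)
             for p, q in combinations(reverted_pages, 2)]
    return Counter(dists).most_common(1)[0][0]

# Toy distance for illustration: -1 for the same page, 0 otherwise.
toy = lambda p, q: -1 if p == q else 0
print(war_distance_mode(['A', 'A', 'A'], toy))  # single-page war: mode -1
```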

Characterization of Topical Distance in Multipage Edit Wars
  Distance           Frequency
  -1 (same page)     0.7107
   0 (same topic)    0.2297
   4                 0.0143
   5                 0.0121
   2                 0.0103
   3                 0.0074
   6                 0.0056
   7                 0.0034
   8                 0.0027
   1                 0.0016
   9                 0.0012
  10                 0.0007
  11                 0.0002
  12                 0.0001


We found that almost 40% of users jump 4 or more steps in the Wikiprojects graph, making it difficult to predict the likelihood of two users co-editing the same page. Details about this study can be found here: [4]

Conclusions and Main Outputs[edit]

  • Based on previous work [5], we have implemented and released a model that allows measuring the topical distance between Wikipedia pages.
  • We have found that around 99% of 'next revisions' are done within the same topic.
  • We have found that just 7% of edit wars are cross-topic.
  • However, users involved in these cross-topic edit wars are generally very active users, making it difficult to assume that those wars are due to person-centered conflicts.


Raw Data[edit]

  • We consider all users with more than 10 revisions in the selected period (1 January 2017 to 16 November 2017).
  • All the revisions from those users can be obtained from this query:
  • List of possible Bots was obtained using this approach: [Research_talk:Identifying_bot_accounts]

Reverts dataset[edit]

  • The interactions dataset can be downloaded here: [6]
    Format: interactions[user1][user2] = [[pageid, timestamp, deltatime, revision_id_reverted, revision_id_reverting], another revert, etc.]
    user1: the reverting user
    user2: the reverted user
    pageid: page_id of the page where the revert happened
    timestamp: timestamp when the reverted revision was created (by user2)
    deltatime: time elapsed from the reverted revision to the reverting revision (done by user1)
    revision_id_reverted: id of the reverted revision (by user2)
    revision_id_reverting: id of the reverting revision (by user1)
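Assuming the dataset has been loaded into the nested structure described above (e.g. with json.load), it can be traversed like this; the sample values are made up for illustration:

```python
# Hypothetical sample in the documented format:
# interactions[user1][user2] -> list of reverts by user1 on user2.
interactions = {
    'userA': {
        'userB': [[123, '2017-05-01T12:00:00Z', 360, 1111, 1112]],
        'userC': [[456, '2017-06-02T08:30:00Z', 60, 2221, 2222],
                  [456, '2017-06-03T09:00:00Z', 90, 2331, 2332]],
    },
}

total_reverts = 0
for reverting_user, targets in interactions.items():
    for reverted_user, reverts in targets.items():
        for pageid, timestamp, deltatime, rev_reverted, rev_reverting in reverts:
            total_reverts += 1
print(total_reverts)  # 3
```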

Wikiprojects Graph[edit]


  • Find all the code used in this study here:
    • Generate interactions dataset: [8]
    • Reverting Behavior study: [9]
    • Topic-Span of edit wars: [10]

Future Work[edit]

  • Design a probabilistic model for outlier detection, considering the (un)predictability of two users co-editing the same page.
  • Improve the mapping from pages to Wikiprojects: the aforementioned [query] returns around 22% of pages matching no Wikiproject. However, in a manual review we found cases where the Wikiproject is not correctly (?) assigned as a category on the Talk page (for example, en:Classon_Avenue_(IND_Crosstown_Line) belongs to Wikiproject Trains, but that project is not listed as a category).


Q1, Q2


  3. shortest path
  4. script for parsing wikiprojects graph
  5. script for parsing wikiprojects graph