Research:Metrics/Brainstorming


Engagement and retention Metrics

Dario: goals for first iteration

  • Scope: what is covered / what not (for example, reader engagement metrics are for later)
  • Aim: distil up to 5 (?) key user engagement metrics, focus on new registered users.
    • engagement (volume of raw contributions)
    • retention (editor survival, edit rate decay)
    • work quality/productivity (revert rate, text survival metrics, quality rating)
    • type of contribution/diversity (taxonomy, size of contributions, cross-namespace contributions)
  • For each candidate metric:
    • Rationale (What does this metric measure? How does it work? How do we know it is appropriate?)
    • Pros/Cons of using it as opposed to other metrics
    • Technical dependencies (how complex is it to extract these metrics? Do we need supplementary data sources on top of the data available in the MW database?)
    • Optional: communication (overheads in explaining what these metrics stand for to a lay audience)
    • Optional: refs in the literature

Motivation & Justification

(Ryan F.) Having well-defined metrics can help us:

  1. Measure the effect of new feature implementations and experiments over time
  2. Assign value to product features and experiments in the backlog. This will aid prioritization of features as well as experiments.
  3. Brainstorm new feature ideas and experiments
  4. Gain an understanding of which metrics correlate most with the growth/decline of the community (the health of Wikipedia) - there is an implicit premise here that we will be able to "define" the strength of the community via a set of metrics

Broadly, what are we trying to measure?

Here, rather than defining directly quantifiable metrics, it may help to come up with some higher-level descriptions that can help us brainstorm new low-level metrics and provide some structure to our metric definitions. Some of these metric "categories" may be:

  1. editor retention
  2. editor contributions (edits - active editors, different categorization based on edit counts and patterns)
  3. editor contributions (other - click through rates, account registration, article feedback etc.)
  4. edit quality (edit survivability)

Independent variables against which to measure metrics:

  1. time unit (simplest and most common)
  2. edit count (range)
  3. article category
  4. registration date

Infrastructure

  1. datasources
    • slave dbs
    • squid logs
  2. building and storing the metrics
    • Python - we'll want to have an extensible code base here for building metrics (a rough sketch follows this list)
    • cron jobs
  3. how will we report these metrics
    • dashboards
    • data visualization
      • d3.js (time-series)
    • report card (utilizing David S.'s libraries?)
    • served on apache
  4. where will this live?
    • stat1 - metrics generation
    • stat2 - hosted reporting and visualization
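
A very rough sketch of what the extensible Python code base in item 2 could look like: each metric is a small class with a uniform interface, registered so that a cron job can discover and run all of them against a slave db and hand the results to the reporting layer. The names (Metric, METRIC_REGISTRY, run_all) are illustrative, not an existing library; only the user.user_registration column is stock MediaWiki.

 from abc import ABC, abstractmethod
 from datetime import datetime

 METRIC_REGISTRY = {}

 def register(cls):
     """Class decorator adding a metric to the global registry."""
     METRIC_REGISTRY[cls.name] = cls
     return cls

 class Metric(ABC):
     name = "abstract"

     @abstractmethod
     def compute(self, db, start: datetime, end: datetime) -> dict:
         """Return a {key: value} mapping for the given observation window."""

 @register
 class RegistrationCount(Metric):
     # Example: nr. of registrations in the window (global metric 1 below).
     name = "registrations"

     def compute(self, db, start, end):
         cur = db.cursor()  # db: DB-API connection to a MediaWiki slave
         cur.execute(
             "SELECT COUNT(*) FROM user WHERE user_registration BETWEEN %s AND %s",
             (start.strftime("%Y%m%d%H%M%S"), end.strftime("%Y%m%d%H%M%S")),
         )
         return {"registrations": cur.fetchone()[0]}

 def run_all(db, start, end):
     """Entry point a daily cron job could call."""
     return {name: cls().compute(db, start, end)
             for name, cls in METRIC_REGISTRY.items()}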

Resource allocation

- Aaron, Giovanni, Dario, Ryan F.

Nomenclature (from least to most active user classes)

(may need to include readers, see below; a sketch applying the numeric thresholds follows the list)

  1. Registered user
  2. Active user: registered user who has *visited* an edit page once [1][2], aka “live user” [3]
    • note that "user" may be ambiguous, since in some contexts it also refers to readers
  3. Users who performed at least one edit a day [4][5]
    • what should we call this class? contributing editor, contributor, editing user? [3]
  4. New wikipedian: editor who has performed 10 edits [6]
  5. Active editor: > 5 edits/month [6]
  6. Very active editor: > 100 edits/month [6]
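
For the two threshold-based classes, a trivial helper could bucket a registered user by monthly edit count. The thresholds follow [6]; the function name is an assumption for illustration, and the lower classes ("active user", per-day "editing user") depend on edit-page visits and daily activity rather than a monthly count, so they are left out here.

 def classify_by_monthly_edits(edit_count):
     """Bucket one month's edit count into the threshold-based classes above."""
     if edit_count > 100:
         return "very active editor"  # > 100 edits/month [6]
     if edit_count > 5:
         return "active editor"       # > 5 edits/month [6]
     return "registered user"         # below the active-editor threshold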

Metrics

Global

Number of users by month/week/day (a sample query sketch follows this list):

  1. Nr. of registrations [3]
  2. Nr. of active users [6]
  3. Nr. of editing users (see above) [3]
  4. Nr. of new wikipedians [6]
  5. Nr. of active editors [6]
  6. Nr. of very active editors [6]
  7. Nr. of sessions
  8. Nr. of edits per session
  9. Nr. of "notifiable" users with a verified email address [7]
  10. Nr. of active readers (identified by the volume of feedback submitters with a unique identifier (user_id or IP address); we can't get uniques unless we rely on anon tokens: we have persistent anon tokens for AFT4, but not for AFT5)
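
As an example of how the per-month counts above might be pulled from a slave db, here is a sketch of a query for metric 5 (nr. of active editors, > 5 edits/month, [6]). The table and column names (revision.rev_user, revision.rev_timestamp) follow the MediaWiki schema; holding the query in a Python constant is just illustrative.

 # Nr. of active editors (> 5 edits/month) per month, registered users only.
 MONTHLY_ACTIVE_EDITORS_SQL = """
 SELECT month, COUNT(*) AS active_editors
 FROM (
     SELECT rev_user,
            LEFT(rev_timestamp, 6) AS month,   -- YYYYMM
            COUNT(*) AS edits
     FROM revision
     WHERE rev_user != 0                       -- skip anonymous edits
     GROUP BY rev_user, month
     HAVING edits > 5
 ) AS per_user
 GROUP BY month
 ORDER BY month;
 """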

Readers (we haven't thought through reader engagement metrics but we will have to at some point; this might also be relevant for proto-users or power-readers or whatever we want to call them)

New users

(I recommend we have a separate set of retention metrics for newly registered users who may not have completed their first edit)

  1. Time to first edit click (how long after registering does a user become active?)
  2. Time to first edit (how long after registering does a user complete his/her first edit? see the sketch after this list) [8]
    • both are instances of the time to milestone metrics family, see [9]
  3. Number of non-edit modifications to the site (watchlisting, preferences) as tracked by user.user_touched.
    • needs to be sampled at regular intervals (maybe monitoring db replication UPDATEs via triggers)
  4. Email authentication (has the user inserted an email address at registration time? has the user authenticated it?)
  5. Time to email authentication (how long after registering does a user authenticate his/her email address?)
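
A sketch of metric 2 (time to first edit) for a single user, using the registration timestamp from user.user_registration and the user's revision timestamps; MediaWiki stores both as YYYYMMDDHHMMSS strings. Function and variable names are ours, not an existing API.

 from datetime import datetime

 MW_TS = "%Y%m%d%H%M%S"

 def time_to_first_edit(user_registration, rev_timestamps):
     """Return a timedelta, or None if the user has not edited yet."""
     if not rev_timestamps:
         return None
     registered = datetime.strptime(user_registration, MW_TS)
     first_edit = datetime.strptime(min(rev_timestamps), MW_TS)
     return first_edit - registered

 # time_to_first_edit("20120101120000", ["20120103093000"]) -> 1 day, 21:30:00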

Editors

(I imagine we want to restrict these metrics to registered users only, at least in a first phase? A subset may also apply to anonymous editors whose persistent activity we can obtain via clicktracking logs)

  1. Cumulative edit count [6][1][10]
  2. Daily edit count, aka editing activity [6][4]
    • activity is the inverse of the average time elapsed between two consecutive edits
    • we should also include edit rate per session; Aaron has worked on that (see the session sketch after this list)
    • we need to clarify how we deal with editors using power-editing tools
  3. Edit delta
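
Two sketches for the items above: editing activity as the inverse of the mean time between consecutive edits, and a naive session splitter that closes a session after an inactivity gap of one hour. The one-hour cutoff and all names are assumptions, not an agreed definition (see the session questions in the Retention section below).

 from datetime import datetime, timedelta

 MW_TS = "%Y%m%d%H%M%S"
 SESSION_GAP = timedelta(hours=1)  # assumed inactivity cutoff

 def _parse(ts):
     return datetime.strptime(ts, MW_TS)

 def activity(rev_timestamps):
     """Edits per second: inverse of the mean inter-edit gap (None if < 2 edits)."""
     times = sorted(_parse(t) for t in rev_timestamps)
     if len(times) < 2:
         return None
     gaps = [(b - a).total_seconds() for a, b in zip(times, times[1:])]
     return 1.0 / (sum(gaps) / len(gaps))

 def sessions(rev_timestamps):
     """Group one user's edits into sessions separated by > SESSION_GAP."""
     times = sorted(_parse(t) for t in rev_timestamps)
     result, current = [], []
     for t in times:
         if current and t - current[-1] > SESSION_GAP:
             result.append(current)
             current = []
         current.append(t)
     if current:
         result.append(current)
     return result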

Retention

  1. see Ryan's suggestion of using a k-n retention metric, for editors making at least k edit(s) at a minimum of n days [9] (one possible reading is sketched after this list).
  2. Time to milestone, see [9]
  3. Productivity (or maybe "quality" to avoid confusion with edit volume)
    1. Revert rate within a given time window
    2. Binary variable (based on criteria such as: at least one revision that didn't get reverted in the first week, see [11])
  4. We need to distinguish between metrics that are measured over a limited period of time (i.e. an observation window) and metrics that aren't. k-n retention seems to fall into the former category
  5. Average editing rate per session
    • How is a session defined?
  6. Average session length (in minutes)
    • Ditto
  7. Edit metadata:
    • Tool used (normal interface, power-editing tool, API, etc.)
    • Session id (if required?)
    • User agent
    • The usual metadata (Namespace, etc)
    • Edit summary (amount of text inserted, removed, etc.)
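
One possible reading of the k-n retention metric in item 1, sketched to make the discussion concrete: an editor counts as retained if they made at least k edits at least n days after registering. This is only one interpretation; the function name and the cutoff semantics are assumptions, not a settled definition.

 from datetime import datetime, timedelta

 MW_TS = "%Y%m%d%H%M%S"

 def is_retained(user_registration, rev_timestamps, k, n):
     """True if the user made >= k edits at least n days after registering."""
     cutoff = datetime.strptime(user_registration, MW_TS) + timedelta(days=n)
     late_edits = [t for t in rev_timestamps
                   if datetime.strptime(t, MW_TS) >= cutoff]
     return len(late_edits) >= k

 # 1-30 retention: at least one edit made 30 or more days after registration.
 # is_retained("20120101000000", ["20120215120000"], k=1, n=30) -> True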

Groups (cohorts or treatments)

(These metrics should include all of the global metrics above, at least those which can be measured at cohort or treatment group level. Note that at some point we may also want to consider WikiProject-level metrics, or reuse and adapt any metrics that Global Dev is developing for outreach events or Global Education to measure group-level productivity.)

  1. Cohort retention [6]: percentage of editors who joined in a given month who are still active (= made at least one edit) at the time of measuring (see individual retention above; a group-level sketch follows below)
    • we should have more sophisticated group metrics than just means of the individual metrics, as many of these will not be normally distributed
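
A minimal sketch of cohort retention at group level, assuming we already have the set of user ids that registered in the cohort month and the set of user ids with at least one edit in the month of measurement; the input shapes and names are illustrative.

 def cohort_retention(cohort_user_ids, active_user_ids):
     """Fraction of the cohort with at least one edit in the measurement month."""
     cohort = set(cohort_user_ids)
     if not cohort:
         return 0.0
     return len(cohort & set(active_user_ids)) / len(cohort)

 # cohort_retention([1, 2, 3, 4], [2, 4, 99]) -> 0.5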

Articles

(This may not be within the scope of editor engagement, but we may have to include some article-level metrics for some of our analyses)

  1. Macro - Categorization

Data sources

  1. revision timestamps (revision.rev_timestamp)
  2. daily contributions (user_daily_contribs.contribs)
    • data only from 2011 (?)
    • does it capture actions other than revisions?
  3. user_touched: “the last time a user made a change on the site, including logins, changes to pages (any namespace), watchlistings, and preference changes” [12]
    • does not update upon logins based on cookies
    • does not only measure contribution: is it any good for integration with other data? (a sampling sketch follows this list)
  4. first click on the edit button (edit_page_tracking.ept_timestamp)
    • data only available from July 2011
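
Since user_touched only keeps the latest value, the regular sampling suggested in the New users section could be as simple as a daily cron job copying the column from a slave into a history table in our own analytics database. The table user_touched_history is not part of the MediaWiki schema; it, and the function below, are assumptions.

 from datetime import datetime

 def snapshot_user_touched(slave_db, analytics_db):
     """Daily cron job: copy user.user_touched into a timestamped history table."""
     snapshot_ts = datetime.utcnow().strftime("%Y%m%d%H%M%S")
     read = slave_db.cursor()
     read.execute("SELECT user_id, user_touched FROM user")
     rows = [(snapshot_ts, uid, touched) for uid, touched in read.fetchall()]
     write = analytics_db.cursor()
     write.executemany(
         "INSERT INTO user_touched_history (snapshot_ts, user_id, user_touched) "
         "VALUES (%s, %s, %s)", rows)
     analytics_db.commit()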

References & Notes