Research:Onboarding new Wikipedians/OB4

From Meta, a Wikimedia project coordination wiki
Shortcut:
R:OB4
This page in a nutshell: Adding additional task types -- adding links and clarifying articles tagged as confusing or vague -- to the GettingStarted process did not lead to a significant increase in active editors. The most popular task type was still simple copyediting.

Overview[edit]

Screenshot, landing page

This experiment tests the effect of a new Special:GettingStarted landing page that includes two task types in addition to copyediting -- articles needing clarification and adding wiki links. To learn about the impact of the new design, we ran it as a 50/50 split test against a control group that received no landing page, only the MediaWiki default.

Cohorts[edit]

We examine four distinct funnels in this experiment, plus a control group that received no new interface. In total, those are:

  • Control (no delivery of Special:GettingStarted)
  • Test (given new Special:GettingStarted). This breaks down further into...
    • Returnto (via the "No thanks, take me back where I was" button)
    • Gettingstarted-copyediting (users who viewed a "Fix spelling and grammar" task article)
    • Gettingstarted-clarification (users who viewed an "Improve clarity" task article)
    • Gettingstarted-addlinks (users who viewed a "Link pages together" article)

Metrics[edit]

We will focus on the following metrics for this experiment:

  • The live account rate (at least one click on an NS0 edit button post-registration)
  • The 1+ edits threshold (1+ NS0 edit in 24 hrs)
  • The 5+ edits threshold (5+ NS0 edits in 24 hrs)
  • Revert rate

Research questions[edit]

For this experimental set up, we're particularly interested in answering the following questions:

  • How do the conversion rates of GettingStarted users exposed to this version of the landing page compare to those of the control group? We also want to know how these conversion rates compare to R:OB2 and R:OB3.
  • How do the conversion rates for the three task types (copyediting, clarification, add links) compare to each other?
  • What is the quality of edits made in each task funnel, as measured by revert rate and hand-coding? See also: Qualitative analysis

Cohort analysis[edit]

We defined the following cohorts in order to measure their respective productivity. We included in the sample all new registered users on the English Wikipedia (excluding mobile registrations) during the first week of the split test.

OB4 cohorts

Cohort ID | Unique users | Description | % of total bucketed in group

All users:
e3_ob4a | 14,684 | Users in the control group | 100%
e3_ob4b | 14,756 | Users in the test group | 100%

GettingStarted funnel:
e3_ob4b_gettingstarted-any_page-impression | 1,566 | Unique users in the test group who landed on any GettingStarted article (regardless of type) upon successful account creation. | 10.6%
e3_ob4b_gettingstarted-addlinks_page-impression | 455 | Users in the test group who landed on an "add links" GettingStarted article upon successful account creation. | 3.1%
e3_ob4b_gettingstarted-clarify_page-impression | 495 | Users in the test group who landed on a "clarify" GettingStarted article upon successful account creation. | 3.3%
e3_ob4b_gettingstarted-copyedit_page-impression | 771 | Users in the test group who landed on a "copyedit" GettingStarted article upon successful account creation. | 5.2%

ReturnTo funnel:
e3_ob4a_returnto_page-impression | 2,071 | Users in the control group who returned to an editable page via the ReturnTo button upon successful account creation. | 14.1%
e3_ob4b_returnto_page-impression | 3,324 | Users in the test group who returned to an editable page via the ReturnTo button upon successful account creation. | 22.5%

Note: The total number of users who fall into a gettingstarted_page-impression cohort is 1,566, or 10.6% of the test group. This overall group is split further into three cohorts so that the three task types can be compared.

Live account rate[edit]

We measured the proportion of "live accounts": users in each cohort who clicked at least once on the edit button on a main-namespace article within 24 hours of registration (measurement taken 24h after the last valid registration).

OB4 ns0 24h live accounts (test vs control)

ID | Unique users | Live accounts | %
e3_ob4a | 14,684 | 4,671 | 31.8%
e3_ob4b | 14,756 | 5,133 | 34.8%

We found a significant 3.0 percentage point difference in the proportion of live accounts between test and control [X² = 29.20, N = 29,440, p < .01].
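The reported statistic can be reproduced from the counts above. Below is a minimal sketch in pure Python (the function and variable names are our own); we assume the reported X² was computed with Yates' continuity correction, which matches the published value:

```python
from math import erfc, sqrt

def chi2_2x2(a, b, c, d):
    """Pearson chi-square for the 2x2 table [[a, b], [c, d]],
    with Yates' continuity correction; returns (chi2, p)."""
    n = a + b + c + d
    row1, row2 = a + b, c + d
    col1, col2 = a + c, b + d
    chi2 = 0.0
    for obs, row, col in ((a, row1, col1), (b, row1, col2),
                          (c, row2, col1), (d, row2, col2)):
        exp = row * col / n  # expected count under independence
        chi2 += (abs(obs - exp) - 0.5) ** 2 / exp
    # For 1 degree of freedom, P(X > chi2) = erfc(sqrt(chi2 / 2))
    return chi2, erfc(sqrt(chi2 / 2))

# Live vs. non-live accounts, control (e3_ob4a) vs. test (e3_ob4b)
chi2, p = chi2_2x2(4671, 14684 - 4671, 5133, 14756 - 5133)
print(round(chi2, 2), p < 0.01)  # chi2 ≈ 29.2, p well below .01
```

The same function applies to any of the 2x2 comparisons in this report.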

Below are the live account rates for users clicking on different types of GettingStarted articles upon account registration.

OB4 ns0 24h live accounts (GettingStarted funnel)

ID | Unique users | Live accounts | %
e3_ob4b_gettingstarted-any_page-impression | 1,566 | 555 | 35.4%
e3_ob4b_gettingstarted-addlinks_page-impression | 455 | 146 | 32.1%
e3_ob4b_gettingstarted-clarify_page-impression | 495 | 150 | 30.3%
e3_ob4b_gettingstarted-copyedit_page-impression | 771 | 321 | 41.6%

1 edit in 24 hours[edit]

We measured the proportion of users in each cohort completing their first main-namespace edit within 24 hours of registration (measurement taken 24h after the last valid registration).

OB4 1+ ns0-edit 24h threshold (test vs control)

ID | Unique users | Editing users | %
e3_ob4a | 14,684 | 2,988 | 20.3%
e3_ob4b | 14,756 | 3,225 | 21.8%

We compared the proportion of threshold-hitting users in each group and found a significant 1.5 percentage point difference between test and control [X² = 9.94, N = 29,440, p < .01].
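This statistic can be checked with the closed-form shortcut for a 2x2 chi-square with Yates' correction (a sketch; cohort counts are taken from the table above, and the function name is our own):

```python
def chi2_yates_2x2(a, b, c, d):
    """Closed-form chi-square with Yates' correction for [[a, b], [c, d]]:
    X^2 = n * (|ad - bc| - n/2)^2 / ((a+b)(c+d)(a+c)(b+d))"""
    n = a + b + c + d
    num = n * (abs(a * d - b * c) - n / 2) ** 2
    den = (a + b) * (c + d) * (a + c) * (b + d)
    return num / den

# Editing vs. non-editing users, control (e3_ob4a) vs. test (e3_ob4b)
chi2 = chi2_yates_2x2(2988, 14684 - 2988, 3225, 14756 - 3225)
print(round(chi2, 2))  # ≈ 9.95, in line with the reported 9.94
```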

Below are the rates of threshold-hitters for users clicking on different types of GettingStarted articles upon account registration.

OB4 1+ ns0-edit 24h threshold (GettingStarted funnel)

ID | Unique users | Editing users | %
e3_ob4b_gettingstarted-any_page-impression | 1,566 | 270 | 17.2%
e3_ob4b_gettingstarted-addlinks_page-impression | 455 | 65 | 14.3%
e3_ob4b_gettingstarted-clarify_page-impression | 495 | 83 | 16.8%
e3_ob4b_gettingstarted-copyedit_page-impression | 771 | 159 | 20.6%

5+ edits in 24 hours[edit]

We also measured the proportion of users in each cohort completing 5 or more main-namespace edits within 24 hours of registration (measurement taken 24h after the last valid registration). The proportion of 5+ threshold hitters did not differ significantly between the two conditions.

OB4 5+ ns0-edit 24h threshold (test vs control)

ID | Unique users | Editing users | %
e3_ob4a | 14,684 | 460 | 3.1%
e3_ob4b | 14,756 | 503 | 3.4%

OB4 5+ ns0-edit 24h threshold (GettingStarted funnel)

ID | Unique users | Editing users | %
e3_ob4b_gettingstarted-any_page-impression | 1,566 | 46 | 3.0%
e3_ob4b_gettingstarted-addlinks_page-impression | 455 | 18 | 3.9%
e3_ob4b_gettingstarted-clarify_page-impression | 495 | 18 | 3.6%
e3_ob4b_gettingstarted-copyedit_page-impression | 771 | 21 | 2.8%
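The claim that the test-vs-control difference is not significant can be checked with a pooled two-proportion z-test (a sketch; equivalent to the uncorrected chi-square for a 2x2 table, with counts taken from the test-vs-control table above):

```python
from math import erfc, sqrt

def two_prop_z(x1, n1, x2, n2):
    """Two-sided two-proportion z-test with pooled variance;
    returns (z, p)."""
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)
    se = sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p2 - p1) / se
    return z, erfc(abs(z) / sqrt(2))  # two-sided p-value

# 5+ edit threshold: control (e3_ob4a) vs. test (e3_ob4b)
z, p = two_prop_z(460, 14684, 503, 14756)
print(round(z, 2), round(p, 2))  # p ≈ 0.18 > 0.05: not significant
```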

Conclusions[edit]

We observed that the test group that received GettingStarted had a +1.85% increase in its rate of first edits, which is still a statistically significant improvement. However, this is not an improvement in conversion rates over our first versions of the landing page with just one task type (see R:OB2), and since the majority of users are still choosing the copyediting task, it is possible we are simply cannibalizing from that group.

Remote usability tests also suggested that users were confused by the need to select both a task type and an article. We also observed that the daily number of GettingStarted edits tagged in RecentChanges decreased in the period after launch, even after all users were exposed to this landing page.

Overall, adding two additional task lists in this style did not lead to our desired goal of a statistically significant increase in active editors reaching their fifth edit, and may have had negative effects on first-time editors. With the user experience changes we made for the next split test, we hope to simplify the required choice to merely selecting one of the three tasks, and to give users a random article appropriate to that task type.