
Research talk:Onboarding new Wikipedians/OB6

From Meta, a Wikimedia project coordination wiki

Work log


Wednesday, February 12th

Today, I'm building a report to track the usage of GettingStarted and GuidedTour. There are a few resources that I can draw from:

Things that would be nice to know:

--Halfak (WMF) (talk) 18:22, 12 February 2014 (UTC)

So, this is going to be difficult. My log data from those schemas can only be joined with data from enwiki, but I'd like to be able to perform this analysis for all wikis where GS is turned on. Right now, that list is small, but it could grow. I think I'm going to have to build up a whole environment to help me gather this data. This could be something that supports a lot of other work too.
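A minimal sketch of what that environment's core loop might look like: run the same funnel tally against every GS-enabled wiki and collect the results by wiki. The wiki names, event records, and field names below are hypothetical stand-ins; the real data would come from the EventLogging schemas, not in-memory dicts.

```python
# Sketch: aggregate the same GettingStarted funnel metric across several
# wikis. All data below is illustrative, not real schema output.
from collections import Counter

def funnel_counts(events):
    """Count users reaching each funnel step, one user per step at most."""
    steps = Counter()
    seen = set()
    for e in events:
        key = (e["user"], e["action"])
        if key not in seen:  # de-duplicate repeat events per user
            seen.add(key)
            steps[e["action"]] += 1
    return steps

def cross_wiki_report(events_by_wiki):
    """Run the identical tally for every wiki where GS is deployed."""
    return {wiki: funnel_counts(evts) for wiki, evts in events_by_wiki.items()}

# Hypothetical sample data for two wikis.
sample = {
    "enwiki": [
        {"user": 1, "action": "gettingstarted-specialpage-impression"},
        {"user": 1, "action": "gettingstarted-specialpage-click"},
        {"user": 2, "action": "gettingstarted-specialpage-impression"},
    ],
    "dewiki": [
        {"user": 9, "action": "redirect-invite-impression"},
    ],
}
report = cross_wiki_report(sample)
```

The point of factoring the tally into `funnel_counts` is that adding a newly GS-enabled wiki only means adding one more entry to the input, not new analysis code.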

Checking with DarTar on that strategy now. --Halfak (WMF) (talk) 18:58, 12 February 2014 (UTC)



Like we discussed on IRC, using subcohorts creates confounds that make it harder to attribute any observed effect. However, I would like to observe basic edit rates (at least) for the following sub-cohorts. Explanation is inline.

1. control users with gettingstarted-specialpage-impression vs. test users with redirect-invite-impression.

Looking at basic funnel analysis, it seems that while 99% of control users got an impression, only 89–90% of bucketed test users got a redirect-invite-impression. Aaron, your proportion might be different, since you also filtered out accounts that weren't self-created. These two subcohorts are the most important, I think, and have fewer potential confounds because membership is not conditioned on users taking an action; we just know for sure they saw the UX we intended. I am concerned that by examining total cohorts we are unfairly disadvantaging the test condition, where fewer users got the correct UX due to poorer browser support and so on.

2. control and test users with appropriate click actions (so gettingstarted-specialpage-click in the former, redirect-invite-click in the latter) by funnel (meaning redirect funnel for test, and the usual gettingstarted-* for all)

This will tell us whether activation rates and quality differ by sub-funnel. While some sub-funnels are likely substantially similar (e.g. gettingstarted-copyedit), others are complete unknowns (particularly redirect click users).

3. test users who completed the 'firstedit' guided tour

What proportion of users in the redirect funnel / test condition actually completed the tour? Does the tour lead to more high-quality edits, or does it encourage low quality contributions?
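The sub-cohort comparisons above reduce to two numbers per bucket: the proportion of bucketed users who actually saw the intended UX, and a basic edit rate among those who did. A small sketch of that computation, with hypothetical field names and made-up sample users (the real inputs would be joined from the EventLogging data and enwiki):

```python
# Sketch of the sub-cohort comparison: share of each bucket that saw its
# intended impression, and the mean edit count within that sub-cohort.
# Field names ('events', 'edits') are illustrative, not real schema fields.

def subcohort_stats(users, intended_event):
    """users: dicts with 'events' (set of action names) and 'edits' (int)."""
    saw = [u for u in users if intended_event in u["events"]]
    proportion = len(saw) / len(users) if users else 0.0
    edit_rate = sum(u["edits"] for u in saw) / len(saw) if saw else 0.0
    return proportion, edit_rate

# Hypothetical bucketed users.
control = [
    {"events": {"gettingstarted-specialpage-impression"}, "edits": 2},
    {"events": {"gettingstarted-specialpage-impression"}, "edits": 0},
]
test = [
    {"events": {"redirect-invite-impression"}, "edits": 1},
    {"events": set(), "edits": 3},  # bucketed, but never saw the invite
]

ctrl_prop, ctrl_rate = subcohort_stats(
    control, "gettingstarted-specialpage-impression")
test_prop, test_rate = subcohort_stats(
    test, "redirect-invite-impression")
```

Note how the second test user illustrates the concern above: they are in the total cohort but out of the sub-cohort, so whole-cohort comparisons would dilute the test condition's numbers.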

Does this make sense, Halfak? Steven Walling (WMF) • talk 23:48, 15 October 2013 (UTC)