Title: What MoodBar tells us about new registered users
MoodBar is an experimental tool designed to collect feedback from newly registered users about their first experience editing Wikipedia. Its main purpose is to take the pulse of new editors in real time and address issues they encounter when they first hit the edit button. However, MoodBar also gives us a unique opportunity to measure the activity of these new contributors. While we know that levels of early activity vary widely, our results indicate that contributors who share their mood about their first editing experience represent a particularly active and prolific group of new editors:
- MoodBar users are twice as productive as active users who never sent any feedback.
- Receiving a “helpful” response to MoodBar feedback is associated with an edit count four times higher than that of an average active user who never sent any feedback.
In this short series of posts we present results from studies that use MoodBar data to measure the engagement of newly registered users.
MoodBar usage and early editor activity
Some time has passed since the introduction of MoodBar. The last time we wrote about it was to introduce the Feedback dashboard, a tool that allows community members to respond to issues reported by MoodBar users. In December 2011, we added another small feature, the Mark as “Helpful” extension, which allows MoodBar users to rate the helpfulness of the response they received.
Since last December, 16,810 feedback messages have been posted using MoodBar by 14,568 new registered users. Of these, 5,767 have received at least one response so far, 961 of which have been rated as “Helpful” by the original posters using the Mark-as-Helpful button. Of all the new editor accounts created since August 2011, when we deployed the first version of MoodBar, roughly 2% have posted feedback using this tool.
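The headline counts above also imply response and satisfaction rates, which are easy to derive. A minimal sketch using the figures reported in this post (the variable names are our own):

```python
# Counts reported in the post, December 2011 onward.
feedback_users = 14568   # new users who posted MoodBar feedback
responded      = 5767    # of these, users who received at least one response
marked_helpful = 961     # of these, users who marked a response as helpful

response_rate = responded / feedback_users
helpful_rate  = marked_helpful / responded

print(f"response rate: {response_rate:.1%}")  # 39.6%
print(f"helpful rate:  {helpful_rate:.1%}")   # 16.7%
```

In other words, roughly two in five feedback posters received a reply, and about one in six of those explicitly rated the reply as helpful.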
These numbers allowed us to study whether MoodBar – a feature that is activated when a new user clicks on the edit button for the first time – has any positive impact on new editor engagement. In this post we tackle the following research question: what does MoodBar usage tell us about the behavior of new editors, and in particular, is there any difference in early activity between users who share their mood and those who don't?
What we learned
To answer this question, we analyzed the edit count of newly registered users during their first 30 days of activity. Our sample includes both users who sent feedback using MoodBar and users who did not, even though MoodBar was activated for them. We call this latter group of active users (i.e. new users who clicked the edit button at least once) our Reference group. Among those who did post feedback, we further distinguish three categories:
- Feedback: new users who posted feedback but did not receive a response from the Feedback Dashboard response team;
- Feedback+Response: new users who did receive a response but did not mark it as helpful (either because they did not find it helpful or because they never saw it);
- Feedback+Helpful: new users who did receive a response and marked it as helpful.
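Assuming each user record carries flags for having posted feedback, received a response, and marked it helpful, the four-way grouping above can be sketched as follows (the field names are hypothetical stand-ins for the actual MoodBar log schema):

```python
# Partition newly registered active users into the four analysis groups.
def classify(user):
    if not user["posted_feedback"]:
        return "Reference"            # active, but never sent feedback
    if not user["got_response"]:
        return "Feedback"             # posted feedback, no response received
    if not user["marked_helpful"]:
        return "Feedback+Response"    # got a response, did not mark it helpful
    return "Feedback+Helpful"         # got a response and marked it helpful

# One illustrative user per group:
users = [
    {"posted_feedback": False, "got_response": False, "marked_helpful": False},
    {"posted_feedback": True,  "got_response": False, "marked_helpful": False},
    {"posted_feedback": True,  "got_response": True,  "marked_helpful": False},
    {"posted_feedback": True,  "got_response": True,  "marked_helpful": True},
]
groups = [classify(u) for u in users]
```

Note that the groups are nested by construction: every Feedback+Helpful user is also a response recipient, and every feedback poster is also an active user.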
The results are surprising. Figure 1 plots the average edit count during the first 30 days after MoodBar activation. MoodBar users (green bars) do indeed contribute more to Wikipedia than the reference group of non-MoodBar users (orange bars). However, users who received a response but did not mark it as “helpful” (Feedback+Response) seem to contribute slightly less than those who posted feedback and never received a response. This seems counter-intuitive: even if a large fraction of users in the former group never saw the reply, and hence did not know that somebody had responded to their call for help, how can receiving a response be associated with lower productivity than receiving no response at all?
To shed some light on these preliminary findings we ran a more controlled series of tests.
We performed a regression analysis of the edit count of users in our sample over their first 30 days of activity to control for possible external factors that we were not taking into account. The main results of the analysis are the following:
- MoodBar users who received a response not marked as helpful (Feedback+Response) are indeed slightly less productive (-6%) than those who only posted feedback but did not receive a response (Feedback).
- MoodBar users who received a helpful response (Feedback+Helpful) are 41% more productive than MoodBar users who received no response (Feedback), and four times as productive as non-MoodBar users (Reference).
- Different types of reported mood are associated with different levels of productivity, but in a very limited way:
- For all groups, reporting a “confused” mood is associated with a 4% decrease in productivity compared to reporting a “happy” mood.
- For users in the Feedback+Helpful group reporting a “sad” mood is associated with a 16% lower edit count than reporting any other mood.
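If the regression was fit on log-transformed edit counts (an assumption on our part; the post does not specify the model), each coefficient translates into a percentage effect on the expected edit count via exp(β) − 1. A minimal sketch, with coefficient values chosen purely to illustrate how the reported −6% and +41% effects would arise:

```python
import math

def pct_effect(beta):
    """Convert a log-scale regression coefficient into a percentage
    change in the expected edit count: 100 * (exp(beta) - 1)."""
    return (math.exp(beta) - 1) * 100

# Hypothetical coefficients reproducing the reported effects:
print(round(pct_effect(-0.062)))  # -6  (Feedback+Response vs. Feedback)
print(round(pct_effect(0.344)))   # 41  (Feedback+Helpful vs. Feedback)
```

This is why log-scale coefficients read naturally as multiplicative effects: a coefficient of −0.062 shrinks the expected count by about 6%, while 0.344 inflates it by about 41%.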
Identifying natural-born Wikipedians
These results are very encouraging for both the MoodBar team and the community members who use the Feedback Dashboard and volunteer in the Response Team – to whom we are grateful for their invaluable work in welcoming and responding to newly registered users. However, it is natural to ask what causes the effects we observed. Are natural-born Wikipedians particularly good at posting feedback and receiving helpful responses, or are good feedback and helpful responses what really makes for a good Wikipedian? In other words: is MoodBar increasing the productivity of new editors, or just helping identify the most productive ones?
We are currently running a controlled experiment to answer this question. We collected data on a sample of newly registered editors who did not have MoodBar enabled by default on their accounts, and we intend to compare them with a group of users who had MoodBar enabled as usual. At the same time, we issued a call to action to the response team in order to clear the backlog of unanswered feedback. We hope that this will give us enough data to perform a statistically significant comparison and understand whether MoodBar has a genuine effect on new users or is simply a good detector of natural-born Wikipedians.