Research talk:Gender micro-survey

From Meta, a Wikimedia project coordination wiki

Boy does this make me feel uncomfortable

Yes, I understand what you're trying to do here. But I will tell you that popping this question is always going to leave editors wondering "What happens differently depending on my answer?" Anyone who's even slightly web-savvy will *expect* that the next screen they see will be dependent on the answer to the question. I'm not terribly in favour of your asking this question in the first place, but asking it in this way may very well stop people from doing what they're doing already. Risker (talk) 01:13, 2 July 2013 (UTC)

Right. Not saying what the question is for would be dishonest. "For research purposes" means little; that's what every marketing company would say. However, there is a "learn more" link: where does it point? (Not to this page, I assume and hope: only 12 pageviews yesterday.) --Nemo 05:28, 2 July 2013 (UTC)
Regarding "What happens differently depending on my answer?": before the three options, there is a disclaimer that reads "This information is for research purposes only. (learn more)". The learn more link leads to en:Wikipedia:Survey disclaimer which clearly describes how the information will be used. Also, this is why we explicitly included a "prefer not to say" option as one of the three. Steven Walling (WMF) • talk 05:35, 2 July 2013 (UTC)
Steven, the "survey disclaimer" says: "By participating in a survey, you consent to the transfer of the information you submit to the Wikimedia Foundation in the United States and elsewhere. The information collected is for internal research purposes only and will not be publicly associated with your account. However, we may share the results of this survey with the public in anonymized or aggregated forms." It does not preclude any decision-making by the survey software. In fact, that's pretty much the standard survey disclaimer used regardless of what happens as a result of the completion of various answers. Incidentally, have you passed that through Legal? Risker (talk) 05:39, 2 July 2013 (UTC)
Yes, we consulted with legal. For future reference, there is a guide on our office wiki, which everyone is aware of, covering legal survey requirements, disclaimers, and other procedures. Steven Walling (WMF) • talk 05:43, 2 July 2013 (UTC)


It's amusing that we're using GuidedTour and EventLogging to do completely different things. Note, however, that this usage changes the meaning of GuidedTour, which should therefore be renamed for clarity; otherwise the community (and not only the community) loses track of what's happening. Remember that ArticleFeedback failed partly because of poor understanding of its purposes, and partly because of its unclear name. --Nemo 05:28, 2 July 2013 (UTC)

I don't really think a name change for GuidedTour makes sense at this point. Nearly all the functionality is still custom made for multi-step tours, pointing out particular features. This was simply the most efficient way to run this single step tour at the last minute. Steven Walling (WMF) • talk 05:36, 2 July 2013 (UTC)
I guess the problem here is that it is not a tour, it is a survey. One that prevents users from actually using the guided tour until they either respond or dismiss it. Risker (talk) 05:43, 2 July 2013 (UTC)
"One that prevents users from actually using the guided tour until they either respond or dismiss it." Your language is again confused here. How can they be prevented from using a tour if it's not a tour? In any case, the survey is totally optional. As noted in the documentation here and the regular guided tours documentation, users have a variety of methods for closing the tour permanently, including the X button, hitting ESC, clicking anywhere outside the tour message, or completing the tour. There is literally no other individual interface element on the site with more affordances for getting rid of it. Steven Walling (WMF) • talk 05:46, 2 July 2013 (UTC)
Steven, please pay attention. When one creates a new account, NORMALLY the first thing they get directed to is a page that offers guided tours. Now they get directed to that page, but there's a survey on top of it, and they can't take the tours until they either answer the survey or dismiss it. The fact that the software behind the survey is called "GuidedTour" just means that it's another piece of software either mislabeled with common English words that don't describe what it does, or software hijacked to do something it's not intended to do. Risker (talk) 05:54, 2 July 2013 (UTC)
"NORMALLY the first thing they get directed to is a page that offers guided tours." Sorry, but that's incorrect. All these users are automatically directed to Special:GettingStarted post-registration, which does not mention or include a guided tour unless they continue through by selecting a task. Even then, users are not exposed to the name "guided tour" as some distinct part of the process. There isn't really any room for confusion, unless you're the kind of editor who delves around WMF product pages and research documentation. Steven Walling (WMF) • talk 06:01, 2 July 2013 (UTC)
Having just registered two new socks (they're identifiable) to see this myself, the first page (which got covered over by the survey) offered me three things to do, which all led to what a normal person would call guided tours. Surveys aren't guided tours. This particular one may be utilizing the GuidedTour software, but it's not a guided tour. And geez, thanks for the implication that my opinion isn't worth considering. Risker (talk) 07:00, 2 July 2013 (UTC)

Metrics: successful registrations

The research seems to lack metrics about how this additional step and hindrance to new editors affects successful registrations, or more specifically the number of new accounts completing an edit within some time range. (This is particularly relevant if we're going to constantly clutter the registration process, which seems likely to happen: «further surveys potentially launched as-needed».) It also doesn't say why it's necessary to give the survey to all users rather than, say, half (or 10 %, or 5 %? statistical experts to the rescue please). --Nemo 05:28, 2 July 2013 (UTC)
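For illustration of Nemo's sampling point, here is a rough sketch of the standard sample-size arithmetic for estimating a proportion. The 20% guess for the share of women among new registrants and the function name are assumptions for the example, not figures from this survey.

```python
import math

def sample_size(p_guess: float, margin: float, z: float = 1.96) -> int:
    """Minimum number of responses needed to estimate a proportion
    near p_guess to within +/- margin, at the confidence level
    implied by z (1.96 ~ 95%), using the normal approximation."""
    return math.ceil(z * z * p_guess * (1 - p_guess) / (margin * margin))

# If roughly 20% of new registrants were women, pinning that share
# down to +/- 2 percentage points needs only about 1,500 responses:
print(sample_size(0.20, 0.02))  # → 1537
```

In other words, a modest random fraction of registrations, rather than all of them, would already give a tight estimate.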

Concur with Nemo here. You don't even have baseline data to tell you how many male/female editors were starting before VE, so you have nothing to compare it to. Risker (talk) 05:35, 2 July 2013 (UTC)
1. "You don't even have baseline data to tell you how many male/female editors were starting before VE." No, we don't unfortunately, but that has nothing to do with edit rates, unless you're assuming men and women have different ones, which we can't assume. 2. Your language is confused here. You say "successful registrations", but this survey is given only after a user registers, so it can't impede registrations. If you're just interested in "the number of new accounts completing an edit within some time range", we can very, very easily track the proportion of accounts that complete an edit within 24 hours. That rate is something we have tracked over many months and years, and which we know intimately. I seriously doubt a one-step, optional tour is going to deter or distract a significant number of people, but I would be happy to measure and publish that edit rate to prove you wrong without a doubt. Steven Walling (WMF) • talk 05:39, 2 July 2013 (UTC)
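The 24-hour edit-rate metric Steven describes can be sketched as follows. The data layout (pairs of registration and first-edit timestamps) and the function name are assumptions for illustration, not the schema EventLogging actually uses.

```python
from datetime import datetime, timedelta

def edit_within_24h_rate(accounts):
    """accounts: iterable of (registered_at, first_edit_at) pairs,
    where first_edit_at is None if the account never edited.
    Returns the fraction of accounts whose first edit came
    within 24 hours of registration."""
    total = hits = 0
    for registered_at, first_edit_at in accounts:
        total += 1
        if first_edit_at is not None and first_edit_at - registered_at <= timedelta(hours=24):
            hits += 1
    return hits / total if total else 0.0

sample = [
    (datetime(2013, 7, 2, 9, 0), datetime(2013, 7, 2, 9, 30)),  # edited quickly
    (datetime(2013, 7, 2, 9, 0), datetime(2013, 7, 4, 12, 0)),  # edited, but late
    (datetime(2013, 7, 2, 9, 0), None),                         # never edited
]
print(round(edit_within_24h_rate(sample), 2))  # → 0.33
```

Tracked daily, a rate like this before and during the survey would show whether the extra screen deters new editors.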
Nobody cares if a new account makes a single edit in a specific period; what they care about is whether they make multiple edits over multiple editing sessions over time. (Well, that's not true. It seems to be an incredibly important metric for those WMF projects that need to demonstrate that they've produced new editors.) A survey is NOT a tour, even though you're using the software intended to be used for tours. And a survey on how many new editors are male or female, linked directly and intentionally to the introduction of new software, cannot be used to show any level of change related to that software absent the existence of baseline data. It's actually quite disturbing that this survey would take place right now. I can tell you that if I am asked such a question on registration to a new website, I log off and never return. Risker (talk) 05:51, 2 July 2013 (UTC)
Risker: the documentation explicitly says that the point of the first survey is to establish a baseline. Steven Walling (WMF) • talk 05:57, 2 July 2013 (UTC)
Yep. It also gives a "related research" link to the research on VE. That's pretty clearly linking VE with this. What I'm not seeing is why you didn't do this, say, mid-May, and then again now. As it is, I'm expecting that somewhere down the line we're going to start hearing something like "look, 20% of registrations are female once we went to VE, therefore VE attracts more female editors". In any case, my bigger concern is that this survey is standing between registration and the guided tours, which seem to be at least somewhat successful in getting somewhere on user interaction. How about putting it on their talk page instead? You still will probably repel some people, but at least new users will see an educational, supportive first screen before they're asked personal questions. Risker (talk) 06:08, 2 July 2013 (UTC)
"What I'm not seeing is why you didn't do this, say, mid-May, and then again now." To be totally honest, it's because we didn't plan that far in advance. For context: I don't work on VE research normally, it's totally outside my scope, and I didn't decide to run this. My boss asked me to run this quick survey late last week. We actually tried to get it out a few days before the VE split test was done, because that would have been a perfect way to get two comparable groups. In any case, the Foundation has strangely never run any demographic survey of new registrations that I am aware of, so even if we have to run some followup survey or split test + survey to make a claim about the impact of VE in particular, I think this is useful to have as a basis for conversation about gender on English Wikipedia. There is also the opportunity to run a similar survey on other projects before VE is launched there, to establish a pre-VE baseline as you suggested. Steven Walling (WMF) • talk 06:17, 2 July 2013 (UTC)
Okay. Now, I'm not kidding about moving that survey. Decide what face Wikipedia should be presenting to new editors (one that wants them to participate, or one that asks them personal questions before telling them anything else). Myself, I'd go with showing them how to participate before worrying about what gender they are (and yes, after all these years working with Wikimedians, my first thought was "where's the 'OTHER' option?"). Their talk page should work fine. Risker (talk) 06:22, 2 July 2013 (UTC)

Talk pages are certifiably horrid for surveys of people who aren't already very active editors. The response rates are abysmally low. This survey is just one question, is very easily dismissable, and certainly is important enough to spend one week running, considering we know almost nothing reliable about the gender distribution of people who register accounts. Steven Walling (WMF) • talk 06:34, 2 July 2013 (UTC)

Well, I didn't think it was all that easily dismissable. The dismiss control is faint grey and about 1/4 the size of what one normally sees in dismissable surveys, which most frequently use a BIG white X in a black or grey circle. And I really hate that new users will walk away with a first impression that Wikipedia thinks their gender is more important than anything else that they bring to the table. I'd have walked away, myself. Risker (talk) 06:40, 2 July 2013 (UTC)
Steven, for me a "successful registration" is one which produces the result the user expected, which might have been e.g. being able to set a preference or create a page; of course it's hard to measure what the user wanted, but it's 100 % sure they didn't register to take part in a survey ;).
I didn't expect this talk page to explode like this; I think it makes sense to conduct one such survey, as it happens to be technically so easy, but we must be sure to keep track of what we want to achieve and what effects our actions have. There still isn't any indication of why the survey should be forced on all new accounts, nor any harm reduction/monitoring technique. --Nemo 15:43, 2 July 2013 (UTC)

Boy, does this make me uncomfortable, part 2

I'm uncomfortable too, though for different reasons than Risker. (Despite this, I'm glad to see micro-surveys underway! And glad that gender is getting more attention. Thank you.)

  • There are more than 2 genders. You might at least include an "other" option. See for instance: Genderbread
  • Please don't run surveys for "all users who do X" for any X. This throws away an easy control group. Asking this question of a randomly chosen 10% of all newly registered users during the same period, for instance, would let you compare other metrics across those users, based on whether or not they got the survey.
  • Please estimate and try to measure the cost of each survey or engagement. The best surveys should have negative cost: they increase the participants' interest in the project and their engagement. Over time we should learn how to make almost any survey, banner, or process have negative cost. But we can't learn how to do this if we don't start measuring that and learning from the results.
  • Please don't run surveys on only the English Wikipedia. I'm an English Wikipedian, and it makes me uncomfortable; I can only guess how it makes the members of other wiki communities feel. While there are interesting studies conducted across many languages and projects, there is still a systematic bias (over-focusing on the English language and on Wikipedia) in the research methods that we endorse and directly apply WMF resources to. This limits our understanding of the projects, and in particular keeps us from understanding the highly relevant differences across language and wiki communities. It seems to me that comparing sibling wikis is our most immediate way to evaluate the impact of interventions -- but we can only do that if we are studying (all of our) baselines on multiple projects. SJ talk  10:50, 3 July 2013 (UTC)
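The control-group suggestion above can be sketched with deterministic hash-based bucketing; the salt string, function name, and the 10% figure are illustrative assumptions, not something GuidedTour or EventLogging actually implements.

```python
import hashlib

def in_survey_sample(user_id: int, percent: int = 10, salt: str = "gender-survey-2013") -> bool:
    """Deterministically assign a user to the surveyed group.
    Hashing (salt + user id), rather than taking user_id % percent,
    avoids correlations with registration order; the salt keeps
    assignments independent across experiments."""
    digest = hashlib.sha256(f"{salt}:{user_id}".encode()).hexdigest()
    return int(digest, 16) % 100 < percent

# Roughly 10% of any batch of users falls into the sample:
shown = sum(in_survey_sample(uid) for uid in range(10000))
print(shown)  # close to 1,000 of 10,000
```

Everyone outside the sample becomes a ready-made control group for comparing retention, edit rates, and other metrics.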

A note re: translation -- Adding a translation step to small surveys like this one should be no big deal. And if it's still a Deal today, it is worth investing in that translation toolchain. That's a strength which we need for so many things: we should improve it every time the issue comes up. Cheers, SJ talk  21:10, 3 July 2013 (UTC)

Thanks for the comments. I think some of them are easy to consider for next time, like adding "Other", while frankly other areas of this we just needed to make a call (e.g. the tradeoffs of running a survey directed at new accounts via another method). Regarding translations: the advantage of using guided tours is that the tour is automatically sent out to TranslateWiki. As l10n happens, there's nothing stopping us from running the same survey on other wikis, including potentially pre/post VE launch like we originally wanted to for enwiki. I know that James F. was excited about the idea. Steven Walling (WMF) • talk 21:34, 3 July 2013 (UTC)

Survey 2 length

From 11 to 31 looks more like 3 than 2 weeks. Or didn't they all start and end together? Worth clarifying.[1] --Nemo 14:41, 5 August 2013 (UTC)
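Assuming the dates in question are 11 to 31 July 2013 (the comment doesn't name the month, so this is a guess), the arithmetic supports Nemo's point:

```python
from datetime import date

# An 11th-to-31st run spans 20 days, i.e. nearly three weeks.
span = date(2013, 7, 31) - date(2013, 7, 11)
print(span.days, "days, about", round(span.days / 7, 1), "weeks")  # → 20 days, about 2.9 weeks
```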

Stereotype threat

I didn't check the sources, but: «Even before the first dot-com bubble burst, Stanford psychologist Claude Steele showed that a stereotype threat can bias performance by evoking negative stereotypes. For instance, when women are reminded of the stereotype that men are better at math - even in extremely subtle ways, such as checking a gender box at the beginning of an exam - their performance measurably declines.» [2] --Nemo 11:17, 20 July 2014 (UTC)