Grants talk:IEG/Studying content interest and editor engagement factors with new editors


This project has not been selected for an Individual Engagement Grant at this time.

We love that you took the chance to creatively improve the Wikimedia movement. The committee has reviewed this proposal and not recommended it for funding, but we hope you'll continue to engage in the program. Please drop by the IdeaLab to share and refine future ideas!

Comments regarding this decision:
We appreciate the idea for an action-research project - hope to see more from you in the future!

Next steps:

  1. Review the feedback provided on your proposal and ask for any clarifications you need using this talk page.
  2. Visit the IdeaLab to continue developing this idea and share any new ideas you may have.
  3. To reapply with this project in the future, please make updates based on the feedback provided in this round before resubmitting it for review in a new round.
  4. Check the schedule for the next open call to submit proposals - we look forward to helping you apply for a grant in a future round.
Questions? Contact us.


Aggregated feedback from the committee for Studying content interest and editor engagement factors with new editors

Scoring criteria (see the rubric for background); each criterion is scored from 1 (weakest) to 5 (strongest).

Potential for impact
  (A) The project fits with the Wikimedia movement's strategic priorities. Score: 4
  (B) The project has the potential to lead to significant online impact. Score: 3
  (C) The impact of the project can be sustained after the grant ends. Score: 3
  (D) The project has potential to be scaled or adapted for other languages or projects. Score: 3
Ability to execute
  (E) The project has demonstrated interest from a community it aims to serve. Score: 2
  (F) The project can be completed as scoped within 6 months with the requested funds. Score: 4
  (G) The budget is reasonable and an efficient use of funds. Score: 3
  (H) The individual(s) proposing the project have the required skills and experience needed to complete it. Score: 4
Fostering innovation and learning
  (I) The project has innovative potential to add new strategies and knowledge for solving important issues in the movement. Score: 2
  (J) The risk involved in the project's size and approach is appropriately balanced with its potential gain in terms of impact. Score: 3
  (K) The proposed measures of success are useful for evaluating whether or not the project was successful. Score: 3
  (L) The project supports or grows the diversity of the Wikimedia movement. Score: 4
Comments from the committee:
  • We appreciate the interest in this subject as it aligns with strategic priorities.
  • The researcher seems well-qualified and we expect he would have some interesting findings to share.
  • The study may have significant redundancy with what has already been learned about editor engagement or what is already being worked on in teams like WMF's Editor Engagement Experiments.
  • The budget is very well broken down, but the sample size appears small for the funding requested.
  • We are unsure whether what is learned from this project could easily be applied to Wikipedians across geographies and languages.
  • Measures for success do not appear to be sufficiently concrete for a scientific study.
  • We would prefer to see more specific hypotheses that can be turned into actionable recommendations when verified, so that the project aims for more direct impact.

Round 1 deadline 15 February 2013

We're not sure from your budget breakdown if you have completed all 3 parts of your proposal. If it is ready for consideration in this round, please submit it for review by updating the wikimarkup in your page from status=DRAFT to status=PROPOSED. If you have questions or need more help submitting your proposal, you can ask for it at IdeaLab Help. Thanks! Siko (WMF) (talk) 20:36, 15 February 2013 (UTC)

Thanks for the info! I don't know what happened, but I had just posted a previous version, still marked "draft" when it should have been "proposed". Now it's OK. Looking forward to feedback :) --Marcmiquel (talk) 22:05, 16 February 2013 (UTC)


Merchandise

Why is there "wikimedia merchandise for volunteer giveaways for 40 editors" in the budget? Does this mean you are looking at the effect of giving away Wikimedia merchandise on editor retention? Doc James (talk · contribs · email) 01:24, 23 February 2013 (UTC)

Dear Doc James, I want to avoid a bias like this by all means. I've read some research about the use of wikis in learning environments, and there is one specific study which found that students did not use the wiki when there was no reward (http://content.imamu.edu.sa/Scholars/it/VisualBasic/Fulltext.pdf). So I think it is not a good idea to create such an effect. I want to give merchandise for the initial interview/survey, as is done in most usability studies. Then every user will decide their own activity depending on their personal situation, and I will not propose any goal or reward tied to it.--Marcmiquel (talk) 13:16, 25 February 2013 (UTC)

Editor engagement

Hi, thanks for this proposal. I am wondering to what extent you intend to focus your study on specific engagement factors related to things that are a) not yet well understood and b) relatively easy to change. What I sometimes see from engagement studies are learnings like "wikimarkup is difficult to edit" and "the talk page system for communicating is not as good as the messaging systems on other websites." I find this frustrating because we already know these are problems, and the solutions to them require large technical changes that are being worked on but won't be completed for some time still...which means these studies don't tell us something that is new and directly actionable. In the case of your project, it sounds like you've got something more specific in mind for your study, by focusing on topic discovery or suggestion, correct? In terms of engagement being related to topics of interest - have you looked at research on WikiProjects or SuggestBot (which offers topic-based article suggestions to editors on English Wikipedia; not sure if Catalan has something similar) already? If you are planning to manually offer and detect topics of interest for your study participants offline, do you have any thoughts on how learnings might scale to an automated or online context? Siko (WMF) (talk) 05:38, 23 February 2013 (UTC)

Hi Siko. Usability is a strong part of user engagement, and I don't want to rediscover the problems already noted in the Usability Initiative from three years ago or in other research. Actually, you are right, and I think it would be better to focus on interests at the beginning of the relationship between a new editor and Wikipedia. My hypothesis is that if we help establish this connection well and the user finds it pleasant, it will continue. Manual detection of topics would happen through interviews and a survey. After a while, we would be able to see from the metrics whether there is a correspondence between what participants reported and their contributions to the encyclopedia. I have thought about how this could scale in an automated way in the online context. Probably the best option would be a MediaWiki extension that lets you navigate through the topics detected in the answers to a survey you have previously completed. I did not know the bot you suggested; it seems useful, although it works from previous edits, while I think it would be better to work in the direction of the survey. Activities like "Wiki Loves Monuments" have shown that some users feel really enthusiastic about their local heritage, while others feel more attached to other things. In the end, I would get involved in proposing a real extension - I would provide UI designs and task analysis for someone else to carry on with the project. Thanks a lot for the comments! I am excited about how we can plan it. :)--Marcmiquel (talk) 13:39, 25 February 2013 (UTC)
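To make the interest/contribution "correspondence" check discussed above concrete, here is a minimal sketch of how the overlap between a participant's declared topics of interest and the topics of the articles they actually edited could be quantified. All names and data in it are hypothetical placeholders, not part of the proposal; in practice the edited-article topics would have to be derived from something like article categories or WikiProject tags.

```python
# Hypothetical sketch: compare survey-declared interests with edit-history topics.
# declared_interests maps each study participant to the topics named in the survey;
# edited_topics maps each participant to the topics of the articles they edited
# during the study period. Both mappings are made-up example data.

def jaccard(a, b):
    """Overlap between two topic sets (0 = disjoint, 1 = identical)."""
    a, b = set(a), set(b)
    return len(a & b) / len(a | b) if (a or b) else 0.0

declared_interests = {
    "editor_01": {"local history", "architecture"},
    "editor_02": {"biology", "football"},
}

edited_topics = {
    "editor_01": {"architecture", "painting"},
    "editor_02": {"football"},
}

for editor, interests in declared_interests.items():
    score = jaccard(interests, edited_topics.get(editor, set()))
    print(f"{editor}: interest/contribution overlap = {score:.2f}")
```

With real study data, the per-participant overlap scores could then be tracked over time to see whether a stronger match between declared interests and edited topics coincides with longer editor retention.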