Learning and Evaluation/Archive/Connect/Questions/Archives/2013-09
|This is an archive of past discussions. Do not edit the contents of this page. If you wish to start a new discussion or revive an old one, please do so on the current talk page.|
Where should I report my activities as a WiR funded by my organization internally
Hi, I'm Dorothy. I'm the WiR at the Metropolitan New York Library Council, a consortium of ~150 libraries and archives in New York City. My position was initially 2 months, but has recently been extended to 5 months. My position is not funded by WMF, but I want to make sure that I am reporting the progress of my work to the appropriate people and generating information as I go, so that others can learn from my experience and so that I can generally feel more connected to the activities of other WiRs. My work as WiR has focused on engaging with libraries, primarily museum libraries in New York: consulting with them about possible Wikipedia projects and events, introducing the GLAM-Wiki movement, and discussing how they can incorporate Wikipedia into their daily workflow. I have created a GLAM page for us, where I have been posting updates about our activities; see it here: Wikipedia:GLAM/Metropolitan New York Library Council. I have also created an account with notes from all of the meetings I've had. I have looked at the Edit-a-Thon logic model, and I will use it in the future. But I'm mainly interested in finding out whether there are any particular platforms or WMF people I should be engaging with as a WiR to share information about the trainings and events I have held. Wikipedia:User:OR drohowa 17:06, 5 September 2013 (UTC)
- Hi Dorothy, welcome! That's a great project; congratulations on your new role, and it's wonderful that it has been extended. It's great that you want to report on your program, even though it's not funded by WMF. Reporting is really important: it can serve as a teaching tool for other Wikipedians in Residence and the staff who manage them, and it can also engage and inspire other organizations to consider a similar role at their own institution. I'm glad the edit-a-thon Logic Model was useful. We're in the process of developing further resources on how to evaluate edit-a-thons, which will be available here in the future. There really aren't any WMF people it would be appropriate to ping about your work unless you're seeking funding or analytics input; the best places are generally the GLAMWiki.org community and the US GLAM-Wiki community. Their mailing lists are generally the best venues for sharing experiences, and the main GLAMWiki page has a place for case studies and models. And we're always happy here to do our best to help you with evaluating your activities. WMF basically placed GLAM into the community's hands last year, so it's more community focused, and no staff time is being devoted to supporting it; hence my suggesting those two community groups. You can also post your efforts and updates in the This Month in GLAM newsletter. Oh, and if you are interested in writing a post for the Wikimedia Foundation blog about your work, we totally support that. Hope this helps some! SarahStierch (talk) 17:58, 10 September 2013 (UTC)
How and when are programs evaluated against strategy?
How should programs - while in design, and while in operation - be measured against strategic goals? Are there good models for continuous comparison or alignment, feedback loops that also include revision of strategies based on experiences, and sharing this evolution as it happens? –SJ talk 02:07, 14 September 2013 (UTC)
- Hi SJ, are you referring to the strategic goals of the Foundation or are you referring to the strategic goals of individual movement entities (chapters, thematic organizations, etc.)? Or both? --Frank Schulenburg (talk) 17:25, 16 September 2013 (UTC)
We have suggested using a logic model as a general mapping of one's program theory (i.e., the anticipated cause-and-effect chain from outcomes to strategic goals and objective targets) and as a basic road map for guiding program monitoring and evaluation planning (see here). However, the logic model is neither an evaluation plan nor a project management plan. There are several program/project management models for continuous implementation monitoring, many of which come from a standard business perspective, while others come from more of a social, systems-change perspective. There is actually a lengthy article on en.wikipedia, and you may also find some helpful resources on our resources page; however, those links are focused on program evaluation rather than management. Most important is that the project management model fit the ethos of the program and organization, and that there is a clear plan, along with frequent and regular review, to prevent things like scope creep or failure to document major changes in implementation. As we tried to emphasize at the Budapest workshop, programming, and its evaluation, is an iterative process, and shifts and changes are often a requisite to success in a real-world setting. JAnstee (WMF) (talk) 17:57, 16 September 2013 (UTC)
Wiki Academy survey
I have tried to do a survey on wiki academy. The link to the survey site is here. But PLEASE DO NOT respond to this survey as it will skew the data of people who are the prime target of the survey. I am sharing this to get suggestions from others on how this can be improved. TS-Sowmyan (talk) 04:14, 25 September 2013 (UTC)
- Hi TS! I'm Sarah, the Community Coordinator for Program Evaluation & Design. I'm not a survey specialist, but I have done surveys in the past as a community and staff member, and I do have some feedback for you. I also asked Jaime, the Program Evaluation Specialist, who is very experienced with surveys, to provide you with feedback. Here are some suggestions:
- Add an introduction. There is no lead at the top of the survey explaining why I'm there, what I'm taking the survey about, etc. It's always important to have a simple, welcoming introduction, with any resources the survey respondents might need in order to respond (i.e. a link to the event they attended).
- There are some lower-case "Wikimedia"s throughout the survey; I'd fix that for consistency.
- In Question #2, one of the options is "Train editors." I'd reword it to say "Train Wikipedia editors" or "Train people to contribute to Wikimedia projects." It's always important to be detailed and specific for clarity.
- Question #3, "over view" is misspelled, there is no space, it's "overview."
- Question #3: instead of "basics of editing," I suggest something more like "The basics of how to edit Wikipedia" or "The basics of how to make edits on projects like Wikipedia."
- Question #3: it says "Understanding of where to find more information." If that is the name of the presentation/workshop section, ignore my suggestion; otherwise, I'd perhaps say something like "Understanding where to find help and more information about Wikimedia projects like Wikipedia."
- Question #5 has a response saying "Not able to get." I suggest "I did not receive help." I would also suggest adding an option for someone who did not ask for or need help, something like "I did not need or ask for help." Not everyone needed help, perhaps.
- Question #6: "During the training, did you learn to create input in an Indian language?" I don't understand what "create input" is at first glance. If you want to know if people learned how to contribute in an Indian language, I'd say something like "During the training, did you contribute in an Indian language to Wikipedia or related projects?" The use of the word "input" and "data" makes it sound like they're learning a data entry course, not how to contribute to Wikimedia projects. I'd perhaps change that language, unless you used "input" and "data" throughout the workshop and you assume that they know that's what you're referring too. I've never used those terms during workshops, so it just sounds funny to me :)
- Question #8: I'd remove "Seriously." It makes it sound like you're serious if you're editing an article about chemistry but perhaps not if you're editing an article about Pokemon, and the word could come across as a bit judgmental of someone's editing habits.
- Question #9 asks what "you have practiced," but some of the responses are written in a different tense. I'd change things to say "creating a user page" instead of "created," "carrying" instead of "carried," etc., or change the question to say "you have been practicing since the event."
- I would, for fun, add an optional question gathering people's usernames and a way to contact them for further information as needed. I'd also add a free-text question for gathering general feedback ("Please share any additional thoughts about the workshop or suggestions.")
- Also, it's missing a wrap-up, unless that happens after you submit the survey (like Google surveys do). If so, make sure there is a call to action for people and a thank you :)
- I do have a final question. Did you do a pre-survey before the event started, to find out whether participants were confused about WikiLeaks being part of Wikimedia, or thought Wikimedia India charged people to write articles, etc.? Doing a pre-survey and a post-survey together can be a great way to build a future event around what actually concerns and confuses participants, so you don't spend time lecturing about WikiLeaks not being a Wikimedia project if 90% of the attendees already know it's not. Just a suggestion! Hope this helps! Great work, and I can't wait to see the outputs of the survey. SarahStierch (talk) 16:42, 25 September 2013 (UTC)
- Thanks Sarah. When you pointed them out, I could see every one of these problems. I guess they did not occur to me because I am a non-native speaker and have not taken the trouble to learn to write better. I have promptly joined a Coursera course to improve my writing skills! The emails carrying the link usually contain the introduction; without that, respondents may not even click on the link to access the survey website. I agree that including an introduction and a wrap-up would be great. I also like the idea of doing a pre-survey and a post-survey. Thanks for the feedback. I appreciate it.
- I saw the content on Wikimetrics. That looks like the tool I should use. I have always considered using the user ID a violation of privacy; I like the idea of asking respondents' permission to use their user IDs and then tracking those. I got about a 10% response rate for the surveys, and a 4% success rate in a training program producing a Wikipedian. TS-Sowmyan (talk) 17:27, 25 September 2013 (UTC)