Wikimedia monthly activities meetings/Quarterly reviews/Grantmaking/January 2015 -2
Please keep in mind that these minutes are mostly a rough transcript of what was said at the meeting, rather than a source of authoritative information. Consider referring to the presentation slides, blog posts, press releases and other official materials.
Present (in the office): Anasuya Sengupta, Jaime Anstee, Kacie Harold, Maria Cruz, Garfield Byrd, Jake Orlowitz, Haitham Shamaa, Sati Houston, Alex Wang, Jorge Vargas, Carolynne Schloeder, Siko Bouterse, Geoff Brigham, Erik Moeller, Lila Tretikov, Tilman Bayer (taking minutes), Floor Koudijs, Edward Galvez, Anna Koval, Tighe Flanagan, Katy Love, Winifred Olliff, Asaf Bartov; Participating remotely: Alex Stinson, Samir Elsharbaty
Group intro and learnings - 10 minutes
Education - 15 minutes
Annual Plan Grants - 15 minutes
The Wikipedia Library - 15 minutes
Learning & Evaluation: Programs & Impact - 15 minutes
Questions and Discussion - 20 minutes
Group intro and learnings
named the team "Grantmaking/community growth" for now
you saw this slide before
concentric circles, impact via ripple effects
[slide 4] grants by type
Katy: rows add up to 100% this time
[slide 5] diversity breakdown
[slide 6] Overarching goals for Q3 & Q4
Floor: saw this slide before, just as reminder: 3 areas, combination of monetary and non-monetary support
for the first time we have targets, baselines
can't cover all, so decided on focus regions. They are: the Arab world, the Spanish-speaking world (particularly Mexico and Argentina), and Indic languages, specifically Telugu and Malayalam.
two focus areas for reporting
successful (experimental) programs move from left to right area
[slide 9] scale of users, coverage
start measuring, establish baselines for e.g. gender
Jake: (for TWL) we don't know how links are clicked [so need baseline there first]
or: which communities have reference desks
Floor: personal relationships are important; found this to be true in all of our work
Arab education program clear example for creation of quality content
a lot of the core members of the Egypt user group came out of edu program
empower: program leaders will motivate/mentor others
Lila: are we talking about volunteers, or e.g. professors?
Floor, Anasuya: both
Lila: so what do we mentor them on?
Floor: e.g. professor who wants to start program but does not know a lot [about editing and community]
or, other end of the spectrum: Experienced Wikipedians who don't know much about education world
Lila: ok, that's what the Edu Foundation does too, makes total sense
how many professors, Wikimedians do we support?
Floor: we don't have these numbers
Anasuya: we do actually have the data, can estimate it for you
Tighe: typically, our touch points are Wikimedians who are in that space
What we did
focus on four areas
Q2 goals and status
Anasuya: highlight - they surpassed quality content goal in Arabic
Erik: Arab program support still mostly done by us?
Tighe: some is coming from local user group, but still a lot of support from us
(looking at public WEP dashboard)
Erik: so this is a self-reporting tool?
Lila: do you track by school too? in next iteration
Lila: this is good
What we learned
(pie charts: Wikimedia projects/languages, program models)
Lila: only higher ed? no
Floor: 44 different language WPs, Commons becoming bigger
Anasuya: data shows that unlike hypothesis at beginning, different projects [than WP] may work better for some
e.g. Wikisource, to learn wikisyntax first
hard to load VisualEditor [in some regions] because of low bandwidth
Kacie: WMCZ goes back and forth between the two [VE and wikitext editor]
Lila: can you connect with Damon on this?
Lila: do we send material to universities?
they also translate
Lila: in this area, are you also working with Frank [Wiki Edu]?
Anna: they are one of the 70 programs we count here
Anasuya: a lot of the material is based on the foundation that Frank built when he was WMF.
Lila: but also cooperate with them? yes
Lila: just want to make sure there is a process
Anna: work on simpler version of toolkit, also for translation reasons
Kacie: US/Canada cultural context in that material doesn't work well everywhere
Anasuya: e.g. the ambassador role is based on the student teaching assistant role in US/Canadian universities, which doesn't exist in other countries
Lila: what's in it for the professor?
Floor: motivational tool for students
Anna: national pride, in many countries
media literacy for students
care about state of information in their language
e.g. in say, Hindi, one can increase number of [available online] articles about legal concepts from 50 to 80 with just one course
Kacie: also, wiki lets professors watch students work/write very transparently
Lila: still dealing with (academic) hate on Wikipedia?
Kacie, Asaf, Anasuya: it's still a big challenge, different countries are differently progressive
Anasuya: once there is government support (e.g. Israel), big progress
Floor: university level still dominant, but (surprising to me) secondary school level is significant
Lila: what do these numbers mean?
Tighe, Anasuya, Jaime: number of programs (can be several per country), initiatives
Lila: this slide needs to be reworked. At the end of the day, want to know how many people reached
Anasuya: have never done this before
has already shown us surprising things, like amount of secondary level programs
Erik: so do we get numbers on participants in reports?
Anna: some, but not always. counting schools isn't a problem, counting students often is
Lila: and it might often not be worth it - but professors and schools, definitely
Floor: highlighting that almost half of programs are in Global South countries, and one third of program leaders we work with are female.
Geoff: this looks great. How do we motivate female leaders there?
Anasuya: targeting different (types of) schools and subjects
e.g. in Egypt, language schools have much more female professors
(multiple win: also multilingual, can work on arwiki-enwiki translations etc.)
Tighe: can follow up offline with some more information
learning patterns library - a big thing we've been working on
by now 150 patterns in library
24 of those education-related, 18 of them written by Kacie. Next step is inviting community members to participate.
Kacie: goal: should be usable by non-Wikimedians (teachers) too
takeaway: great diversity, people adapt a lot to local conditions
(e.g. where writing encyclopedia articles doesn't work well, contribute to Commons instead)
Lila: why that level of detail for Arab world?
Tighe: We ran that ourselves
Lila: 3% on Arabic WP contributed by education program students, that's high impact
Lila: do we see social issues, negative issues for women contributing to arwiki?
Tighe: not really
Anasuya: some middle east countries have...
Erik: translation ratio?
Erik: have we talked to language engineering about support from Content Translation tool (CX)?
Asaf: I have
Erik: let's talk about that, it can still be a useful tool even for languages where there's no machine translation available (in CX)
Lila: first one is to grow number of programs...?
I recommend to (track specific numbers)
Anasuya: it's a bit of a challenge, because we rely on volunteers and their self-reporting
Anna: some professors don't want their name attached to students' Wikipedia writings
Lila: maybe not get a name, but at least an email address
add a volunteer in a country
Floor: often we are not even directly in contact with that professor - volunteer is
Lila: sure, but consider what happens if that volunteer gets hit by bus
Asaf: often we are one level further removed, e.g. WMCZ might recruit that volunteer, not us
Floor: we help synthesize data from them...
no. 4 ["keep our fingers on the pulse..." is about improving measuring
Lila: you are going in the right direction, but these goals are weak
Erik: talk about tech support
to what extent built inhouse, or by external contractor like Wiki Edu did
Anasuya: problems, e.g. translation support
Lila: need to get clarity on what is needed, and how it is going to be done
Erik: and then be reflected in our planning
Lila: These activities are pretty broad. As you are learning more, I want you to think about possibly focusing and not doing too much at the same time
Katy: background - important to remember where coming from. Think back to 2012: FDC was solution to a political problem
now, FDC pushed everyone (including WMF) on impact. we're also looking closely at global south and gender.
a lot of small movement organizations learned a lot, achieving much better impact.
important: this is not Winifred, Anasuya and myself making decisions, but the volunteer committee. We provide data.
- definitely a lot of progress towards impact and non-monetary support
Lila: I think we achieved or overachieved the goal for the last quarter - thanks for all the hard work!
much more to do
- consultation with community on APG grants and impact - in development
- external benchmarking study
- confirmed that our participatory grantmaking is very innovative and in line with our movement values. No one else does it on this scale.
Key things that changed this round
- large orgs did not get increase, or saw decrease.
- The two largest grant increases went toward supporting organizations with high potential for growing communities in Latin America and Eastern Europe.
earlier: concern that requests and budgets grew more than impact. Now moving out of that red zone
usually, grants get larger over time, and are matched to increasing effectiveness and scaling impact.
- the community members on the FDC are committed to continue on this path of requiring impact, basing decisions on impact and data.
FDC election year coming up.
- outcomes are now reported, though we see lack of consistency of reporting data quarter to quarter.
- we start to see metrics over time, but it's not good enough yet.
- this should improve in a few months, with standardization, consistency and global metrics
Many donors don't allow grantees to choose which metrics to report on, but that's not where we started. The global metrics aren't perfect, but they are a good start, and we hope they will be complemented by stories and data.
Erik: these numbers cover all APG recipients?
Katy: some gaps in reporting, but the culture has shifted towards understanding impact. Not perfect yet, though.
Lila: it's continual tuning, as we do on our side as well
Learning together (visualization thanks to Maria and Jaime)
Interesting to see that organizations that participate in more of these things receive more funding
We're starting to track our inputs and non-monetary support and see how and whether it aligns with funding decisions (and later, impact)
Erik: is this intended to be repeated?
Lila: would be great to also see how well they achieve objectives
What we learned
- organizations are articulating and leveraging their value with institutional partnerships
- We are pivoting to providing non-monetary support too
eager to see upcoming Global Metrics report
Lila: great, very informative
Q3 consultation complete?
for non-monetary support, can we add quantification of how many people/orgs you will touch base with, etc.?
Anasuya, Katy: OK. consultation will be done in Q3 and Q4.
Lila: not clear what net outcome from benchmark study is
Winifred: still need to decide about socialization / communication of that research
Anasuya: proportion of people who use tool that we just built
Lila: so percentage of participation? yes
The Wikipedia Library
Jake: focus on super users
Lila: I like this overview, helps me to understand
Jake: started as IEG. learned to scale, become effective, through mentorship etc.
so this program is example what (a grant project) can be at its best
day to day management of accounts etc. is done by volunteers, who know best what local community needs (e.g. books on Arabic WP, German-language publisher on dewiki)
work ahead about improving capacity
What we said/did
needed to fill gaps, pay debts
now, easy to get new partners and scale
Lila: this is the right kind of targets
as [occasional] editor, I would not have known about program
Jake: super-editors are likely to have seen watchlist notice [where this was advertised]
important ones here are # global branches, # volunteer coordinators
Lila: that's exactly right.
Erik: # of accounts issued is cumulative? yes
The Wikipedia Library’s Research Partners
Lila: you mentioned that many are moving to open content, do we consider putting these on Wikisource?
Jake: that's just free to read [not freely licensed]
Anasuya: which was hard enough
What we learned
Lila: do we have contracts with them?
Jake: we have MoUs with them that Legal reviewed
it's lightweight, and about a pretty direct exchange anyway
Lila: last point about lacking GLAM capacity, does that refer to volunteers?
Jake: no, leverage at WMF
Anasuya: possible GLAM coordinator at WMF
Lila: but we don't have enough capacity, need volunteers for that
Jake: yes, this is about supporting volunteers talking to GLAMs
not looking at full-time capacity
WMF tried that in the past...
Anasuya: it's about facilitating
Asaf: like in the edu program
Geoff: are we leveraging chapters? if not, could we?
Jake: not targeting specifically
yes, could do that
have pitch guides now
Geoff: but chapters don't have complementary programs?
What's next (Q3 & Q4)
Erik: (on objective 4:) you can't commit to build Echo notifications on your own ... ;)
Jake: oops ;)
Learning & Evaluation
What we said
relied a lot on Quarry tool
doing great, on time for most things
many grantees ask: why don't you mine the data more? we don't have time ourselves
- Education programs can have high impact but can also be costly
- WLM and other photo events generate a lot of image content and bring in a good number of new users, with some success at activation/retention
- On-wiki writing contests produce a lot of quality content for low financial investments, but tap active editor time
- GLAM good for content generation, not for large numbers of new editors
- editathons have good impact for the little investment they often are, but haven't seen large scale
- editing workshops so far haven't demonstrated effectiveness at activating/retaining new registered editors
What's next (for evaluation reports)
Next steps for evaluation reports
more data, more outreach
better data, e.g. on bytes added
What we did
What we learned
captured much more data
reached further and deeper to 59 countries
Lila: how do you do that, word of mouth?
Jaime: mined all grant reports and reached out to leaders from those leads; the team is now good at finding things in English and Spanish, and we reached out to those program leaders as well
Asaf: and public emails on lists
Anasuya: also ran a CentralNotice
Jaime: and blog post, Twitter, Facebook, the usual routes
Asaf: so broadcast+mining
data capture and reportability
Quarry helped here
these programs we identified cover a good proportion of the content uploaded to Commons during the same timeline
Anasuya: that's what we want to get at - a sense of how significant (grant-funded activities are) in overall contributions
(Jaime:) participation on L&E portal
What's next (Programs Knowledge, Design & Toolkit Development)
want to revisit your charts with the stars (slide 31) - make sure we measure the right things
Erik: building reports in parallel, make sure they have unified language
Erik: of these three, only Wikimetrics is staffed. no concerns there (but make sure you get what you need)
the others ...
Lila: need to understand 1. how these will increase impact 2. how much it is going to cost (for that, sit down with someone in Engineering)
Erik: also would like to know how your conversations with Wiki Ed go, e.g. about phasing out the Education Program extension
Asaf: you mentioned an internal services group?
Lila: that's totally in ideation phase still
Erik: and realize you have Engineering resources for Grantmaking right now