Wikimedia Foundation Learning and Evaluation team
The Global Data & Insights Team (GD&I) helps monitor the impact of work being done across the movement, and guides other Community Engagement teams in executing effective programs through ongoing analysis, learning, and evaluation support.
Who we are
The Learning and Evaluation team promotes the use of evaluation for learning and improving decision making that will better Wikimedia projects and communities. This includes:
- developing the capacity of internal and affiliate leaders in program evaluation, design, and learning in order to enable effective and efficient programs;
- facilitating the use and development of evaluation for creating shared frameworks, tools, and research;
- providing a hub of learning and evaluation resources where our affiliates, grantees, volunteer program leaders, and the broader Wikimedia community can connect with resources and experienced people and exchange what they have learned; and
- executing and reflectively applying research and evaluation work to inform WMF’s overall community engagement strategy.
- Jaime Anstee
- Jaime joined the Foundation in late April 2013 as the Program Evaluation Specialist, launching the program evaluation and design initiative and tasked with developing program leader knowledge, evaluation resources, and overall capacity for the evaluation and design of volunteer programs. She currently leads the team and works to support the Affiliations Committee with its proactive monitoring of Wikimedia user groups and chapters. Jaime has an interdisciplinary Ph.D. in Social Psychology.
- Dana McCurdy
- Dana has a BA in Race, Class and Gender Studies and a Master of Public Health in Social and Behavioral Health Sciences. She has worked in the nonprofit sector as a program evaluator since 2007, supporting the measurement, learning, and improvement of community health, prevention, and public education programs. Her specialties include evaluation training and capacity building, culturally responsive evaluation, data visualization, and strategic learning. Before working as an evaluator, Dana was a health educator in public schools and universities. She supports Wikimedia community members in learning and applying evaluation practices, and she works to synthesize and report on movement-wide outcomes.
- Dumisani Ndubane
- Dumisani has a Diploma in Electrical Engineering and extensive experience in systems analysis and data pattern recognition. He came to the Wikimedia Foundation with a strong background as a volunteer for his local chapter and as a convener of the African regional meeting. He works at the Foundation to monitor, analyze, and share learning from affiliate and program leaders.
- Rebecca
- Rebecca has a PhD in Sociology and an MHS in International Public Health. Before joining the Foundation in 2019, she taught courses in sociology and conducted mixed-methods research on the U.S. environmental movement while pursuing her PhD. Before that, she worked in household-level injury prevention research. She manages the annual Community Insights Survey and supports L&E's efforts to better understand how the Foundation can effectively support the Wikimedia movement.
- Irene
- Irene supports the grants team's efforts with data analysis and data architecture system upgrades. Talk with her about article quality, editing metrics, knowledge gap tracking, search insights, reader insights, grants metrics, Python and pandas for data analysis, Jupyter Notebooks, and Superset.
What we do
Learning & Evaluation Infrastructure
- Guiding strategy and community engagement in sharing evaluation and program knowledge through learning patterns and online and offline workshops. To this end, we create announcements, maintain a Collaborations Calendar, produce and distribute a quarterly newsletter, and run social media and blog series.
- Reaching out to community leaders and encouraging them to use the available resources.
- Monitoring affiliate and program-based social media, reports, wiki pages, and blogs to learn about activities and opportunities for sharing learning.
- Wikimedia Resource Center: supporting community capacity development and knowledge exchange, including management and maintenance of the entry point and the L&E portal (the learning patterns library, the survey support desk, and other portal resources).
Learning Exchange & Storytelling
- Collecting data for knowledge sharing and storytelling.
- Key messaging and audience engagement through data visualization and infographics.
- Supporting the development of knowledge exchange: connecting people to information on how different programs are run, what could work best in which contexts, and which program styles best serve which goals.
- Coordinating workshop design and content leadership for Learning Days pre-conferences, as well as conference presentations and workshops on learning and evaluation.
- Planning and facilitating training and learning sessions for community listening and engagement strategies.
Evaluation Tools & Resources
- Developing quantitative and qualitative evaluation instruments, metrics, and data approaches (including surveys).
- Collaborating with Program Officers, event coordinators, and grant committee members to review and assess grantee programs and evaluation plans.
- Designing, implementing, and monitoring evaluation strategy, plans, and processes.
- Collecting and analyzing quantitative and qualitative data from different populations.
- Providing internal design and development input on tools for movement organizers.
Evaluation Capacity Development
- Facilitating community and program leader development opportunities for gaining skills in learning and evaluation.
- Capturing and understanding the qualitative and quantitative outcomes/impact of core programs and activities through their reports, and sharing insights back to movement communities.
- Providing evaluation design, survey design, and analysis consultation to Wikimedia program and project leaders inside and outside the Foundation, including Community Engagement Insights survey strategies since 2016.
- Supporting project evaluation consultations, online meetings and learning circles for evaluation data use and methods learning.
- Reviewing affiliate reports and monitoring compliance.
- Providing support for recognition and mentoring.
- Providing staff support to AffCom for meeting facilitation and support for application pipelines.
- Providing community listening support to understand community structures, needs, challenges, strengths, creations, and collaborations.
- Supporting internal collaborations: providing capacity development support to other WMF teams for learning and evaluation practices.
- Participating in task and workflow tracking and management for quarterly reporting; weekly team stand-ups and manager check-ins; bi-monthly department meetings; and monthly cross-team check-ins for program capacity and learning collaborations.
- Supporting Foundation strategic planning and accountability through quarterly and annual planning and reporting of team efforts, as well as participation in WMF overall strategic planning.
Learning and Evaluation team
| Whom to contact about … | L&E contact |
| --- | --- |
| Wikimedia Community Initiatives | Maria |
| Setting targets & impact analysis | Jaime, Dumi |
| Evaluation design | Jaime, Dumi |
| Evaluation tools & metrics | Dana, Dumi |
| Survey design and analysis | Jaime, Dumi, Dana |
| Wikimedia Programs | Dana, Maria, Dumi |
| Workshop organization | Jaime, Maria |
| Data mining and analysis | Jaime, Dumi |
| Community research | Maria, Dumi |
| Communications & storytelling | Maria |
| Innovation processes / design thinking | Maria |