The project focuses on triangulating our best available signal metrics across key engagement and enablement domains. Each input metric is scaled 0-100, the scaled metrics are averaged across triangulation points, and the averages are rescaled 0-100 so that the global percentile rank is 50: scores nearer 100 represent geographies signaling the most engagement in a given domain, and scores nearer 0 the least. All scores are comparative indices; no modeling has been applied beyond averaging and scaling the triangulated signal metrics. The initial domains we have been mapping are:
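The scale-average-rescale pipeline described above can be sketched in a few lines. This is a minimal illustration under assumptions: the scaling is taken to be min-max scaling, the final rescale is a rank-based percentile transform, and all metric names and values below are made up, not the project's actual inputs.

```python
# Sketch of the composite-index construction described above:
# 1) scale each input metric 0-100, 2) average the scaled metrics
# within a domain, 3) rescale the averages to a 0-100 percentile
# rank so the median geography sits at 50.

def scale_0_100(values):
    """Min-max scale a list of numbers onto 0-100."""
    lo, hi = min(values), max(values)
    if hi == lo:
        return [50.0] * len(values)
    return [100.0 * (v - lo) / (hi - lo) for v in values]

def percentile_rank_0_100(values):
    """Rank-based rescale: the median geography scores 50."""
    n = len(values)
    if n == 1:
        return [50.0]
    order = sorted(range(n), key=lambda i: values[i])
    ranks = [0.0] * n
    for pos, i in enumerate(order):
        ranks[i] = 100.0 * pos / (n - 1)
    return ranks

def domain_index(metric_columns):
    """metric_columns: one per-geography value list per metric."""
    scaled = [scale_0_100(col) for col in metric_columns]
    averaged = [sum(vals) / len(vals) for vals in zip(*scaled)]
    return percentile_rank_0_100(averaged)

# Hypothetical example: two Readers metrics over five geographies.
pageviews = [120, 40, 300, 80, 10]
devices = [50, 20, 90, 60, 5]
print(domain_index([pageviews, devices]))  # → [75.0, 25.0, 100.0, 50.0, 0.0]
```

The rank-based final step guarantees the stated property that the global percentile rank is 50, at the cost of discarding the magnitude of gaps between geographies.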
In addition to the main domain-level metrics, for the domains where the data are robust enough (Readers, Editors, Grants, and Affiliates) we will also share sub-metrics along two facets:
Question 2. Looking at the landscape of existing data, are there input measures that could be improved, or measures that should not be part of the measurement framework for Wikimedia presence & growth?
Click here to review the input data to the metrics
Click here to review the underlying metrics
Domain: Measures
Readers: Average monthly unique devices; Average monthly pageviews
Editors: Average monthly editors; Average monthly active editors
Programs: Count of Education events; Count of GLAM events; Self-reported capacities
Grants: Annual grants (FY); Historic grants; Growth in grants
Affiliates: Highest governance type; Affiliate grants; Affiliate count, size, and tenure; Count of new recognitions; Count of organizing hubs engaged
Population: World population; Population growth
Access: Population accessing the internet; GSMA Mobile Connectivity Score; Access to Basic Knowledge; Access to Information & Communications
Freedom: Freedom Index; World Press Freedom Index; Control of Corruption Score
Question 5. Do you have any other thoughts you would like to share with us about the Wikimedia presence & growth measurement framework or the planned dashboard elements?