Monitoring Product Trust and Availability
Direct the Foundation or another movement body to review and regularly report on public trust in, and access to, Wikimedia, and to identify whether, and how, recommended or planned changes to policies, technology choices, product experience and marketing present potential risks to that trust, and what impact on traffic and resources would result. For example, what large-scale political or social risks face Wikipedia as a vital consumer internet product, central to the free information ecosystem in much of the Global North? If we are to grow into the essential infrastructure of free knowledge, we will need the traffic to Wikimedia’s websites as a base, attracting resources, volunteers and influence. As we move to take on existing structures of power, those structures are likely to use technical means, such as censorship and regional or national firewalls, to undermine our efforts. To understand, communicate and recommend ways to handle these potential impacts on our underlying internet-traffic-based model, a new capability is needed at the movement level.
Wikimedia’s resources, cultural diversity and relevance to much of the world are based on the large amount of traffic, and associated revenue, to Wikimedia’s domains. This, in turn, relies on public trust in the product and effective balancing of sometimes contentious or even violent political and social disputes. Currently, neither the Foundation nor the movement as a whole has any ongoing capacity to monitor and analyse user access to, and trust in, our products. As we pursue goals like “knowledge equity”, even if they are morally just and widely supported within the movement, there are potential risks to the perception of our products as trustworthy and neutral.
Without a dedicated capacity or clear expectations in this area, the Foundation, affiliates and informal groups will continue to make assumptions about these issues based on limited perspectives, often through their own biased lens. This, in turn, leads to risks stemming from ignorance of local or culturally specific concerns, risks which may limit the growth or continued use of our products and knowledge. Creating and monitoring metrics such as user trust, tracking when and where major changes occur in our traffic, revenue or participation rates, and sharing these with the movement at large will enable all participants to plan better for the future. By creating a globally centered capability to monitor and report on these issues, we hope that individuals, groups and paid staff will be able to better understand and avoid these risks, as well as identify opportunities to increase the impact of our movement.
This recommendation will require paid staff member(s). This would be a new capability, housed either in the WMF or in another affiliate, not drawing on the existing functions of the Foundation. The outcome will be regularly recurring reports, published on wiki and presented to the movement, including the Foundation and its Board, for consideration and incorporation into existing work. This function would also generate data about Wikimedia’s product health in this area. For example:
- surveys of reader trust in Wikipedia and perceived bias (per geography, and preferably in a way that can be compared to media, tech giants, etc.)
- surveys of editor trust in the Wikimedia Foundation, and of community-perceived threats to project use and participation
- reviews of geopolitical risks to access or trust posed by the actions of government actors, such as censorship or disinformation campaigns
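To illustrate the kind of per-geography metric the first survey above could feed, here is a minimal sketch of aggregating raw survey responses into regional trust scores. The field names (`region`, `trust_score`), the 1–5 scale, and the region codes are illustrative assumptions, not an existing Wikimedia schema.

```python
# Hypothetical aggregation of survey responses into a per-region trust
# metric. Field names and the 1-5 scale are assumed for illustration.
from collections import defaultdict
from statistics import mean

def trust_by_region(responses):
    """Group survey responses by region and average their trust scores."""
    by_region = defaultdict(list)
    for r in responses:
        by_region[r["region"]].append(r["trust_score"])
    # Round for readability in a published report.
    return {region: round(mean(scores), 2)
            for region, scores in by_region.items()}

responses = [
    {"region": "EU", "trust_score": 4},
    {"region": "EU", "trust_score": 5},
    {"region": "SA", "trust_score": 3},
]
print(trust_by_region(responses))  # {'EU': 4.5, 'SA': 3}
```

A real report would of course weight responses, segment by demographic, and place the figures alongside comparable surveys of media and tech companies, as the bullet suggests.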
What will change or shift? (both direct and indirect impact)
This recommendation relies on the assumption that collecting, analysing and sharing information around these issues will reduce the risks posed by planned changes to the software, content or marketing of the Wikimedia projects, and provide expert global guidance towards new opportunities. It relies on other parts of the movement engaging with these findings, and is largely expected to result in indirect impacts on plans by affiliates, the Foundation and informal groups in the areas of product experience, consumer awareness and technology architecture.
How will this recommendation change the structures to enable programmatic work towards becoming the support system for the free knowledge movement to be more effective?
Although these issues may (or may not) be considered by existing structures, we believe this capacity, and a global perspective on issues of society, politics and Wikimedia, will help ensure our existing traffic and resources remain healthy, while also potentially enabling more concerted and impactful programmatic work on technology culture and related political policy.
This new capability would mainly be focused on influencing the paid decision makers of the movement working in the areas of technology, product and marketing. However, by publishing open findings of fact and analyses of proposed changes, we also hope to influence a broad constituency of developers and participants in the Wikimedia technology ecosystem, including volunteer developers and external partners.
Overview of possible negative outcomes of the recommendation, supported by a risk assessment.
The most likely “negative” outcome is that the work done in this area, and the assessment of specific proposals, would be largely ignored, making the recommendation little more than an academic exercise. Because no change to enforcement or decision-making processes is proposed, it would be quite possible for the intended audience to proceed with work identified as risky, or to ignore potential opportunities, without consequence.
We could consider process or other requirements to mitigate the risk of this work being ignored. For example, if a proposal is identified as highly risky to traffic in a certain region (for example, a program likely to result in a blanket censoring of the projects), we could require the proposer to provide a risk mitigation strategy or worst-case scenario plan before proceeding.
Additionally, the weight and attention the Board and others give this work will have a major influence on how impactful this will be.
As described above, this need is currently addressed in an ad-hoc fashion, usually in the course of planning for or discussing some specific software, marketing or policy change. These risk and opportunity assessments are therefore often quite local in scope, significantly biased in their understanding, and not available for the wider movement to engage with. This recommendation proposes a new, dedicated capability to bring this work together, make it more widely available, and bring in a balance of perspectives.
It is related to, and similar in form to, our proposal for a Movement Technology Ethics Advisory Panel.