How can WMF improve messaging and transparency with the community about product updates, especially updates that may be more sensitive (e.g. AI tools)? (Starting with: a retro / post-mortem on Simple Summaries and Tone Check.)
Early discussions often do not get enough engagement.
From the WMF side, it appears that engagement was attempted, but nobody responded.
From the community's side, there often isn't anything concrete enough to give feedback on.
Framing communication as "experimentation" in and of itself causes concern; community members are unsure what the next step in the communication process will be.
Silos/echo chambers within specific communities and communication channels, and discussions anchored to a single point in time, lead to a lack of context about other wikis and related research.
The WMF side can often come across as defensive or self-justifying when providing that context.
Historically adversarial relationship between Communities and WMF (lack of trust)
Language is often opaque and ambiguous
"Automated summaries", for example, does not indicate that the intention was to use large language models to summarize the article content.
Information might be missed due to ambiguous terminology (e.g. does "Moderator" apply on enwiki?).
Feedback pathways like the Annual Plan also have a lot of "corporate-speak" and are often hard to parse by non-technical community members, making it hard to provide actionable feedback on the plan.
Timing issues and the multiplicity of channels: there are so many places (VPT, watchlists, etc.) that community members don't have the time or inclination to follow all of them until something affects them directly and takes them by surprise.
If, for each experiment, experienced community members acted as champions and technical experts, providing context and guiding feedback, we expect the community would have an easier time digesting and understanding the "how" and "why" behind an experiment.
Narrowing down the number of communication points on a per-wiki basis, by asking each community where and how they would like to consume updates from the WMF, would ensure volunteers have one specific place to check for WMF updates.
Sohom and ChaoticEnby have discussed putting important announcements in Centralised Discussions on enwiki as an example.
Wikimedia Bulletins or Tech News also have traction, but not everyone subscribes to those.
Providing an essay/Diff post/YouTube video explaining the typical product lifecycle of a feature would help users relate to the different stages a product can be in, and help experienced contributors quickly recognize what stage a product is at (and what kinds of feedback are most useful at each stage).
The Wikimedia Foundation has a general "Inclusive Development Playbook" for the product development process; individual teams apply it in different ways.
This model has previously worked well in the context of the Community Wishlist and in other areas in technical development spaces.
On-wiki discussion: Bring the community into this conversation and work with them to determine key channels for these conversations to occur. Provide the same resources with specific examples of how this didn't go well, and solicit ideas for how it could go better.
Community members champion and drive community discussion: PTAC members (or other community members) participate in key product communications topics to help provide context regarding the experiments and guide feedback in conversations with communities.
The sequence of communications the Wikimedia Foundation carried out for both the Simple Summaries and Tone Check projects is given below in the collapsed sections. This is meant to give a sense of the processes the Wikimedia Foundation currently follows, to help generate ideas for how they could be improved or done differently in the future.
WMF annual plan is published, containing the WE 3.1 KR focusing on Content Discovery for readers, including a hypothesis around experimentation with simple summaries. There is some limited discussion of the idea at this time.
The WMF hosted a session at Wikimania 2024 where Wikimedians workshopped different ways that AI/machine-generated remixing of existing content can be used to make Wikipedia more accessible and easier to learn from, including discussion of potential workflows for simple summaries.
A MediaWiki page is created for the content discovery experiments, listing simple summaries as a planned experiment and linking to the Wikimania session discussion.
A newsletter update about the planned content discovery experiments and the experimental browser extension, including the Simple Summaries experiments, is published in the Web team's projects newsletter (previously the Vector 2022 newsletter), which has approximately 330 subscribers.
Browser extension launch announced with a call to action to participate in the experiments (including automated summaries experiments) and a link to the project page and newsletter
Olga starts a dedicated page on MediaWiki for the simple summaries experiments, moving previous documentation from the Content Discovery Experiments page.
10 February 2025 | Simple article summaries: research so far and next steps published across a number of language Wikipedias
Olga and Szymon post a summary of late 2024 experimentation around simple summaries and planned next steps for wider experimentation and editor involvement on English (VPT), Russian, Turkish, Spanish, Vietnamese, Arabic, French, and Polish Wikipedias.
The post receives no comments or discussion on English Wikipedia
Olga presents a summary of the work done so far and a demo of the simple summaries feature. The community discusses the feature and its potential usage for English Wikipedia and African-language Wikipedias.
Olga presents a summary of the work done so far and a demo of the simple summaries feature. The community discusses the feature and its potential usage across Wikipedias with members of various communities, including English Wikipedia
Eliza starts a thread for Simple summaries: editor survey and 2-week mobile study to share the proposed launch of an editor survey and short mobile study, which sparks widespread anger and criticism. Originally, we planned to also post to Spanish, French, and Japanese wikis (where the mobile study was planned to take place) but quickly cancelled those.
4 June 2025
Olga posts Reply WMF to share that WMF is following the conversation closely and will circle back.
Olga posts Taking a step back to clear up misconceptions, and saying we should have done a better job introducing the idea.
5 June 2025
Marshall posts WMF update - let’s continue next week to reiterate that the survey was closed, project paused, and that we would circle back the following week.
11 June 2025
Marshall posts WMF Update/Reflection, sharing updates and reflections, emphasizing the need to step back and consider priority problems for readers while also apologizing for the way we brought up the simple summaries idea. Marshall’s comment leaves the door open for future discussions around the reader’s problem space, experimentation, and AI/LLM tools overall. The comment is mostly responded to positively.
Peter publishes a summary of recent en.wiki conversations along with an invitation for volunteers to edit/refine the questions/concerns we understand volunteers to be raising.
Peter presents the Tone Check (then called "Peacock Check") proof of concept by way of a recorded demo during the ESEAP Summit. Volunteers reacted positively and enthusiastically to the Check.
23 May 2025 | Invitations published on volunteer talk pages seeking help with model review
Announcement of the renaming of "Peacock Check" to "Tone Check", with another mention of the volunteer-led human evaluation set to begin on 23 May.
Peter presents the Tone Check (then called "Peacock Check") proof of concept during the "Afrika Baraza Annual Planning Call." Of the volunteers who were present and reacted during the call, all expressed enthusiasm for the feature and asked when it would be available for testing on the wiki they are active on and how they could participate in the [[mw:Edit_check/Tone_Check/Model_evaluation|evaluation of the model]].
The announcement includes information about the volunteer-led human review that’s scheduled to begin as well as context about why the initial languages were selected.
Peter publishes an announcement at New pages patrol/Reviewers to make patrollers/reviewers aware of the feature and invite them to discuss and try it in its early state. User:Sohom Datta and User:asilvering try the then-early Peacock Check prototype. Sohom raises some questions about tagging/logging when the Check is shown and about the calls to action the card presents.
23 April 2025 | CEE Catch up Annual Planning Workshop
Peter presents the Tone Check (then called "Peacock Check") proof of concept during the "CEE Catch up Annual Planning Workshop." Of the volunteers who were present and reacted during the call, all expressed enthusiasm for the feature and asked when it would be available for testing on the wiki they are active on and how they could participate in the [[mw:Edit_check/Tone_Check/Model_evaluation|evaluation of the model]].
21 April 2025 | Peacock Check feature meeting invitations published
Announcement about the Peacock Check-focused community conversation included in Tech/News. AI and patrollers/reviewers are explicitly mentioned in order to attract volunteers interested in, and/or holding points of view on, these topics: "Editors who work with newcomers, or help to fix this kind of writing, or are interested in how we use artificial intelligence in our projects are encouraged to attend."
Quiddity (Nick) publishes an announcement at MoS/Words to watch to make patrollers/reviewers aware of the feature and invite them to discuss and try it in its early state. No responses.
On 13 June 2024, people who attempt to add an external link in the visual editor (desktop and mobile) began receiving immediate feedback when they attempt to link to a domain a project has decided to block.
On 7 March, the first iteration of the Reference Reliability check became available to everyone at all wikis, by default. Whenever anyone attempts to cite a source that a project has blocked, they are made aware directly within Citoid and prompted to try another source.
Here "blocked" means the domain someone is attempting to cite is listed on MediaWiki:Spam-blacklist or MediaWiki:BlockedExternalDomains.json.
11 October 2023 | Reference Check deployed to first Wikipedias
On 11 October 2023, the first Edit Check (Reference Check) was deployed to an initial set of wikis: dag.wikipedia.org, ee.wikipedia.org, fat.wikipedia.org, gur.wikipedia.org, gpe.wikipedia.org, ha.wikipedia.org, kg.wikipedia.org, ln.wikipedia.org, tw.wikipedia.org.
February 2023 | Editing Team publishes summary of early community conversations
Edit Check has been, and continues to be, shaped in conversation with volunteers from the beginning. In February 2023, the team published a summary on mediawiki.org of the initial seven conversations they held between October 2022 and January 2023.
Peter and Marshall, in collaboration with volunteers (Enterprisey, Leaderboard, and ValeJappo), present on the idea of infusing policies more directly into editing experiences.
Product and Technology Advisory Council/August 2025 draft PTAC proposals for feedback/Communication