Research:Guided article and section creation
How do we better enable editors to make high impact contributions while reducing burdens on reviewers?
Final report and recommendations

Background & context
Creating a new article or section is one of the most impactful contributions an editor can make to Wikipedia. It directly supports knowledge equity by addressing content and access gaps, while offering contributors a tangible sense of accomplishment. However, the responsibility tied to such contributions is equally high, especially within a moderation ecosystem that safeguards content quality.
Despite this potential, article and section creation presents significant barriers—particularly for newcomers. For example, on English Wikipedia, approximately 5,000–6,000 draft articles are deleted each month.[1] First-time editors whose articles are deleted show a post-deletion retention rate of just 0.6%, compared to 4.4% for those whose contributions survive.[2]
To improve the article and section creation process, namely through a guided and supported process, we need to understand the activities, pain points, and rewarding moments of creating new articles and sections. Given ongoing design explorations, we also need early concept testing and formative feedback on these product ideas. Because new article and section creation sits at an important intersection of content-creating editors and reviewers, we need to learn from a range of individuals to avoid blind spots about the implications designs may have for those affected by these workflows (e.g. the moderation and administration burdens and opportunities that new contribution workflows may introduce).
Research goal
This research will support product and design decisions for new tools and workflows by evaluating early-stage design concepts and incorporating the perspectives of editors, reviewers, and readers. We aim to better understand the workflows, obstacles, and successes of editors engaged in new article or section creation, while also better understanding the impact these workflows have on those tasked with reviewing new articles and larger contributions to existing articles.
Research questions
The following research questions are organized around the different types of individuals involved in the new article and section creation process. The focus for this project is on editors and reviewers.
Editors (new and experienced)
- What motivates contributors to begin creating new articles and sections?
- What tools or resources are currently used, and where do they fall short? (e.g. wiki guides such as the Articles for Creation process, Your first article, the Article wizard, and the new user article flowchart)
- What characterizes successful vs. unsuccessful article/section creation attempts?
- What are the key pain points throughout the creation process?
- When and why do contributors abandon the process? Conversely, why do they continue?
- How do the needs and behaviors of newcomers differ from experienced editors?
- At which moments do newcomers require the most guidance?
Reviewers
- What are the most common problematic types (and characteristics) of new article and section contributions that reviewers observe?[3]
- What aspects of current moderation efforts are most burdensome, and why?
- What signals or patterns do reviewers rely on to assess quality of new articles and sections?
- What types of guidance would reviewers like to be able to provide for those creating new articles and sections?
- How do reviewers wish to influence article quality at scale? What product interventions may be helpful for aiding influence?
- How are topics prioritized, and what formal and informal processes exist for reviewers to shape focus on topics and topic areas?
Readers
- How aware are readers that articles and sections are written, and editable, by volunteer human editors? What assumptions do they have about how content is created?
- To what degree are readers interested in providing feedback to editors? What types of feedback are they interested in providing, and how? (e.g. feedback signaling success/value, feedback to highlight content gaps, etc.) Why are/aren’t they interested in providing feedback on content?
- How does engaging in feedback processes affect readers’ understanding of content creation processes and/or their identification with a ‘Wikipedia community’, however they define that?
Concept testing guiding questions (all groups)
- How do participants interpret and engage with the guided article and section creation concepts?
- What aspects of the proposed designs are intuitive or confusing, and why?
- What expectations do users have for each step or feature in the guided creation flow?
- What usability or accessibility barriers emerge during interaction with the designs?
- How do participants perceive the value and purpose of the guided experience compared to current workflows?
- What improvements or alternatives do participants suggest, especially when prompted with participatory design activities?
Approach
Methodology
Concept testing will serve as the central approach for evaluating early-stage designs of the guided article and section creation experience. We will gather feedback on how participants interpret and respond to design concepts, capturing user understanding, expectations, usability barriers, and perceived value. Sessions will use moderated think-aloud protocols combined with structured probing to uncover reactions and misunderstandings in real time.
As part of these sessions, we will also include basic probing questions about participants' current and past experiences with article and section creation, reserving a small portion of time to observe the section and article creation workflows participants currently use and to note important value, pain, and opportunity points in those workflows.
Low-fidelity design artifacts will be used for the anticipated participant segments: editors, reviewers, and readers. To complement concept testing, participatory design techniques will be integrated to elicit user-led ideas and improvements. These methods allow participants to co-create interface elements or workflows, which is particularly relevant in a complex, community-governed ecosystem. Activities (i.e. co-design prompts) will encourage users to express needs through sketching, arranging interface components, or imagining workflows.
Wiki/language selection
Different language versions of Wikipedia have different norms, practices, and policies around the creation of new content, including varying practices for, and dynamics around, how new content such as articles and sections is reviewed. We will therefore use a comparative approach (by approximate project size, measured via content pages) to arrive at a basic understanding of how wiki size may interact with the concepts tested in this project. We will recruit participants from both a large wiki (English) and a medium-sized wiki (Indonesian). While we recognize there are multiple ways of determining size, we rely on the number of content pages given that this project involves innovations for a more guided article creation experience.[4]
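For reference, the content-page counts used in this size comparison can be read from each wiki's public MediaWiki siteinfo statistics endpoint. A minimal sketch follows; the helper names (`siteinfo_url`, `content_page_count`) are our own for illustration, not existing tools:

```python
from urllib.parse import urlencode

# MediaWiki siteinfo request returning per-wiki statistics, including the
# "articles" field (content pages), the size metric used in this project, e.g.:
#   https://en.wikipedia.org/w/api.php?action=query&meta=siteinfo&siprop=statistics&format=json
SITEINFO_PARAMS = urlencode({
    "action": "query",
    "meta": "siteinfo",
    "siprop": "statistics",
    "format": "json",
})

def siteinfo_url(host: str) -> str:
    """Build the siteinfo statistics URL for a wiki host, e.g. 'id.wikipedia.org'."""
    return f"https://{host}/w/api.php?{SITEINFO_PARAMS}"

def content_page_count(siteinfo_response: dict) -> int:
    """Extract the content-page ("articles") count from a parsed siteinfo response."""
    return siteinfo_response["query"]["statistics"]["articles"]

# Illustrative response shape (the counts below are placeholders, not real figures):
sample = {"query": {"statistics": {"pages": 60_000_000, "articles": 6_900_000}}}
print(content_page_count(sample))
```

Fetching `siteinfo_url(host)` for each candidate wiki and comparing `content_page_count` on the decoded JSON gives a consistent, reproducible basis for the large-vs-medium wiki classification described above.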
Privacy policy
All research sessions will be accompanied by a privacy statement detailing what data is collected and how it will be used. Any research participant quotes or close paraphrases will be anonymized by default. Participants who have been contacted for this study may choose to opt out at any point. For recruitment, we also use a short survey to gather the limited information needed to determine whether individuals qualify to participate; this survey is likewise accompanied by a privacy statement detailing what data is collected and how it will be used.
Phases & timeline
Scoping [complete]
- Write, revise, and finalize research brief and plan. Create recruitment plan.
Preparation [complete]
- Create discussion guides and other project materials. Translation of project materials and arrangement of supporting resources, such as interpretation.
Data collection [15 Sept - 10 Oct]
- Pilot and refine discussion guide
- Ongoing recruitment of research participants
- Conduct research sessions (estimated start 15 Sept)
- Interim updates and check-ins with project stakeholders
Analysis & reporting [10 Oct - 31 Oct]
- Data analysis
- Reporting and deliverables preparation
- Review period for stakeholders and requested revisions
- Final project shareout(s) and project close
Results
Key Findings
- Sources are foundational, yet challenging. Sources are essential to the editing process, and all editors begin article creation or expansion by gathering sources. They create or expand articles based on source availability; their primary challenges involve finding, evaluating, and applying sources that are reliable, independent, and provide sufficient coverage.
- Judging notability remains difficult. A critical challenge, particularly for newcomers, is assessing notability and aligning content with Wikipedia policy. Editors also struggle with structuring content, writing introductions, and adding citations. Many editors experiment with LLMs to support article structure and source discovery.
- Reviewers’ primary focus is verifying sources and notability. Reviewers focus on assessing notability, source reliability, neutrality and structure in new articles. Verifying sources and notability is the most time-consuming part of reviewing, which reviewers believe is often due to editors’ limited familiarity with guidelines, resulting in promotional or poorly sourced articles. Reviewers are interested in exploring automation to help reduce this workload.
- ‘AI should guide, not write’ was the prevailing sentiment from contributors. Both editors and reviewers see greater value in AI as a tool to reduce workload, rather than to generate content.
- Guided creation concepts are broadly useful and motivating. Newcomers appreciated the structure and guidance, while experienced editors valued specific features. Reviewers felt that such tools could serve as reminders for all editors, especially newcomers. They see potential in these tools for addressing current challenges of promotional content, notability and citation gaps, AI-generated content and incorrect writing styles. Reviewers also anticipated broader benefits, including more articles, increased readership, and higher editor activity.
- Policy adherence and editorial judgment remain top concerns. While the concepts have been well received, reviewers and experienced editors emphasized that adherence to Wikipedia policies and community norms must remain the top priority. A key concern was that reviewers would prefer deeper engagement with wiki policies, guidelines and the community, rather than encouraging editors to simply create more articles. Another concern was that editors won’t modify the AI-generated or templatized content, which makes them wary of integrating such tools into their wiki.
- Feature prioritization highlights the need for flexibility and custom configuration. Participants were asked to prioritize features; both editors and reviewers identified ‘Article eligibility check’ and ‘Add from source’ as must-have features because they support fundamental steps of creating an article. Other features, like ‘Suggested sections’, ‘templates’ (from concept B), and the ‘Introduction structure’, were ranked highly by some participants. Participants’ priorities varied by experience, reinforcing the need for configurable features tailored to experience levels, workflows, and needs.
Full detailed reporting
Full results are available here: commons:File:(Final Report) Guided Article and Section Creation Report (1).pdf
They are organized into four main sections:
- Editor experiences creating new articles
- Reviewer experiences with new articles
- Editor and reviewer feedback on guided article creation concept features
- Reader perceptions of potential input feature
Recommendations
Guidance and customization
- Configurable features of a guided article creation process: Give editors (and potentially individual wikis) the ability to select which features are most valuable to them based on their needs, preferences, and experience with editing.
- Templates and scaffolding: Easy-to-access, pre-created structures for common topics (recommended sections, infobox templates, etc.) across various article types. These should be flexible enough to allow editing based on the sources available, or provide a rough structure that editors can customize based on need and the information available.
- Context-sensitive subcategory browser: Allow editors to expand categories and view short, example-based definitions (“Biography -> Sports figures, Artists, Scientists”).
- Ensure clarity and accountability when editing templatized or auto-generated content: Before publishing, prompt editors to review and modify any auto-generated or template text, and visually distinguish unedited placeholder material (e.g. greyed-out text) until it has been modified by a human editor. (This provides both an action prompt (a behavioral cue) and a visual cue (interface design).)
Automation and quality safeguards
- Auto-detect uncited statements in real time: Flag statements in need of citations, and recommend credible, wiki-compliant sources based on topic and language for the human editor to review and add.
- Source validation tools: Automate checks for reliability, independence, significant coverage, and citation format (e.g., including integration with perennial sources list).
- Automation to assist reviewer checks: Reviewers are comfortable with some level of AI automation to assist with tasks such as verifying basic notability; flagging AI-generated, promotional, or copyright-infringing content; and checking formatting, grammar, and punctuation before manual human review.
- Tone check: Consider integration of Tone Check. Reviewers believe a tool in the spirit of this may have the added advantage of weeding out sockpuppeteers. Depending on reviewer preference, this integration could happen only at the review step, or be available to editors (restricting use to the review step has the advantage of decreasing the likelihood of malicious use).
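As an illustration of the ‘auto-detect uncited statements’ recommendation above, a very rough heuristic can flag sentences in a wikitext paragraph that carry no inline <ref> citation. This is a sketch only, and the function name `flag_uncited` is hypothetical; a real detector would also need to handle named refs, citation templates such as {{sfn}}, and citations that cover several sentences:

```python
import re

def flag_uncited(wikitext_paragraph: str) -> list[str]:
    """Return sentences that contain no inline <ref> citation.

    Rough heuristic: split on sentence-ending punctuation (or the closing
    '>' of a ref tag) followed by whitespace, then keep any sentence that
    lacks a '<ref' marker.
    """
    sentences = re.split(r"(?<=[.!?>])\s+", wikitext_paragraph.strip())
    return [s for s in sentences if s and "<ref" not in s]

paragraph = (
    "The subject was born in 1970.<ref>Source A</ref> "
    "They won a major award. "
    "They retired in 2020.<ref name=b/>"
)
print(flag_uncited(paragraph))  # only the middle, uncited sentence is flagged
```

In a guided creation flow, flagged sentences could be highlighted in the editor in real time, with source recommendations attached for the human editor to review, rather than anything being changed automatically.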
Cross-wiki and source connectivity
- Commons image integration: Editors and reviewers appreciated the inclusion of data from sister projects. However, editors need more step-by-step guidance for adding images from Commons safely, including licensing checks and ensuring that the correct image is added.
- Article tags for reviewers’ awareness and scrutiny: Flag articles created through the guided concepts for reviewer scrutiny of notability and sourcing.
- Support tracking throughout an improvement process so that reviewers avoid repeated oversight of the same article (for articles that need improvement but are not entirely bad and were submitted in good faith).
Policy awareness and editor learning
- Integration of guidelines into workflow to promote policy compliance: Contextually integrate, and progressively disclose, guidelines on NPOV, reliable sources, and the manual of style to drive editors’ awareness of, and adherence to, the guidelines. This will also serve as an easier way for reviewers to signal and point to key policies when communicating feedback and rejections.
- Incentivize newcomers to complete mini-tasks, games, or tutorials on article creation policies: Completion of such tasks could then be integrated as signals given to reviewers in order to identify where more or less oversight may be important.
Evaluation and experimentation
- Collaborate with the trainings and workshops of organizations (such as WikiEdu) and user groups (such as university fan clubs), using the guided experience as a training guide and a source of early-stage feedback.
- Run a controlled experiment in x wiki for a period of time, during which we observe a set of metrics around newly created articles to see how the outputs of a guided article creation tool compare with ‘articles from scratch’ (i.e. without guidance). Reviewers are particularly interested in certain metrics, such as whether editors engage with, and modify, any templatized content (an important concern that was raised). Analysis of these metrics, together with engagement with editors and reviewers during the process, can inform ongoing design improvements and development.
See also
- mw:Growth/Article creation for new editors
- en:Wikipedia:Village pump (idea lab)#Article wizard should impose a process
References / Notes
- ↑ en:User:SDZeroBot/G13 Watch; page history
- ↑ en:Wikipedia:Wikipedia Signpost/2011-04-04/Editor retention; We're in need of a more recent, updated figure for this in order to evaluate whether this has meaningfully changed.
- ↑ More specifically ‘new pages patrol/Reviewers’ (en:Wikipedia:New pages patrol/Reviewers) including administrators; see also wikidata:Q98064718.
- ↑ While we expect this comparative approach will help surface differences that may be correlated with project size, we cannot expect it to illustrate the full range of variability that may exist across projects. Only iterative testing that continuously expands the number of wikis included can help achieve that goal.