Wikimedia CEE Meeting 2025/Submissions/AI in Moderation: How Far Is Too Far?

From Meta, a Wikimedia project coordination wiki
ID: 11
Title: AI in Moderation: How Far Is Too Far?
Author(s): Doğu Abaris
Username(s): Doğu
Type of submission: Roundtable
Affiliation: unaffiliated
Theme(s): Community Engagement, Technology
Abstract:

With tools like Automoderator becoming available to all Wikimedia communities, we are entering a new era where AI can automatically revert edits based on machine learning. But where do we draw the line between support and overreach? This session invites participants to reflect on the risks, expectations, and governance of AI-assisted moderation, especially in small or mid-sized communities.
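The trade-off the abstract raises, support versus overreach, often comes down to a single configurable number: the score threshold above which an edit is reverted automatically. The sketch below illustrates that decision in Python; the function name, parameter names, and score values are illustrative assumptions for discussion, not Automoderator's actual interface.

```python
def should_auto_revert(revert_risk: float, threshold: float = 0.99) -> bool:
    """Return True when a model's revert-risk score for an edit
    exceeds the community-configured threshold.

    A higher threshold means fewer automatic reverts and fewer false
    positives; a lower one catches more vandalism but risks reverting
    good-faith edits.
    """
    return revert_risk > threshold

# Illustrative scores for three incoming edits (not real model output):
scores = [0.999, 0.85, 0.40]
decisions = [should_auto_revert(s) for s in scores]
```

Where exactly a community sets that threshold, and who gets to change it, is precisely the governance question this roundtable is about.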

Slides:
Level of advancement: medium
Special requirements:
Recording (Yes/No): Yes
Photography (Yes/No): Yes
How will this session be beneficial for the communities in the region of CEE?

Many CEE communities lack the volunteer capacity to deal with vandalism and disruptive edits in real time. AI tools like Automoderator can help, but they may also raise concerns around transparency, false positives, and community control. This session will generate a shared understanding of acceptable AI use in moderation, to help guide policy and adoption decisions across the region.


Interested participants

(register below and ask your questions to the session organiser now)

Documentation
