Wikimedia Foundation Annual Plan/2023-2024/Collaboration/Moderators conversation

Overview

As part of the Wikimedia Foundation's Annual Plan for 2023-2024, one of our key objectives is to improve moderation workflows. To achieve this, we would like to gather insights and opinions from moderators regarding the challenges they face. We're particularly interested in understanding more about how you determine where your attention is needed, and how you prioritize different tasks.

When we say 'moderators' or 'moderation', we're referring to the work done by editors with advanced rights, such as patrollers, administrators, and functionaries.

Objectives

The call is intended to provide more clarity for the Product and Technology departments, ensuring we hear directly from moderators about what they think needs to be done to improve moderation workflows. The focus group call is scheduled for 4 May 2023, from 17:00 to 18:00 UTC. During the call, we would like to discuss the following questions in particular, but we are open to discussing needs in general and can talk a little about some projects we're considering:

1. How do you determine where your attention is needed when navigating your project?

2. How do you prioritize what needs your attention?

Your input will directly inform how WMF product teams spend their time next year.

Information

Sign-up List

Please sign up below to confirm your participation in the call:

  1. Atsme📞📧 21:23, 27 April 2023 (UTC)
  2. Sage (Wiki Ed) (talk) 15:26, 28 April 2023 (UTC)
  3. RoySmith (talk) 12:44, 2 May 2023 (UTC)
  4. MikheilBeruashvili (talk) 15:43, 3 May 2023 (UTC)
  5. BusterD (talk) 15:36, 4 May 2023 (UTC)
  6. Jc37 (talk) 15:54, 4 May 2023 (UTC)
  7. MJLTalk 17:46, 4 May 2023 (UTC)

WMF staff attending to listen and ask for your thoughts on these topics:

  1. Samwalton9 (WMF) (talk) 14:05, 26 April 2023 (UTC)
  2. LWyatt (WMF) (talk) 15:04, 26 April 2023 (UTC)
  3. User:CLo (WMF) (talk) 15:55, 26 April 2023 (UTC)
  4. KStoller-WMF (talk) 15:57, 26 April 2023 (UTC)
  5. Tgr (WMF) (talk) 15:59, 26 April 2023 (UTC)
  6. KHarlan (WMF) (talk) 18:56, 26 April 2023 (UTC)
  7. RHo (WMF) (talk) 06:58, 27 April 2023 (UTC)
  8. MMiller (WMF) (talk) 18:19, 3 May 2023 (UTC)
  9. JTanner (WMF) (talk) 15:44, 4 May 2023 (UTC)
  10. DBrant (WMF) (talk) 15:50, 4 May 2023 (UTC)

Notes

The following are generalised notes from the meeting on the topics we discussed and links that were shared.

  • The Key Result is about us focusing on the tools that experienced editors need to curate and moderate their projects. In the recent past we've had more of a focus on new contributors and growing wikis, but we want to make sure that we're also giving editors with extended rights the tools they need to handle this growth.
    • This KR is focused on product initiatives, rather than social ones.
    • When we work on a project we want to ensure that it benefits multiple wikis ("four projects") and extends rather than constrains how creative communities can be in carrying out a workflow; we also want to work with moderators to decide how we should measure our impact.
  • Worth noting that both French and German Wikipedia communities have administrator meetings which might be useful to learn from.
  • On large Wikimedia projects communities have often built specialised tools, gadgets, bots, and scripts to facilitate moderation workflows. That's not the case on smaller wikis. Are we thinking about this problem?
    • Jazmin Tanner shared information about the Apps team's priority work stream for next year: patrolling in the apps.
    • We discussed the pros and cons of using a mobile phone to perform this work. Reactive and easy tasks are well suited to mobile, but some patrollers would rather have a big screen with the capability to jump between different tabs.
  • Is there anything we can do to fix the remaining issues with new editors not knowing that community members are trying to talk to them?
  • There are a few ways the WMF could go about improving moderator workflows, including providing better tools for clearing backlogs, or helping new editors avoid edits that require moderation and become moderators themselves. Which should be the focus?
    • Administrators can have difficulty with the all-or-nothing toolkit: they need to be comfortable with both user conduct and content issues. Could we unbundle these? This is really a community issue; the technical side is fairly simple to implement.
  • Is finding and prioritising tasks an issue?
    • Not really on enwiki; there are queues and dashboards. It would be better to focus on preventing the backlogs from getting bad in the first place.
  • Is anything happening with global templates? They're a hugely powerful tool.
    • Hasn't been part of this conversation so far, but product leads will review.
    • Templates can also be complex; can we make them easier to work with?
  • The Growth team have a few ideas they're exploring for next year.
    • Draft Growth team annual planning ideas are here.
    • One which will almost certainly be worked on is extending Community Configuration so that other product teams can provide communities with direct and easy configuration options for new initiatives (e.g. Special:EditGrowthConfig); a sketch of how tools can read such configuration follows below.
    • They're going to be focused on the 'prevent new editors from making problematic edits' angle. Another project ongoing here next year is Edit check.
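For illustration, Community Configuration settings live as JSON on ordinary wiki pages that any tool or bot can read through the Action API. The sketch below is a minimal example, assuming the page title GrowthExperiments uses today (MediaWiki:GrowthExperimentsConfig.json, worth verifying per wiki) and a wiki where the extension is deployed; other initiatives would store their own configuration pages.

```python
# Minimal sketch: read a Community Configuration page via the Action API.
# Assumptions: the wiki has GrowthExperiments deployed, and the config
# lives at MediaWiki:GrowthExperimentsConfig.json.
import json
import requests

API = "https://cs.wikipedia.org/w/api.php"  # a wiki with GrowthExperiments
PAGE = "MediaWiki:GrowthExperimentsConfig.json"

params = {
    "action": "query",
    "titles": PAGE,
    "prop": "revisions",
    "rvprop": "content",
    "rvslots": "main",
    "format": "json",
    "formatversion": "2",
}
data = requests.get(API, params=params, timeout=30).json()
content = data["query"]["pages"][0]["revisions"][0]["slots"]["main"]["content"]
config = json.loads(content)
print(sorted(config))  # the settings communities can tune on-wiki
```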
  • AbuseFilters are very powerful, but they're limited to specific regex and conditions. Are there any thoughts about integrating machine learning?
    • This is a topic the Moderator Tools team has been investigating over the past couple of weeks. The WMF Research team has been working on a new language-agnostic revert-prediction model, which could be used for this purpose. For performance reasons it may need to power an anti-vandalism bot rather than run inside AbuseFilter, but we're quite interested in this idea; a sketch of querying the model follows below.
      • This would be particularly useful for medium-sized wikis which don't have a bot of their own. We'd have to consider how to handle false positives across multiple languages.
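As an illustration of the bot-based approach, the revert-prediction model mentioned above is served from the Foundation's Lift Wing infrastructure. The sketch below queries the documented endpoint for the language-agnostic revert-risk model; the 0.95 threshold and the flag-for-review behaviour are illustrative assumptions, not a recommendation.

```python
# Minimal sketch: ask Lift Wing how likely an edit is to be reverted.
# Anonymous access is rate-limited; heavy use may need an API key.
import requests

LIFTWING_URL = (
    "https://api.wikimedia.org/service/lw/inference/v1/"
    "models/revertrisk-language-agnostic:predict"
)

def revert_risk(rev_id: int, lang: str) -> float:
    """Return the model's probability that this revision will be reverted."""
    resp = requests.post(
        LIFTWING_URL,
        json={"rev_id": rev_id, "lang": lang},
        headers={"User-Agent": "moderation-notes-sketch/0.1"},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()["output"]["probabilities"]["true"]

# A bot might flag (rather than auto-revert) high-risk edits, leaving the
# judgement call to a human patroller. The revision ID and threshold here
# are illustrative only.
if revert_risk(rev_id=1143110405, lang="en") > 0.95:
    print("Queue this edit for human review")
```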
  • The Growth team has also been discussing ideas around AI chatbots to point new users at help and policy pages or to summarise them.
    • Lots of community consultation would be required for this.
    • Could be useful, but separates new editors from the community. Asking and answering questions can be a great way to feel engaged.
  • There are also a raft of miscellaneous requests from administrators and stewards which have been made over the years and never actioned. For example:
    • Giving administrators the ability to see deleted contributions for an IP range (a sketch of why this is currently impractical follows this list)
    • Stewards want to be able to apply global blocks to registered accounts, not just IPs. There was an open unmerged patch for this.
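To make the first request concrete: the Action API's list=alldeletedrevisions accepts only a single username or IP in its adruser parameter, so an administrator wanting deleted contributions for a whole range would have to iterate address by address, which quickly becomes impractical. A minimal sketch, assuming a session already authenticated with an account holding the deletedhistory right (continuation handling omitted for brevity):

```python
# Minimal sketch: per-IP lookup of deleted contributions, the only route
# available today. Assumes `session` is already logged in as an admin
# (e.g. via a bot password); response paging is omitted for brevity.
import ipaddress
import requests

API = "https://en.wikipedia.org/w/api.php"
session = requests.Session()  # assumed to be authenticated

def deleted_contribs(ip: str) -> list[dict]:
    """Pages with deleted revisions attributed to a single IP."""
    params = {
        "action": "query",
        "list": "alldeletedrevisions",
        "adruser": ip,
        "adrprop": "ids|timestamp|comment",
        "adrlimit": "max",
        "format": "json",
        "formatversion": "2",
    }
    data = session.get(API, params=params, timeout=30).json()
    return data.get("query", {}).get("alldeletedrevisions", [])

# Even a small /28 means 16 separate API calls; a /16 would mean 65,536.
for addr in ipaddress.ip_network("198.51.100.0/28"):
    for page in deleted_contribs(str(addr)):
        print(addr, page["title"])
```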