HNMCP Focus Group Modules

This page documents a research project in progress. Information may be incomplete and change as the project progresses. Please contact the project lead before formally citing or reusing results from this page.


Welcome, and thank you for joining us. Please submit your responses to the modules below.

Module 4 - Evaluation Module

Guidelines/Tasks

As mentioned before, evaluation of these ideas should not consist of simple judgements. The challenge is to strengthen even the weakest ideas, highlighting strengths and considerations to incorporate into a few prototypes.

Please create your initial posts based on the following considerations (design exercises are incredibly challenging--don’t take the easy way out by saying “none” or “I don’t know”):


  1. This idea fulfils the goals of the brief because:
  2. This idea would work best in combination with:
  3. List some pros and cons of the idea. Explain why.
  4. Address the cons. How would you go about addressing the weakness to make the idea work better?
  5. List your top three ideas. Why are these your top choices?
Ground Rules
  1. Open the Mind: Participants should allow others to comment on their contributions and welcome edits, modifications, and additions to their ideas.

Don’t give in to your initial instinct of defending your idea. Asking why an individual has that opinion is a more productive alternative. Don’t judge an idea (words such as “good” or “bad” express judgment). Say why an idea should be approached differently. Seek to improve or add to ideas.

  2. Listen: Encourage input and collaboration. Learning conversations that involve diverse opinions and interests lead to more creative solutions equipped to address broad concerns.

Respect fellow participants and their opinions, perceptions, and ideas. All participants have individual experiences and knowledge that add insight into the complexity of this challenging problem.

  3. Peer Up: Seek to organize around the outcome. Look for “unlikely partners” rather than relying on top-down hierarchies or formal authorities.

Don’t be afraid to borrow from other online platforms or even different fields altogether, such as politics, education, or the arts.

  4. Share: Prepare to share not only ideas and goals, but also the outcomes. In true collaboration, no single person is responsible for any of the outcomes.

Explain why you think an idea will work. Why are certain features necessary for the solution to work?

  5. Act Globally: Aim to create outcomes that benefit multiple communities.
  6. Be Joyful: Having a cheerful attitude invites creativity. Negativity and pessimism can frustrate innovation.

Establish a Different Name for the Process

A single link to “Report Problems”, “Report Concerns”, or “Report Issues” would lead to a menu for different types of problems, such as technology, content, harassment, etc. This addresses the potential barrier to entry created by the term “harassment”: users may feel they cannot use a “Report Harassment” link unless the situation has severely escalated. A standalone “Report Harassment” link could also be vulnerable to retaliation or abuse by harassers and abusers.
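
As an illustration only, the routing behind such a menu could be as simple as the sketch below; every category name and target page here is hypothetical, not an existing MediaWiki page.

  # Hypothetical routing table for a single "Report Problems" entry point.
  # Every category name and target page below is illustrative only.
  REPORT_MENU = {
      "Technical problem":   "Project:Village_pump_(technical)",
      "Content dispute":     "Project:Dispute_resolution",
      "Harassment / safety": "Special:ReportConcern",  # hypothetical Special: page
      "Other concern":       "Project:Help_desk",
  }

  def route(category: str) -> str:
      """Send the reporter to the queue for their problem type; harassment
      is one entry among several, so following the link reveals nothing."""
      return REPORT_MENU.get(category, REPORT_MENU["Other concern"])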

Discussion of Idea 1

Do we really want to obfuscate the harassment link behind a menu in this way? Or do we want to have this page in addition to a page with a big red "Report Harassment" button? Tazerdadog (talk) 05:58, 22 November 2017 (UTC)

Thanks for your quick response. We were hoping to get some more information in the format outlined above in order to further develop the ideas presented. We would appreciate if you have an opportunity to flesh out your responses. Thanks again for your participation! JosephNegotiation (talk) 15:54, 23 November 2017 (UTC)

yes, if you ramped up teahouse to respond to issues, then you could triage which are harassment, which are incivility, and which are good faith mistakes. this would require investment in training the teahouse responders, and other teams of responders. Slowking4 (talk) 01:49, 24 November 2017 (UTC)

A variety of ways to report issues seems appropriate. I would rather see an inclusive system that addresses a variety of issues than one that singles out harassment as a specific concern, which perhaps has a more limited definition. 107.170.220.144 21:25, 27 November 2017 (UTC)

broadening the reach will make sure that you collect it all (but this is a wikimedia trope of fighting about the name rather than improving the process). need a landing page that is visible, with an easy-to-report box, either on wiki or confidentially. Slowking4 (talk) 19:50, 30 November 2017 (UTC)

Clearly Defined Standards

Smaller wikis seem to require clearly defined standards that are Wikimedia-wide rather than local to the community. As with the EDP process, the two-year inactive admin removal process, and perhaps other standards, there should be clearly defined standards that apply to all wikis. Depending on the standard, local communities could be given the choice to opt out. If all wikis for a given language follow the same standards, the standards would be especially helpful across Wikipedia vs. Wikibooks vs. Wikinews vs. Wikiversity, etc.

Discussion of Idea 2

yes, we have talked about code of conduct, and standard of practice, but the culture resists constraints on freedom of action by veterans. some education and training might change the culture. Slowking4 (talk) 01:51, 24 November 2017 (UTC)

I agree that some code of conduct and/or standard of practice should be in place across wikis. The culture resisting constraints on freedom of action by veterans would seem to be part of the problem and should be addressed. 107.170.220.144 21:25, 27 November 2017 (UTC)

Yes, this seems like a very important point. This could be built slowly, with some broadly consensual standards being defined first. Stewards seem equipped to deal with most of these issues, but some would run into matters of enforcement; maybe the proposal could state how enforcement would be done. Chico Venancio (talk) 22:25, 28 November 2017 (UTC)

Provide a Reporting Mechanism

A link in the sidebar leading to a Special: page where users could report their concerns would seem to be the most effective approach. Admins on the wiki and the impartial oversight group would be able to see the reports, with notifications at the top of the page when logged in. To maintain anonymity, local admins would not be able to see who contributed the report, but the impartial oversight group would. The reason for this approach is that someone may report harassment on their own behalf from a third-person perspective; I might post, "Dave is being harassed by Fred. See ...". Users placing a report would want to know when it has been seen and accepted by someone, and they would want to see some type of resolution. If the result is not satisfactory, they would want to be able to escalate their concerns, either once or twice: once would be to the oversight group; twice would be again to the local admins, requiring that a different admin respond, and then finally to the oversight group.
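
To make the visibility and escalation rules above concrete, here is a minimal Python sketch. Everything in it (class names, roles, stages) is hypothetical and not an existing MediaWiki feature; it only encodes the behavior described in this idea.

  from dataclasses import dataclass
  from enum import Enum, auto

  class Stage(Enum):
      LOCAL_ADMINS = auto()     # initial review by local admins
      SECOND_ADMIN = auto()     # optional second review by a *different* admin
      OVERSIGHT_GROUP = auto()  # final, impartial review

  @dataclass
  class HarassmentReport:
      report_id: int
      summary: str              # e.g. "Dave is being harassed by Fred. See ..."
      reporter: str             # stored, but shown only to the oversight group
      stage: Stage = Stage.LOCAL_ADMINS
      acknowledged: bool = False  # reporter is notified when this flips to True

      def view_for(self, viewer_role: str) -> dict:
          """Local admins see the report but not who filed it;
          the oversight group sees everything."""
          view = {"id": self.report_id, "summary": self.summary,
                  "stage": self.stage.name, "acknowledged": self.acknowledged}
          if viewer_role == "oversight":
              view["reporter"] = self.reporter
          return view

      def escalate(self, second_local_review: bool = False) -> None:
          """One- or two-step escalation, as described above."""
          if second_local_review and self.stage is Stage.LOCAL_ADMINS:
              self.stage = Stage.SECOND_ADMIN
          else:
              self.stage = Stage.OVERSIGHT_GROUP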

Discussion of Idea 3

Perhaps appeals (either the first or the second) should be to arbcom, if such a body exists on the project? I could see using emails in a clever manner to remove the need for/minimize the functionality required from a new special page. Perhaps link to a website which contains a form with fields for the relevant information, and reports are mailed anonymously to admins/oversighters who have been recently active. Tazerdadog (talk) 06:05, 22 November 2017 (UTC)

yes, we need clear reporting channels that are widely known. we should have an on the record wiki landing page, and an off the record confidential OTRS channel. we would need response teams trained to respond in a timely manner. Slowking4 (talk) 01:54, 24 November 2017 (UTC)

There seems to be consensus that a simple interface that supports anonymous reporting, combined with a trained response team or teams, is desirable. 107.170.220.144 21:25, 27 November 2017 (UTC)

User Training

The underlying issues indicate that we need to do a better job of training users in what the defined standards are (or will be once established), addressing the adversarial culture and bias, and training on the new reporting system. This should be professionally developed training, with interactive web content, videos, etc. It could have different parts: first for new users, second for regular users, and perhaps a third for power users (not admins). Admin training would be separate from this.

Discussion of Idea 4

The fundamental problem with this seems to be how many users avail themselves of the training. It'd be a nice resource to have, but unless we shoved it down people's throats, I don't think 1% of the population will go through this training, and if we start shoving it down people's throats, we start to create issues from that. Tazerdadog (talk) 06:08, 22 November 2017 (UTC)

yes, we can train users, just as we have w:Wikipedia:The Wikipedia Adventure. (maybe a little less structured) we can make training a remedy for behavior against the code of conduct. we can make training a prerequisite for admin candidacy. Slowking4 (talk) 01:58, 24 November 2017 (UTC)

I'd be willing to bet less than 1% of wikipedians have seriously gone through the wikipedia adventure. I don't think the community will go for training as a punishment for bad behavior, although it's an option I guess. As far as making training a prerequisite for adminship, on en.wiki, the ONLY prerequisite for running for adminship is having an account, and there's currently a contested RFC trying to get it to de facto extended confirmed. Requiring training as a prerequisite for adminship wouldn't have a snowball's chance in hell on enwiki. Tazerdadog (talk) 10:32, 24 November 2017 (UTC)
as much as that? the fact that admins are untrained certainly shows, doesn't it? i'm kinda to the "shove down the throat" stage. i wonder if we have a silent majority for it? we could certainly start with the circle of trained competence. Slowking4 (talk) 18:50, 24 November 2017 (UTC)
here is some 'Train the Trainer' [1], and a grant proposal [2]. Slowking4 (talk) 01:02, 27 November 2017 (UTC)

I don't think user training can be mandatory, but it should be available. It may help when educating users who claim harassment but simply didn't understand cultural norms. There would also need to be some type of training or materials that transparently explain the complaint process, how it is handled, and what to expect after filing a complaint. Admin training should be expected for those in the role, and required for those who respond to harassment claims. Separately, anyone found guilty of harassment should be required to complete some type of training before being allowed to contribute further. 107.170.220.144 21:25, 27 November 2017 (UTC)

Admin Training

Admin training would include at least two parts. First, address the cultural issues identified, so that what is expected of admins is clearer and more standardized. Second, address how to respond to harassment reports, and how and when to escalate issues when necessary.

Discussion of Idea 5

This seems sufficiently similar to idea 4 that they can be discussed together in my mind. Tazerdadog (talk) 06:09, 22 November 2017 (UTC)

to the extent that admins are public facing, their responsibility to follow a standard of practice, in addition to a code of conduct, is higher. we should have HR training in how to deal with the public. we need to increase the standard of behavior for admins; training can be a response for not following the standard of practice. Slowking4 (talk) 02:02, 24 November 2017 (UTC)

Assuming good faith, harassment is at its base a misunderstanding of standard behavior on one or both sides. I agree that "training can be a response for not following the standard of practice", and this should apply to admins and users who do not follow the standard in regard to harassment. 107.170.220.144 21:25, 27 November 2017 (UTC)

i wish i could agree that it is a "misunderstanding". rather, among a small minority, it has become a deliberate strategy as part of fear, uncertainty, and doubt. so enforcing a standard will be important to maintain NPOV. Slowking4 (talk) 19:55, 30 November 2017 (UTC)

Establish Impartial Decision Makers

Part of the underlying issue is that there is no impartial oversight group. Some admins aren't really concerned about abusing others because there isn't any impartial oversight. In other cases, the only engaged admins are either the abusers themselves or the targets of abuse, and can't respond impartially. This impartial group would be a trained subset of stewards, who are already trusted with an admin oversight role. It may be necessary to have more stewards to address this, and specifically to recruit stewards with some type of counseling/mediation training or experience. Depending on interest and demand, the position could be a volunteer role or a paid-by-WMF role.

Discussion of Idea 6

Is the phenomenon of strongly biased admins something that is more prevalent outside of en.wiki? This is not really a problem in my experience, so I look at this and think it's overkill. Tazerdadog (talk) 06:16, 22 November 2017 (UTC)

it is a problem in my experience. yes, we need additional training for stewards as HR managers. as they tend to work across languages and cultures, they would need training in cultural issues. and we need a process to discuss steward misbehavior. we would need to train an OTRS harassment response team to supplement the safety team. counseling/mediation skills are vital. Slowking4 (talk) 02:06, 24 November 2017 (UTC)

I agree that it seems to be an issue on smaller wikis, and support the ideas Slowking4 recommends. 107.170.220.144 21:25, 27 November 2017 (UTC)

I agree with Slowking4 as well; this seems like a good step. Chico Venancio (talk) 01:27, 29 November 2017 (UTC)

A Page to Identify Recently Active Admins and Oversighters for Simpler Cases

Create a page listing the three most recently active admins and oversighters. Direct people there with links from other policies. Have a template on the page that populates a report for the admin. Include collapsed tutorials for new users on things like how to make a diff. The template gets posted on the active admins' talk pages, and an admin takes care of the problem.
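
Identifying recently active admins is already possible with the standard MediaWiki API, so a bot could keep such a page up to date. A rough Python sketch follows; the allusers and usercontribs calls are real API modules, but the selection logic is simplified (no pagination, no caching).

  import requests

  API = "https://en.wikipedia.org/w/api.php"  # any wiki's api.php would work

  def recently_active_admins(n=3):
      """Return the n admins with the most recent contributions.
      A sketch only: pagination is omitted, and a real page would be
      populated by a bot or Lua module with caching."""
      session = requests.Session()
      # 1. List accounts in the sysop group.
      admins = session.get(API, params={
          "action": "query", "list": "allusers",
          "augroup": "sysop", "aulimit": "500", "format": "json",
      }).json()["query"]["allusers"]

      # 2. Look up each admin's latest contribution timestamp.
      last_edit = {}
      for user in admins:
          contribs = session.get(API, params={
              "action": "query", "list": "usercontribs",
              "ucuser": user["name"], "uclimit": 1, "format": "json",
          }).json()["query"]["usercontribs"]
          if contribs:
              last_edit[user["name"]] = contribs[0]["timestamp"]

      # 3. Sort by timestamp, newest first, and keep the top n.
      return sorted(last_edit, key=last_edit.get, reverse=True)[:n]

  print(recently_active_admins())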

Discussion of Idea 7

not sure about burdening the most active; would rather have a landing page with a trained response team. Slowking4 (talk) 02:10, 24 November 2017 (UTC)

A Special:Staff (or other name) page that shows recently active users with admin rights might be helpful in general, as well as for addressing simple cases. 107.170.220.144 21:25, 27 November 2017 (UTC)

We use X! Tools' adminstats to see this on the Portuguese Wikipedia. I am not sure it really solves the feeling of helplessness for users who are being harassed. Chico Venancio (talk) 01:36, 29 November 2017 (UTC)

Community Discussion for Difficult Cases

The community on each project has to discuss these. This is non-negotiable from the community's perspective. If harassment is so private that the community can't address it, it needs to go to arbcom or oversighters. If these processes are broken, then they need to be fixed from within. Trying to do an end-run around the community when the parties involved are two or more admins/power editors is likely to end badly.

Discussion of Idea 8

If there are globally accepted standards, perhaps with opt-out clauses as needed for some individual community or culture, and external advocates to assist those who feel they are being harassed, then the community can discuss difficult cases. But small wikis where there may only be a few active admins and a delicate balance of power may not have any effective way to discuss difficult cases and move forward otherwise. 107.170.220.144 21:25, 27 November 2017 (UTC)

The problem is how we decide to define each community. The "Wikimedia Movement" has historically decided to give absolute autonomy to each project (generally monolingual and specific to one activity), but if this definition is broadened to include a general minimum decorum expected across all projects, then I could agree to this. To simply leave contributors to be harassed until they can muster enough political support among what we currently define as "a community" is the current status quo, and is not acceptable. Chico Venancio (talk) 01:40, 29 November 2017 (UTC)

if we could strongly advise training, as we advise EDP, maybe we could improve small wikis. need to be sensitive to cultural issues, but a safe space is a part of free knowledge. Slowking4 (talk) 20:01, 30 November 2017 (UTC)

Support Staff for Harassed Users

It will look like staff supporting admins in fighting harassment both reactively and proactively. It will be used whenever admins need information to make a decision on a case. The solution will specifically accomplish an improvement in the time taken to make a decision and a reduction of the burden on administrators. Users who interact with this solution will expect it to produce faster and more decisive action to curb harassment.

Discussion of Idea 9

This sounds really expensive if we are paying these staff, and really unreliable if we are not. Doing this with volunteers could create another ORES-level project. Tazerdadog (talk)

it is working for teahouse, and wikihow welcome - but yes, some support with staff collaboration will make it sustainable. need to build response teams, with a mix of volunteers and staff. Slowking4 (talk) 02:13, 24 November 2017 (UTC)

This seems to be a necessary cost. It requires dedicated, trained "professionals" working alongside volunteers to respond to issues effectively. 107.170.220.144 21:25, 27 November 2017 (UTC)

Tazerdadog, I don't see this as necessarily prohibitively expensive. And I agree with Slowking4 here: a mixture of paid staff and volunteers is most probably the best answer. Chico Venancio (talk) 01:43, 29 November 2017 (UTC)

“Secret” to Help Harassment Victims

See Secret (app). It will be located on the web and announced to the Wikimedia communities. The solution will specifically provide a place to seek help and lower the toll of harassment on users. Users who interact with this solution will expect it to produce a space where they can vent their worries about their roles in Wikimedia communities and the harassment they have endured, without fear of repercussion. This idea will make a good product/solution because it might help users find ways to deal with the frustration caused by harassment. It will achieve the desired end result: improved satisfaction of harassed users. This solution promotes ease of use, anonymity, and a safe space.

Discussion of Idea 10

yes, this would be an OTRS alternative, but would require a team of trained counsellors. Slowking4 (talk) 02:15, 24 November 2017 (UTC)

There seems to be agreement that anonymous reporting without fear of repercussion should be part of the solution. 107.170.220.144 21:25, 27 November 2017 (UTC)

Auto Tagging Harassing Content

It will be implemented as a new ORES model that identifies harassment and tags it. It will be used when edits are made on Wikimedia projects. The solution will specifically reduce the burden on humans and improve response time to harassment. Users who interact with this solution will expect it to produce accurate assessments of harassing edits. This idea will make a good product/solution because everyone likes automation and AI. It will increase the efficiency, speed, and response rate of anti-harassment systems. These attributes are necessary because the community is jaded toward anti-harassment initiatives; this solution will be viewed by the rest of the community as a smart way to tackle the issue because it will improve responses.
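
As far as we know, ORES does not currently ship a harassment model, so the following is only a toy sketch of the general shape such a classifier could take. The training examples, features, and threshold are all placeholders; a real model would need thousands of human-labelled edits and careful evaluation, and should only flag edits for human review.

  # Toy sketch of the kind of text classifier such an ORES model might wrap.
  # The training data, features, and threshold are all placeholders.
  from sklearn.feature_extraction.text import TfidfVectorizer
  from sklearn.linear_model import LogisticRegression
  from sklearn.pipeline import make_pipeline

  texts = ["thanks for fixing the citation",
           "you are an idiot, stay off my article",
           "please see the talk page discussion",
           "I will make you regret editing here"]
  labels = [0, 1, 0, 1]  # 1 = harassing, 0 = not

  model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)),
                        LogisticRegression())
  model.fit(texts, labels)

  def tag_edit(comment: str, threshold: float = 0.8) -> bool:
      """Tag for *human review* above the threshold; never act autonomously."""
      return model.predict_proba([comment])[0][1] >= threshold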

Discussion of Idea 11

How accurate can we realistically make this AI? I see a lot of potential here, even if the system isn't perfect. Pinging @MusikAnimal: to see if we can get an informed opinion (or if he can point us to someone who can give that opinion). Tazerdadog (talk) 06:28, 22 November 2017 (UTC)

AI would leverage the human response, and feed the input workflow for the team. we already auto-filter vocabulary; could certainly flag vocabulary for action by humans. Slowking4 (talk) 02:17, 24 November 2017 (UTC)

It seems reasonable that some type of pattern matching could be used to automate the identification of common harassment language and/or actions. 107.170.220.144 21:25, 27 November 2017 (UTC)

Tazerdadog To me, the accuracy depends on how much effort we devote to this. Not only in terms of technical development, but in terms of the members of the movement actually helping with the tests and using the tools. As an AI problem, it does seem that it would need a lot of development, but it is not beyond the realm of possibility, IMHO. Chico Venancio (talk) 01:47, 29 November 2017 (UTC)

An Investigations Tracking System

My solution will look like a harassment reporting system that can receive private information and let involved users track investigations. It will be located on Wikimedia servers and maintain private and public information about each case. It will be used when cases are reported. The solution will specifically provide an effective tracking system for cases and investigations. Users who interact with this solution will expect it to produce a better way to provide information and ask for action. This idea will make a good product/solution because harassment victims need a way to share evidence and admins need a way to track cases.
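
One way to picture the public/private split this idea calls for is a minimal database sketch; the table and column names are purely illustrative. Public rows support aggregate accountability metrics, while private evidence sits in a separate, access-restricted table.

  import sqlite3

  # Hypothetical schema: one public row per case for accountability metrics,
  # with private evidence kept in a separate, access-restricted table.
  conn = sqlite3.connect("cases.db")
  conn.executescript("""
  CREATE TABLE IF NOT EXISTS cases (
      case_id     INTEGER PRIMARY KEY,
      wiki        TEXT NOT NULL,                 -- project the report came from
      status      TEXT NOT NULL DEFAULT 'open',  -- open / investigating / resolved
      opened_at   TEXT NOT NULL,
      public_note TEXT                           -- visible summary of the outcome
  );
  CREATE TABLE IF NOT EXISTS private_evidence (
      case_id     INTEGER REFERENCES cases(case_id),
      reporter    TEXT NOT NULL,                 -- never shown publicly
      evidence    TEXT NOT NULL                  -- diffs, links, correspondence
  );
  """)

  # Aggregate metrics for accountability, without exposing private data.
  for status, count in conn.execute(
          "SELECT status, COUNT(*) FROM cases GROUP BY status"):
      print(status, count)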

Discussion of Idea 12

Something like Phabricator, but for harassment instead of bugs? Tazerdadog (talk) 06:30, 22 November 2017 (UTC)

yes, you will need a tracking system, for metrics to report aggregate results for accountability. you would need input from all response processes, with an update of the action taken and the outcome, and user feedback. Slowking4 (talk) 02:19, 24 November 2017 (UTC)

Agreed that a tracking system will be necessary. A centralized tracking system would also be able to identify users who are abusive across wiki projects. 107.170.220.144 21:25, 27 November 2017 (UTC)

Tazerdadog, well, yes, in the sense that Phabricator is a tracking system and this would be as well. It seems like a great tool to help achieve the balance between transparency and privacy. Chico Venancio (talk) 01:49, 29 November 2017 (UTC)

General Discussion

Rather than spread the discussion out over a bunch of the proposals, let's address the elephant in the room now. Training users to deal with harassment seems to be a common thread in a number of these proposals. I'm pretty convinced this isn't going to work, but let's suspend judgement for now and just discuss the problem.

First of all, on en.wiki, requiring harassment training for advanced permissions is going to go over like a lead balloon. We can (and probably should) make such training available. However, we then wind up with a leading-a-horse-to-water problem. If we just released a training course, I don't think it would be widely used. There is a widespread ethos on wikipedia that anyone can do any job with no special training. That means a training course will be treated suspiciously and largely ignored. If one of these is made, it has to be very fast and very well made. Five minutes of professionally made content seems like the maximum that will get through to a sufficient number of wikipedians.

A special case might exist for training oversighters. These users have made enough of an investment in the project that they'd be largely willing to sit through a training, and they're in a position to use that training. That said, professionally made videos for 20 or 30 users (maybe up to a couple hundred if I include all OSers in all languages plus stewards) seems like a colossal waste of resources. Tazerdadog (talk) 10:49, 24 November 2017 (UTC)

we have been trying technical solutions to people problems for 15 years; we have known since 2009 that we need to invest in culture change [3]; best to get started now. i am ready to cram down the lead balloon. the veterans can change the culture, or we can impose it by hostile takeover. they can be as truculent as they want to be, so long as they act in a professional manner. we will need graduate-level human resource management training. OCLC just did an online course for librarians,[4] so can wikimedia. we can start with stewards and arbcom, but every public-facing volunteer will need training. Slowking4 (talk) 02:13, 26 November 2017 (UTC)

Change management can be a gradual process, slowly raising overall standards. It can also be peer-based, with one admin encouraging others to go through the training. It could/should also be mandatory for any admin or user found to be harassing others. It is possible to stop most harassing behavior, but it requires dedication, authority, and willingness to make it happen. 107.170.220.144 21:25, 27 November 2017 (UTC)

I am with Slowking4 on this. But I do not see it as necessary for this to happen rapidly. Making the training available would be the first step, and we can look to the Community Capacity Development program as evidence that there are a lot of editors willing to participate in trainings for whatever reason. How to scale training to a large section of the movement is a problem that can be taken on with small steps; it doesn't have to start as a requirement, but could begin by having the voices of those who attended training carry more weight in decisions about harassment cases. Chico Venancio (talk) 01:56, 29 November 2017 (UTC)

i agree it is urgent that it happen, but not rapidly; the capacity development is a good model. timeline determined by ability to build human resources. can use circles to improve process over time. Slowking4 (talk) 20:25, 30 November 2017 (UTC)
I think the Community Capacity Development program is a great model, but it needs a much larger scale. And perhaps other, online-based models. Chico Venancio (talk) 20:34, 1 December 2017 (UTC)