Research:Beyond the Individual: Community-Engaged Design and Implementation of a Framework for Ethical Online Communities Research/Results Draft


Executive Summary

Wikipedia Community Values.

  1. Support for Editors: Safety, Respect, and Recognition for Effort
  2. Knowledge Production: The Five Pillars
  3. Systems for Administration: Democracy

Research Community-Level Harms/Benefits.

  • Benefits are closely tied to supporting community values: maximizing efficiency, improving discourse, and recognizing committed editors.
  • Harms arise when research demands more from the community than it contributes in value: failing to publish results, paywalling results, identifying non-novel findings, and failing to generalize to the community.

Decisions about what research takes place should...

  • Support the community's well-being.
  • Emphasize privacy.
  • Enforce community guidelines for pre-registration, transparency, and open publication.

Areas for improving how Wikipedia makes research decisions.

  • Increase the visibility of research when consulting the community.
  • Enforce guidelines for researchers to maintain stronger user profiles.

Instructions

This is an evolving document based on two research workshops with 15 editors conducted on 11/11/23 and 04/11/24. Given the small pool of participating editors, please consider leaving your thoughts or feedback so we can revise this draft to better reflect the broader community.

  • Please use the talk page/discussion feature to share your thoughts. If you prefer to leave private feedback, please send an email to zentx005@umn.edu.
  • Open to all editors. We invite feedback from all editors, especially those who didn't attend the workshops.
  • All forms of feedback are welcome. This document is intentionally concise and your feedback can be too. A simple +1/-1 to any section or item on the talk page helps us know that you agree/disagree. Longer reflections are equally appreciated.
  • Final multi-stakeholder workshop. We will conduct a final workshop with Wikimedia Foundation employees, Wikipedia researchers, and editors on 04/30/24 from 3-5pm CST using feedback from this document. We are interested in supporting more editors during those discussions. If you are interested, please reach out to zentx005@umn.edu or leave a note on my talk page.

Preliminary Findings on Ethical Online Communities Research [Draft]

What values are important to the broader Wikipedia community?

We asked editors to discuss and rank values present in the Wikipedia community by importance.

The community’s values support Wikipedia, the encyclopedia, and its Five Pillars, but may diverge from them in priority and emphasis. Below we characterize three high-level community values ranked by importance:

  1. Support for Editors: The most important community values support editor collaboration and prioritize the people who contribute. Where the five pillars leave no room for personal opinions, the community values of Inclusivity, Safety, Respect, and Consensus acknowledge that community members have different experiences and are used to achieve neutrality. Additionally, Mutual Recognition for Effort (e.g., article nominations or barnstars) is important for building a sense of community that centers its committed editorial base.
  2. Shared Content and Knowledge Production: These are values that support Wikipedia’s mission and the five pillars. These values are already well-documented and support making a better encyclopedia.
  3. Community Systems for Administration: Finally, the community values Democracy and the horizontal justice systems for elected and accountable administrators. Editors felt WP:5P5 was closely related to this. Instead of enforcing firm rules, guidelines should be framed by the above community values.

What benefits/harms should future research strive for/avoid on Wikipedia?

We asked editors about ways future research can benefit and harm the community.

Generally, editors have positive outlooks on how research can bring value to the community. Community benefits are closely tied to the values outlined above. Examples include:

  1. Maximizing the efficiency of editors' contributions.
  2. Improving community discourse.
  3. Recognizing committed editors.

Community-level harms can be characterized as research that demands more from the community than it contributes in value. Wikipedia’s guidelines for research capture many aspects of how to avoid harms. Here, we outline a distinct set of community harms salient to editors:

  1. Failing to publish results or publishing results in non-open-access venues.
  2. Studying mundane or already well-known topics that yield non-novel research findings.
  3. Results that fail to generalize to the community as a whole, including, but not limited to, oversampling and overburdening subgroups outside the core editor base (e.g., affiliate groups, administrators, and minority groups).

How should people make decisions about future research on Wikipedia?

We asked editors to discuss, in the ideal situation, how different stakeholders should be involved in making decisions about what research is conducted with the Wikipedia community.

Core editors, elected administrators, and research committees are central to the decision-making process.

  1. All editors should have the opportunity to contribute to research decisions and should be the voice behind others’ decisions. Editors feel that increased contribution to the community merits a more central role in this process.
  2. Wikipedia administrators are elected by and accountable to the community. While this role doesn’t necessarily come with additional knowledge for making research decisions, administrators have a vested interest in the community’s well-being.
  3. A research committee (e.g., the former RCom) could maintain tools and guidelines for conducting research on Wikipedia. Though this body no longer exists, editors envision it would be well-equipped to make actionable changes to Wikipedia’s policies for research activity.

The way editors envision research decisions being made mirrors existing structures, with a few exceptions. Briefly, we reinforce the aspects of these processes that are important to editors.

  1. Decision makers should evaluate the work’s integrity and how it supports the community’s well-being.
  2. Research should emphasize privacy and anonymity when representing the community. For a detailed reflection on this central issue, we’d highly encourage you to read the Wikimedia Foundation’s Whitepaper Draft on Privacy.
  3. Researchers must comply with policies for pre-registration of proposed research, transparent documentation, and open publication.

We highlight two potential areas for improvement.

  1. Editors want to be notified about community research in a trustworthy way, through a medium they actually see. Only a small pool of editors uses meta-wiki; relying on it as the primary venue for soliciting editor feedback limits the visibility of research and biases who in the community contributes to making research decisions.
  2. Elaborating on researcher transparency, the track record of researchers is important to consider when making research decisions. Understanding the who behind the research is necessary in order to evaluate the what. User profiles for researchers should contain information about their research history, motivations for research, and involvement in the community.