Grants:Project/JackieKoerner/Addressing Implicit Bias on Wikipedia

status: not selected
Addressing Implicit Bias on Wikipedia
summary: I will identify biases affecting Wikipedia to improve neutrality and reliability.
target: English Wikipedia
amount: 72,155 USD
grantee: Jackiekoerner
contact: Jackie.koerner(_AT_)gmail.com
volunteer: Masssly
created on: 05:06, 21 February 2020 (UTC)


Project idea

What is the problem you're trying to solve?

Bias is a problem for Wikipedia. As contributors we try to be as neutral as possible, but bias is part of everyone, and it influences our actions, including how we build an encyclopedia. Even though contributors aim to be neutral, and policies, practices, and tools have been developed to support quality encyclopedia development, bias still creeps into Wikipedia simply because it is built by people.

No matter how objective we aim to be, bias complicates everything; it is our default setting. Implicit bias is unconscious: it affects people's actions, behaviors, and understanding without their awareness. It is learned through socialization beginning in childhood and reinforced through interactions with like-minded individuals and through messages in social media, literature, and other cultural and environmental channels. Without challenging our biases and reflecting on how they shape our behavior, they can become problematic. They already have become problematic for Wikipedia.

We already know bias exists on Wikipedia; what we need now is to understand its impact. We know in broad terms how bias affects Wikipedia: it prevents knowledge equity from being realized, creates barriers to participation, inhibits communication and collaboration, and limits content through unequal notability and verifiability standards.[1] Bias steepens the technological learning curve and produces both potential and actual adversarial experiences with community members. These situations both arise from and are perpetuated by implicit bias.[2] We need to "break down the social, political, and technical barriers preventing people from accessing and contributing to knowledge." It is not enough to say broadly that bias exists on Wikipedia; we need to point to it and create solutions. This is what I will do with this project.

The existing content and policies are driven by the interests of the most active contributors, who are largely white and cisgender male.[3] Studies have shown that 84% of Wikipedia articles focus on Europe and North America.[4] Additionally, many existing articles about emerging communities are authored by contributors in developed communities, who construct content through their implicit biases, which can lead to imbalanced or biased content.[5] This bias does not affect only articles about emerging communities; it is simply more obvious there.

Some of you might be thinking that bias only applies to some people, since Wikipedians aim to be neutral and to follow policy. But bias surfaces when people are upset, stressed, angry, or feel threatened. We are all passionate about Wikipedia, and when someone does something that challenges our beliefs about it, our bias drives our immediate reaction. For example, this could be an edit to your favorite actor's article about sexual assault allegations: you initially want to believe they are innocent, or you might disagree with the way the content is written. Or think about an infuriating RfC (Request for Comment) you participated in, or a dispute about notability and reliability. Our reactions to these situations are driven by bias. These are just a few examples of where bias shows up on Wikipedia; it is everywhere. We already know this. What we need to learn is how it is impacting Wikipedia. I want to find out where it shows up; how it affects neutrality, notability, and verifiability; and what we can do about it. Knowing where implicit bias affects content creation will help us develop reliable, neutral, high-quality content. Making implicit bias something we recognize in ourselves and challenge in others will make our community and content stronger.


What is your solution to this problem?

I will study bias on English Wikipedia to find out how it is impacting neutrality, notability, and verifiability. I will use the data from the investigation to inform practice and to make recommendations for policies and policy changes. Additional solutions may be created based on the data from the investigation. This study will also provide strong research content for Wikipedia contributors to use in developing solutions or further research.

Project goals

This proposed project will fulfill the following goals:

  1. Conduct an in-depth investigation of bias on English Wikipedia to better understand the problem.
  2. Interview Wikipedia contributors about their experience with bias.
  3. Review content pages on Wikipedia to find bias.
  4. Review Wikipedia related content off-wiki to find bias.
  5. Publish a narrative report about bias on English Wikipedia.
  6. Use data to develop recommendations to change implicit bias’ impact on Wikipedia.
  7. Increase knowledge about how bias impacts participation, policy, and infrastructure.
  8. Create recommendations for policies and policy updates.
  9. Provide a study design template for other communities to adapt and reuse.

Project impact

How will you know if you have met your goals?

During this proposed project, I will observe and review talk pages, discussion pages, and other locations where conversations take place between contributors, whether on- or off-wiki. I will not be able to review all pages, but a sampling will be sufficient. I will begin with topics where bias has an overt impact, for example, gendered topics. Then I will examine conversations on popular pages, the Village Pump, Articles for Deletion, Requests for Comment, policy and procedure pages, and others as deemed appropriate for the investigation.
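
To give a concrete sense of how such a page sample might be drawn, here is a minimal sketch using the public MediaWiki API to pull a random set of article talk pages. The namespace, sample size, and user-agent string are illustrative assumptions, not part of the study design, which will also use purposive sampling of the venues named above.

  import requests

  API = "https://en.wikipedia.org/w/api.php"

  def random_talk_pages(n=10):
      """Draw a random sample of article talk pages (namespace 1) to review."""
      params = {
          "action": "query",
          "format": "json",
          "list": "random",      # MediaWiki's built-in random-page lister
          "rnnamespace": 1,      # namespace 1 = article "Talk:" pages
          "rnlimit": n,
      }
      resp = requests.get(API, params=params,
                          headers={"User-Agent": "bias-study-sketch/0.1 (example)"})
      resp.raise_for_status()
      return [page["title"] for page in resp.json()["query"]["random"]]

  # Example: list ten randomly selected talk pages to skim for discussion tone.
  for title in random_talk_pages(10):
      print(title)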

I encourage community participation in submitting examples of implicit bias documented on-wiki, found in conversations off-wiki, or drawn from personal experience. I will study policies and procedures. Throughout the proposed project, I will continue to review material published by other authors about bias on Wikipedia and related topics.

The study and supporting information will be developed into a narrative report to be published on Meta upon study completion. The narrative will give readers an overview of the information and the effect of bias on Wikipedia, while the full report will contain more detailed information about bias, its impact, and what can be done.

Do you have any goals around participation or content?

Community participation, through both feedback and direct involvement, will enhance the activities of this proposed project. The broader Wikimedia community will be able to follow the project's progress on Meta and provide feedback during the draft stages. A community discussion session can be organized among Wikimedia volunteers who show interest.

The community will also participate by sharing information about this project and raising awareness of it.

I plan to have community members participate in interviews and share examples of bias with me. I also seek their input on the report and the policy recommendations to be developed.

I will convene with community members whenever possible to discuss the results and their implications, via online communication platforms (video chat or call).

I welcome additional community participation throughout the project.

Project plan

Activities

The methodological approach for this proposed project is grounded theory. This means I go into the study with a base knowledge of the subject, in this case bias, and let the findings emerge from the data rather than from a predetermined hypothesis. The information found through this proposed project will drive the solutions I develop and tell me what further research needs to be done.

For this proposed project, I will look at written dialogue, listen to experiences through semi-structured interviews, and review policies and procedures. The written dialogue will consist of talk pages, discussion pages, and other written conversations between contributors, both on- and off-wiki. The semi-structured interviews will take place via audio or video call, depending on the individual's comfort level. "Semi-structured" means I will have a list of questions and topics to discuss but will allow the interview to become more conversational when the material discussed is relevant to the study. If the participant agrees, the interview will be audio-recorded so I can review it later for clarification (see Privacy below). All audio recordings and identifiable information will be kept strictly private.

Additionally, the policies and infrastructure of English Wikipedia will be examined for implicit bias. Throughout the proposed project, I will continue to review published materials about Wikipedia, as new findings by other authors and researchers will inform and guide this work.

As noted above, community participation will enhance these activities: the broader Wikimedia community will be able to follow the project on Meta, provide feedback about educational materials in the draft stages, and join discussion sessions organized with interested Wikimedia volunteers.

The investigation and supporting information will be developed into a narrative and a full report, both published on-wiki upon completion. The narrative will give readers an overview of the findings and the impact of bias on contributors; the report will contain more detailed information.

In an effort to remain reflexive and as unbiased as possible, I will maintain a research journal on-wiki. Keeping a research journal is an established practice in qualitative research: it helps the researcher identify their own bias and keeps a record of thoughts and interpretations. Essentially, the researcher is the tool with which the data are reviewed. The journal aids transparency, so readers and researchers interested in replicating the project can trace how the researcher reached these conclusions.

Because travel to various communities is uncertain given the Covid-19 pandemic, I will host online community discussions and virtual engagement about the study and its implications whenever possible, and will present at virtual conferences and meetings. Additionally, I welcome discussions with members of other communities about replicating and adapting this study in their communities, now or in the future.

Covid-19 Planning

Due to the Covid-19 pandemic, this proposal has changed from its original format: travel and attendance at in-person conferences have been removed from the activities and budget.

This may actually work out for the better: the proposal will cost less, and outreach will be greater, since I can engage with more communities virtually. This is beneficial because travel is not always feasible for everyone in our communities.

Privacy

While the Wikimedia community does not have an Institutional Review Board (IRB), I take the privacy of participants seriously and will adhere to the privacy standards I have followed on other projects that did require IRB approval. The materials from interviews will be anonymized. If a participant gives consent, the full transcript of their interview will be published; otherwise, only excerpts and the essence of the interview will be published. In either case, participants will be able to review the written content from their interviews before publication in order to correct or clarify it. Names and other identifying information about participants will not be published. Quotations from off-wiki conversations will not be published; only the essence of those conversations will be, in order to protect participants' identities and to avoid violating the trust these communities have developed in their off-wiki communication. If I am able to access off-wiki communication on asynchronous communication platforms (Slack, Telegram, etc.), I will purely observe: no quotes will be taken and no names identified. Only observations of the essence of the conversation, as it pertains to and derives from bias, will be included in the project notes and published documents.
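
As an illustration of the anonymization step described above, the sketch below replaces participant names with neutral codes before an excerpt is published. The names, codes, and matching approach are hypothetical; real anonymization would also need to catch indirect identifiers such as usernames, locations, and community roles.

  import re

  # Hypothetical mapping from participant names to neutral codes; not real data.
  PARTICIPANTS = {"Alex Example": "Participant 1", "ExampleUser42": "Participant 2"}

  def pseudonymize(text, mapping):
      """Replace each known participant name with its neutral code."""
      for name, code in mapping.items():
          text = re.sub(re.escape(name), code, text, flags=re.IGNORECASE)
      return text

  excerpt = "ExampleUser42 felt the notability discussion was stacked against them."
  print(pseudonymize(excerpt, PARTICIPANTS))
  # -> "Participant 2 felt the notability discussion was stacked against them."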

Budget

Budget item | Description | Amount (USD)
Software | 9-month subscription to Dedoose (my chosen CAQDAS, i.e. qualitative data analysis software), for the coding phase | $134.55 ($14.95/mo.)
Salary | Full-time, 12-month term, based on the average salary of a qualitative research analyst | $72,000 ($34/hour, 40-hour work week)
Translation | Translation of research findings and produced educational materials | $25 per page, per language
Total (without translation costs) | | $72,134.55

In addition to serving as the qualitative research analyst, I will schedule and transcribe interviews, code data, and write and edit the narrative and report.

Community engagement

I will engage with communities and organizations with an interest in the proposal's subject matter, both on- and off-wiki, such as:

  1. WikiProjects focused on diversity and systemic bias (for example: English Wikipedia's Gender Gap Task Force, English Wikipedia's Women in Red, WikiProject:Disability)
  2. Groups that work both on- and off-wiki on systemic bias (e.g. AfroCROWD, Art+Feminism, Wiki Loves Women, Wiki Loves Africa)
  3. User groups focused on diversity and systemic bias (e.g. WikiWomen)
  4. Whose Knowledge, which "aims to correct the skewed representations of knowledge on the internet."

This list of communities and organizations was developed because of their existing knowledge of, and experience with, bias on Wikipedia, whether as a community, project, or organization. The broader Wikimedia community can follow this proposed project on Meta and provide feedback about educational materials in the draft stages, and a community discussion session can be organized among Wikimedia volunteers who show interest. The community interested in changing implicit bias' impact on Wikipedia will make this proposed project stronger. Suggestions and community participation are welcome! This list will be updated as further community engagement occurs.

Get involved

Participants

Jackie Koerner is an independent qualitative analyst studying and writing about bias, inclusion, and equity. She holds a PhD in Higher Education and Qualitative Research from Saint Louis University. Recent research includes studying the experiences of college students with disabilities, the experiences of veterans on college campuses, and the inclusion of international students on campus.

Since 2016 she has volunteered with the Wikipedia community: planning conferences and local events, editing content as a visiting scholar, and working with the Community Health Working Group as part of the Wikimedia 2030 strategy process.

  • Volunteer This sounds like a great idea I'd be happy to help out with. —M@sssly 19:20, 12 March 2020 (UTC)

Community notification

Endorsements

  • need longitudinal study of bias and community. if you do not like the scope, write your own scope and put this first class researcher on expense account. Slowking4 (talk) 01:14, 11 March 2020 (UTC)
  • Support The problem of bias in the community is clearly important, but wicked hard, so qualitative projects like this one can help uncover the nature of the problem in ways that point to specific solutions down the line. I worry that this proposal might face headwinds as it is relatively open-ended, and kind of vague in terms of goals, but this is consistent with the inductive methodological approach. You can't know what you'll find before you look! I believe that Jackie is a qualified grounded theory researcher fully capable of this project and that the report she produces will provide insights of use to the community and future projects exploring more narrowly focused ideas about addressing implicit bias. Groceryheist (talk) 14:45, 12 March 2020 (UTC)
  • Support We are definitely short of qualitative projects run by seasoned Wikimedians and addressing this timely issue. Pundit (talk) 15:39, 12 March 2020 (UTC)
  • Support The problem is significant but can also be subtle. I appreciate the author's willingness to follow this thread where ever it may lead. Taking a broad perspective and mapping this landscape sounds like fuel for further inquiry and potential interventions from many different sources and stakeholders. Looking forward to the results of this project! Khascall (talk) 17:34, 12 March 2020 (UTC)
  • Support Bias, in various forms, has been identified as a significant problem for the Wikipedia community and this research builds on previous studies by Ford et al. I look forward to reading the results. Doctor 17 (talk) 07:31, 14 March 2020 (UTC)
  • Support This issue needs more research, and being familiar with the work of Jackie Koerner, I believe they are well-suited to facilitate this work. --Rosiestep (talk) 15:25, 16 March 2020 (UTC)
  • Support per Slowking4, Groceryheist, Pundit, Rosiestep, and common sense. Gamaliel (talk) 15:44, 16 March 2020 (UTC)
  • Support per everyone above. Also as a citizen of the "Global South" and a person with a disability I am acutely aware of this problem. Dodger67 (talk) 16:51, 16 March 2020 (UTC)
  • Support This is important and needed work that can be done with the guidance of an experienced scholar and active community leader, Jackiekoerner. -Mozucat (talk) 19:50, 16 March 2020 (UTC)
  • Support Important and needed work, and Jackie has the experience and talent to take on this project. Funcrunch (talk) 03:11, 17 March 2020 (UTC)
  • Seems like a very worthy project. Jackie has been working on this topic for some time now, and is certainly up to the task. — Rhododendrites talk \\ 15:15, 17 March 2020 (UTC)

References

  1. Vetter, Matthew A. (May 2015). "Teaching Wikipedia: The pedagogy and politics of an open access writing community".
  2. Ford, Heather (2016). "'Anyone can edit' not everyone does: Wikipedia and the gender gap". Social Studies of Science.
  3. Samoilenko, Anna; Yasseri, Taha (2014). "The distorted mirror of Wikipedia: a quantitative analysis of Wikipedia coverage of academics". EPJ Data Science 3 (1).
  4. Flick, Corinne M. "Geographies of the World's Knowledge" (PDF). University of Oxford. Retrieved 27 September 2017.
  5. Graham, M.; Hogan, B.; Straumann, R.K.; Medhat, A. (2014). "Uneven Geographies of User-Generated Information: Patterns of Increasing Informational Poverty". Annals of the Association of American Geographers 104 (4).