Grants talk:Project/Ocaasi/Misinformation And Its Discontents: Narrative Recommendations on Wikipedia's Vulnerabilities and Resilience

Coordination and addition of false or misleading information to Wikipedia

Hi Ocaasi. I like this project a lot. In fact, I was thinking about the possibility of a future research proposal that would revolve around the automatic detection of coordinated edits. One thing closely related to the topic of misinformation is the fact that some actors are using coordinated attacks, for example, with networks of bots, fake profiles, and ad campaigns on Twitter and Facebook. I find that your research is very much a prerequisite for what I have in mind. Do expert editors notice that there may be instances of coordination behind the addition of false information to Wikipedia? If so, can we detect this automatically? -- CristianCantoro (talk) 11:20, 20 February 2020 (UTC)

I suspect it's going to be quite hard to differentiate between a coordinated campaign and various people who are coming to Wikipedia because of ongoing events. Jo-Jo Eumerus (talk, contributions) 08:07, 22 February 2020 (UTC)
Hi Jo-Jo Eumerus, yes, I agree. So, the point is to try to understand which kinds of coordination are considered harmful (i.e. vandalism) and which are not. Of course, in principle, simple agreement on a POV, for example, could lead to some sort of coordination, but I suspect this is part of the collaborative process of adding information to Wikipedia. What are the characteristics, then, that make this behavior "coordinated vandalism"? --CristianCantoro (talk) 21:06, 27 February 2020 (UTC)
@CristianCantoro: Regarding "automatic detection of coordinated edits", there is talk in WMF research of building such a machine-learning model. It would be complicated: we don't have much ground-truth data from confirmed coordinated campaigns to train on, and we don't have a large validated test set to evaluate it against. In any case, building this algorithm is outside the scope of my narrative proposal, but speaking with those who are planning such an approach is certainly something I will do. Cheers, Jake Ocaasi (talk) 18:01, 23 March 2020 (UTC)
Thanks Ocaasi, I agree, it is certainly a good avenue for a research project. At the moment I am not planning to work on the topic, but if I do, who should I talk to? -- CristianCantoro (talk) 15:29, 24 March 2020 (UTC)
@CristianCantoro: You'd want to talk to Jonathan Morgan, User:Jmorgan (WMF). Cheers! Ocaasi (talk) 21:32, 26 March 2020 (UTC)
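For readers curious what "detect this automatically" might look like as a first pass, below is a minimal, illustrative sketch of one naive signal: how heavily the sets of pages edited by two accounts overlap. This is not any actual WMF model, and all account and page names are invented. As the thread above notes, high overlap alone cannot separate coordination from organic attention to the same events.

  # Naive candidate-coordination signal: flag pairs of editors whose
  # sets of edited pages overlap heavily within some time window.
  # The data below is invented for illustration; a real analysis would
  # pull revisions from a database dump or the API, and overlap alone
  # is NOT evidence of coordination.
  from itertools import combinations

  # editor -> set of pages edited within the window (hypothetical)
  edits = {
      "AccountA": {"Topic X", "Topic Y", "Topic Z"},
      "AccountB": {"Topic X", "Topic Y", "Topic Z"},
      "AccountC": {"Unrelated article"},
  }

  def jaccard(a, b):
      """Overlap of two page sets, from 0.0 (disjoint) to 1.0 (identical)."""
      return len(a & b) / len(a | b)

  THRESHOLD = 0.8  # arbitrary cutoff for this sketch
  for (u, pages_u), (v, pages_v) in combinations(edits.items(), 2):
      score = jaccard(pages_u, pages_v)
      if score >= THRESHOLD:
          print(f"candidate pair: {u} / {v} (overlap {score:.2f})")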

Systemic biases are a form of misinformation

Here's a recent approach I took to opposing misinformation. In short, I identified an article which was getting a lot of attention from Education Program students, then found its most closely related articles which had not yet had any student project editors, and I told the students they should work on those instead. The theory being that the kinds of systemic biases which cause the student-popular article's subject are all around us, and if we don't counter them with accuracy improvements then we will never have those opportunities for rewards. Does that make sense? EllenCT (talk) 09:41, 2 March 2020 (UTC)

@EllenCT: I think looking at related clusters of articles is important for mis/disinformation, because a sophisticated campaign would likely work from the fringes, where there are fewer page watchers and less scrutiny. Attacking the core article takes more investment in building up trusted actors. I don't know how much systemic bias and disinformation overlap, but that's a good question to be asking. I can't promise to have the capacity to look at systemic bias in depth, because it is its own rich area with many political and personal questions of implicit bias, whereas disinformation operates on an intentional and explicit agenda. This is an area for future study, and I'd be happy to speak with systemic bias experts to consider whether coordinated bias campaigns can shed light on tactics or defenses against coordinated efforts that are not necessarily nefarious but are still disruptive. Cheers, Ocaasi (talk) 18:05, 23 March 2020 (UTC)
I think that this is apples and oranges. Misinformation is malicious; systemic bias is not. Calling something misinformation implies that someone is deliberately lying, but systemic bias is usually simply a reflection of people's interests, not a malicious act. Jo-Jo Eumerus (talk, contributions) 08:08, 24 March 2020 (UTC)
@Jo-Jo Eumerus: I agree they're different, but they're adjacent. Some attempts at systemic bias include disinformation, and some attempts at disinformation further systemic bias. Cheers, Ocaasi (talk) 21:31, 26 March 2020 (UTC)
I don't think that a slight overlap/adjacency is a good reason for lumping both together. A very broad "false information" basket-task does not strike me as very useful. Jo-Jo Eumerus (talk, contributions) 08:52, 27 March 2020 (UTC)

Eligibility confirmed, Round 1 2020

This Project Grants proposal is under review!

We've confirmed your proposal is eligible for Round 1 2020 review. Please feel free to ask questions and make changes to this proposal as discussions continue during the community comments period, through March 16, 2020.

The Project Grants Committee's formal review for Round 1 2020 will occur March 17 - April 8, 2020. We ask that you refrain from making changes to your proposal during the committee review period, so we can be sure that all committee members are seeing the same version of the proposal.

Grantees will be announced Friday, May 15, 2020.

Any changes to the review calendar will be posted on the Round 1 2020 schedule.

Questions? Contact us at projectgrants (_AT_) wikimedia · org.

I JethroBT (WMF) (talk) 19:08, 2 March 2020 (UTC)

Aggregated feedback from the committee for Misinformation And Its Discontents: Narrative Recommendations on Wikipedia's Vulnerabilities and Resilience

Scoring rubric and committee scores:

(A) Impact potential: 6.8
  • Does it have the potential to increase gender diversity in Wikimedia projects, either in terms of content, contributors, or both?
  • Does it have the potential for online impact?
  • Can it be sustained, scaled, or adapted elsewhere after the grant ends?
(B) Community engagement: 6.2
  • Does it have a specific target community and plan to engage it often?
  • Does it have community support?
(C) Ability to execute: 7.4
  • Can the scope be accomplished in the proposed timeframe?
  • Is the budget realistic/efficient?
  • Do the participants have the necessary skills/experience?
(D) Measures of success: 7.6
  • Are there both quantitative and qualitative measures of success?
  • Are they realistic?
  • Can they be measured?
Additional comments from the Committee:
  • This project has important implications for the integrity of knowledge across Wikimedia projects. The big question for me is whether there is capacity and interest to implement any resulting recommendations: it seems the WMF Knowledge Integrity program is no longer active, and it’s unclear to me to what extent the recommendations could be implemented by WMF teams/departments. As far as I know there is also not any community organizing around this issue (e.g. a WikiProject or user group), but perhaps this initial project could be the catalyst for this by bringing together information and contributors in one place.
  • It fits the pillars of Wikipedia, such as the neutral point of view, but it cannot be scaled. It seems closely tied to a specific context, and there is no framework to follow or reuse to replicate the project.
  • I appreciate the subject matter, as the area needs attention. I do have concerns, though, about the lack of a timeline and scope, and about not engaging with editors who are spreading misinformation; in my experience, that engagement would provide critical information and guide actionable items.
  • This proposal is exactly the sort of community-led research we should be funding, as it has enormous implications for the perceived reliability of the encyclopedia.
  • Takes a holistic approach to addressing the issue and identifying solutions. The main risks are that the scope will be too large (as the research issue is a complicated one) and that the recommendations can’t/won’t be implemented.
  • Weak innovation and no clear evaluation plan.
  • This proposal is investigative in nature, but it lacks a scope and timeline, and I do not understand how participants will be selected for interviews. I do not have a good grasp of how this will have an outcome beyond a report.
  • There have been several efforts on this, though this proposal seems to want to centralize what we know and produce recommendations. This is innovative in many ways as this idea is more important now than it ever has been.
  • Grantee has lots of relevant experience and is a capable lead with a good team of advisors behind him. In terms of budget, it would be nice to have a more detailed breakdown of costs per project element (e.g. by estimated hours).
  • The submitter is experienced.
  • I have complete faith in the volunteers working on this project. I am sure they can complete this within the 12 month timeframe.
  • Due to the scope of this proposal, I am not sure how "Disinformation landscape review" and solid, globally-informed recommendations can happen within this time. For example, the proposal seeks to answer "What policies, practices, and tools make some Wikipedias more robust against disinformation?" and "How do we vet neutrality and reliability of sources on articles about polarizing or controversial topics across multiple languages?" which is something that, given its current scope, needs to be narrowed and focused. As written, it appears this will be a comprehensive analysis across the entire Movement, which would itself require a full-time team and is not accounted for in the budget nor project plan.
  • I trust the ability of the grantee and his team to execute this project. The grantee is the founder of a well-established community project, TWL.
  • Good number of community endorsements. Would be nice to know if/how grantee has identified experts from the community.
  • I would like to see more about how they will engage with participants (as in, how they will seek participants). I understand the author does not feel there is community engagement, but the author could share more about updates posted on Meta throughout the project, recruitment of participants, etc.
  • There is clear support and volunteers for this, which suggests this is an important and potentially valuable proposal.
  • This level of community engagement is sufficient.
  • Yes, but I have some concerns about the impact it can generate.
  • I’d love to hear more about the above concerns before funding this project. Once these are clarified, however, I’d be happy to support it.
  • The scope (across what aspects of the Wikimedia projects and languages) needs to be narrowed to make this doable. Also, as this is presented as a qualitative study, I do not see evidence of qualitative methodological rigor or a guiding framework, both of which are needed to keep this focused and doable. I would also advocate for increasing the funding to bring others into the project who could add these capacities if needed, but as listed now the proposal is not focused enough given the timeline, scope, and budget.

This proposal has been recommended for due diligence review.

The Project Grants Committee has conducted a preliminary assessment of your proposal and recommended it for due diligence review. This means that a majority of the committee reviewers favorably assessed this proposal and have requested further investigation by Wikimedia Foundation staff.


Next steps:

  • Aggregated comments from the committee are posted above. Note that these comments may vary, or even contradict each other, since they reflect the conclusions of multiple individual committee members who reviewed this proposal independently.
  • If you have had an interview with a Program Officer, you may have orally responded to some of the committee comments already. Your interview comments will be relayed to the committee during the deliberations call.
  • You are welcome to respond to the aggregated comments here on the talk page to publicly share any feedback, clarifications, or questions you have.
  • Following due diligence review, a final funding decision will be announced on May 29, 2020.
If you have any questions, please contact us at projectgrants (_AT_) wikimedia · org.

--Marti (WMF) (talk) 22:46, 13 May 2020 (UTC)

Round 1 2020 decision

Congratulations! Your proposal has been selected for a Project Grant.

The committee has recommended this proposal and WMF has approved funding for the full amount of your request, US$30,000.

Comments regarding this decision:
The committee is pleased to support your work investigating practices and systems across Wikipedia projects in relation to the detection, handling, and prevention of misinformation. The committee agrees that concerns around misinformation are both pervasive and significant, and present risks in how readers use and perceive content on Wikipedia. Furthermore, there is a need to better understand and appraise the effectiveness of different practices in the movement around misinformation. The committee appreciates the addition of a project admin to support this complex and sustained investigation in this domain, involving deep dives into larger and smaller Wikipedia communities, conducting interviews, and developing an initial set of recommendations on effective practices related to misinformation. Finally, the committee also appreciates the substantial effort you put into community engagement and recruitment of domain-relevant advisors both at the Wikimedia Foundation and among community members.

Next steps:

  1. You will be contacted to sign a grant agreement and set up a monthly check-in schedule.
  2. Review the information for grantees.
  3. Use the new buttons on your original proposal to create your project pages.
  4. Start work on your project!

Upcoming changes to Wikimedia Foundation Grants

Over the last year, the Wikimedia Foundation has been undergoing a community consultation process to launch a new grants strategy. Our proposed programs are posted on Meta here: Grants Strategy Relaunch 2020-2021. If you have suggestions about how we can improve our programs in the future, you can find information about how to give feedback here: Get involved. We are also currently seeking candidates to serve on regional grants committees, and we'd appreciate it if you could help us spread the word to strong candidates; you can find out more here. We will launch our new programs in July 2021. If you are interested in submitting future proposals for funding, stay tuned to learn more about our future programs.


I JethroBT (WMF) (talk) 23:20, 29 May 2020 (UTC)