Grants talk:Project/Ocaasi/Misinformation And Its Discontents: Narrative Recommendations on Wikipedia's Vulnerabilities and Resilience
Coordination and addition of false or misleading information to Wikipedia
Hi Ocaasi. I like this project a lot. In fact, I was thinking about the possibility of a research proposal in the future that would revolve around the automatic detection of coordinated edits. One thing closely related to the topic of misinformation is the fact that some actors are using coordinated attacks, for example, with networks of bots, fake profiles, and ad campaigns on Twitter and Facebook. I find that your research is very much a prerequisite for what I have in mind. Do expert editors notice that there may be instances of coordination behind the addition of false information to Wikipedia? If so, can we detect this automatically? -- CristianCantoro (talk) 11:20, 20 February 2020 (UTC)
- I suspect it's going to be quite hard to differentiate between a coordinated campaign and various people who, due to ongoing events, are coming to Wikipedia. Jo-Jo Eumerus (talk, contributions) 08:07, 22 February 2020 (UTC)
- Hi Jo-Jo Eumerus, yes, I agree. So, the point is to try to understand which kinds of coordination are considered harmful (i.e. vandalism) and which are not. Of course, in principle, simple agreement on a POV could lead to some sort of coordination, but I suspect this is part of the collaborative process of adding information to Wikipedia. What are the characteristics, instead, that make this behavior "coordinated vandalism"? --CristianCantoro (talk) 21:06, 27 February 2020 (UTC)
- @CristianCantoro: Regarding "automatic detection of coordinated edits", there is talk in WMF research of building such a machine-learning model. It would be complicated. We don't have a lot of ground data for confirmed coordinated campaigns to model from. We don't have a large validated test set to run it on. In any case, building this algorithm is outside the scope of my narrative proposal, but speaking with those who are planning such an approach is certainly something I will do. Cheers, Jake Ocaasi (talk) 18:01, 23 March 2020 (UTC)
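As the reply above notes, building a real model is complicated by the lack of ground-truth data. Purely as a toy illustration of the kind of signal such a detector might start from (not any actual WMF implementation), a crude heuristic flags pairs of accounts that repeatedly edit the same pages within a short time window. All names, sample data, and thresholds below are hypothetical:

```python
from collections import defaultdict
from itertools import combinations

def coediting_pairs(edits, window=3600, min_shared=2):
    """Flag pairs of accounts that edit the same pages within `window`
    seconds of each other, on at least `min_shared` distinct pages.
    `edits` is a list of (user, page, timestamp) tuples; timestamps in seconds."""
    # Group edit events by page.
    by_page = defaultdict(list)
    for user, page, ts in edits:
        by_page[page].append((user, ts))
    # Count, per user pair, the distinct pages they co-edited within the window.
    shared = defaultdict(set)  # (userA, userB) -> set of co-edited pages
    for page, entries in by_page.items():
        for (u1, t1), (u2, t2) in combinations(entries, 2):
            if u1 != u2 and abs(t1 - t2) <= window:
                shared[tuple(sorted((u1, u2)))].add(page)
    return {pair: pages for pair, pages in shared.items()
            if len(pages) >= min_shared}

# Hypothetical sample data: A and B repeatedly edit the same pages
# minutes apart; C edits one of those pages much later.
edits = [
    ("A", "Topic1", 0),    ("B", "Topic1", 100),
    ("A", "Topic2", 5000), ("B", "Topic2", 5200),
    ("C", "Topic1", 999999),
]
flagged = coediting_pairs(edits)
```

In this toy run only the pair ("A", "B") is flagged; C is not, since its edit falls outside the window. A real system would of course need validated campaign data to set thresholds and to separate coordination from ordinary topical interest, which is exactly the ground-truth gap described above.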
- Thanks Ocaasi, I agree, it is for sure a good venue for a research project. At the moment I am not planning on working on the topic, but if I do, who should I talk to? -- CristianCantoro (talk) 15:29, 24 March 2020 (UTC)
- @CristianCantoro: You'd want to talk to Jonathan Morgan, User:Jmorgan (WMF). Cheers! Ocaasi (talk) 21:32, 26 March 2020 (UTC)
Systemic biases are a form of misinformation
Here's a recent approach I took to opposing misinformation. In short, I identified an article which was getting a lot of attention from Education Program students, then found its most closely related articles which had not yet had any student project editors, and I told them that they should work on those instead. The theory is that the kinds of systemic biases which manifest as the causes of the student-popular article's subject are all around us, and if we don't counter them with accuracy improvements then we will never have those opportunities for rewards. Does that make sense? EllenCT (talk) 09:41, 2 March 2020 (UTC)
- @EllenCT: I think looking at related clusters of articles is important for mis/disinformation, because a sophisticated campaign would likely work from the fringes, where there are fewer page watchers and less scrutiny. Attacking the core article takes more investment in building up trusted actors. I don't know if systemic bias and disinformation overlap, but that's a good question to be asking. I can't promise to have capacity to look at systemic bias in depth, because it is its own rich area with many political and personal questions of implicit bias, whereas disinformation operates on an intentional and explicit agenda. This is an area for future study, and I'd be happy to speak with systemic bias experts to see whether coordinated bias campaigns can shed light on tactics or defenses against coordinated efforts that are not necessarily nefarious but still disruptive. Cheers, Ocaasi (talk) 18:05, 23 March 2020 (UTC)
- I think that this is apples and oranges. Misinformation is malicious; systemic bias is not. Calling something misinformation implies that someone is deliberately lying, but systemic bias usually is simply a reflection of people's interests and not a malicious act. Jo-Jo Eumerus (talk, contributions) 08:08, 24 March 2020 (UTC)
- @Jo-Jo Eumerus: I agree they're different, but they're adjacent. Some attempts at systemic bias include disinformation, and some attempts at disinformation further systemic bias. Cheers, Ocaasi (talk) 21:31, 26 March 2020 (UTC)
- I don't think that a slight overlap/adjacency is a good reason for lumping both together. A very broad "false information" basket-task does not strike me as very useful. Jo-Jo Eumerus (talk, contributions) 08:52, 27 March 2020 (UTC)
Eligibility confirmed, Round 1 2020
This Project Grants proposal is under review!
We've confirmed your proposal is eligible for Round 1 2020 review. Please feel free to ask questions and make changes to this proposal as discussions continue during the community comments period, through March 16, 2020.
The Project Grant committee's formal review for Round 1 2020 will occur March 17 - April 8, 2020. We ask that you refrain from making changes to your proposal during the committee review period, so we can be sure that all committee members are seeing the same version of the proposal.
Grantees will be announced Friday, May 15, 2020.
Any changes to the review calendar will be posted on the Round 1 2020 schedule.
Questions? Contact us at projectgrants(at)wikimedia.org.
I JethroBT (WMF) (talk) 19:08, 2 March 2020 (UTC)
Aggregated feedback from the committee for Misinformation And Its Discontents: Narrative Recommendations on Wikipedia's Vulnerabilities and Resilience
Scoring rubric | Score
(A) Impact potential | 6.8
(B) Community engagement | 6.2
(C) Ability to execute | 7.4
(D) Measures of success | 7.6

Additional comments from the Committee:
This proposal has been recommended for due diligence review.
The Project Grants Committee has conducted a preliminary assessment of your proposal and recommended it for due diligence review. This means that a majority of the committee reviewers favorably assessed this proposal and have requested further investigation by Wikimedia Foundation staff.
Next steps:
- Aggregated comments from the committee are posted above. Note that these comments may vary, or even contradict each other, since they reflect the conclusions of multiple individual committee members who independently reviewed this proposal.
- If you have had an interview with a Program Officer, you may have orally responded to some of the committee comments already. Your interview comments will be relayed to the committee during the deliberations call.
- You are welcome to respond to aggregated comments here on the talkpage to publicly share any feedback, clarifications or questions you have.
- Following due diligence review, a final funding decision will be announced on May 29, 2020.
--Marti (WMF) (talk) 22:46, 13 May 2020 (UTC)
Round 1 2020 decision
Congratulations! Your proposal has been selected for a Project Grant.
The committee has recommended this proposal, and WMF has approved funding for the full amount of your request, US$30,000.
Comments regarding this decision:
The committee is pleased to support your work investigating practices and systems across Wikipedia projects in relation to the detection, handling, and prevention of misinformation. The committee agrees that concerns around misinformation are both pervasive and significant, and present risks in how readers use and perceive content on Wikipedia. Furthermore, there is a need to better understand and appraise the effectiveness of different practices in the movement around misinformation. The committee appreciates the addition of a project admin to support this complex and sustained investigation in this domain, involving deep dives into larger and smaller Wikipedia communities, conducting interviews, and developing an initial set of recommendations on effective practices related to misinformation. Finally, the committee also appreciates the substantial effort you put into community engagement and recruitment of domain-relevant advisors, both at the Wikimedia Foundation and among community members.
Next steps:
- You will be contacted to sign a grant agreement and set up a monthly check-in schedule.
- Review the information for grantees.
- Use the new buttons on your original proposal to create your project pages.
- Start work on your project!
Upcoming changes to Wikimedia Foundation Grants
Over the last year, the Wikimedia Foundation has been undergoing a community consultation process to launch a new grants strategy. Our proposed programs are posted on Meta here: Grants Strategy Relaunch 2020-2021. If you have suggestions about how we can improve our programs in the future, you can find information about how to give feedback here: Get involved. We are also currently seeking candidates to serve on regional grants committees, and we'd appreciate it if you could help us spread the word to strong candidates; you can find out more here. We will launch our new programs in July 2021. If you are interested in submitting future proposals for funding, stay tuned to learn more about our future programs.