Considering 2030: Misinformation, verification, and propaganda
As Wikimedia looks toward 2030, how can the movement help people find trustworthy sources of knowledge?[Note 1]
As part of the Wikimedia 2030 strategy process, the Wikimedia Foundation is working with independent research consultants to understand the key trends that will affect the future of free knowledge and share this information with the movement.[1] This report was prepared by Dot Connector Studio, a Philadelphia-based media research and strategy firm focused on how emerging platforms can be used for social impact, and Lutman & Associates, a St. Paul-based strategy, planning, and evaluation firm focused on the intersections of culture, media, and philanthropy.
Building on a solid base
Since the launch of Wikipedia in 2001, Wikimedians have developed a robust, tested, transparent editing process that relies on volunteer writers and editors to rigorously cite verifiable sources, helping readers decide whether a given entry is comprehensive, verified, and fact-based. Editors may and do change content that does not live up to Wikipedia standards, but they are asked to explain the changes they make.
Of course, the Wikipedia editing process can be messy. There may be conflict, particularly on entries that are politically or culturally sensitive,[2][3] or that could affect an entity's commercial success and profit.[4] Around the world, the Wikimedia projects may be affected by government, political, cultural, or profit-driven censorship and misinformation campaigns, as well as outright falsified content. All of this may be exacerbated by current and future developments in audio-visual media manipulation and verification. Many of the same challenges that the news industry is grappling with in the current moment of heightened concern over "fake news," or misinformation, may also affect Wikipedians striving to create trustworthy content for audiences all over the globe. Controversies over educational materials, which could well be used as reference material on Wikipedia, provide another example of the ways in which information can be suppressed or manipulated.
As part of our research for the Wikimedia 2030 strategy process, we reviewed over one hundred reports, articles, and academic studies to tease out current and future trends in misinformation that seem likely to affect Wikimedia; these trends inform the table below.
The misinformation horizon ahead
In considering challenges Wikimedia may need to address with respect to misinformation over the next 10–15 years, it can be helpful to divide the trends into two categories: "content" and "access." Each has three major animating global influences.
- Content refers to trends that can affect the actual sources used by Wikimedians to develop reliable information.
- Access refers to how and whether Wikipedia users are able to use the platform.
Global influences that shape these two trends include technology, governments and politics, and commerce.
|  | Content | Access |
| --- | --- | --- |
| Influence: technology | Information created via new means, such as AI, bots, big data, virtual reality, and media-format manipulation | New means of content delivery, such as wearables, immersive experiences, and voice-activated digital assistants |
| Influence: government and politics | Rise in misinformation; threats to press or academic freedom | Censoring or blocking the Wikimedia platform or other sources; blocking online access altogether; monitoring or surveilling online access |
| Influence: commerce | Sponsored research, advertorials, paid promotional advocates, clickbait content | "Filter bubbles"; proprietary devices and platforms |
Influence: technology
Technology continues to evolve rapidly, presenting a continual challenge to adapt.
Content
Technology creates numerous opportunities and challenges in the creation of content. For example, bots have been part of the Wikimedia mix from the beginning, many designed to do "housekeeping tasks," such as adding links, correcting spelling errors, and undoing vandalism. A few Wikipedias maintain a bot policy requiring that bots be registered; on the English Wikipedia, they are overseen by a human board known as the Bot Approvals Group.
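To make the idea concrete, here is a minimal sketch of such a housekeeping bot written with the Pywikibot library. The misspelling list and target page are illustrative assumptions, and any real bot would need community approval (for example, from the English Wikipedia's Bot Approvals Group) before editing.

```python
# Minimal housekeeping-bot sketch (assumes Pywikibot is installed and
# configured). The typo list and target page are illustrative only;
# real bots require community approval before editing.
import re

import pywikibot

# Common misspellings to correct (illustrative examples).
TYPOS = {r"\bteh\b": "the", r"\brecieve\b": "receive"}

site = pywikibot.Site("en", "wikipedia")        # English Wikipedia
page = pywikibot.Page(site, "Project:Sandbox")  # a safe page for testing

original = page.text
text = original
for pattern, replacement in TYPOS.items():
    text = re.sub(pattern, replacement, text)

if text != original:
    page.text = text
    # Bot policy asks for clear edit summaries explaining each change.
    page.save(summary="Bot: correcting common misspellings")
```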
This technology-augmented content creation may be just the beginning. Over the next ten to fifteen years, artificial intelligence tools are expected to grow significantly more sophisticated. Indeed, a recent review published in Scientific American declares, "It can be expected that supercomputers will soon surpass human capabilities in almost all areas—somewhere between 2020 and 2060."[5] Big data will also get even bigger, and automated means of collecting and analyzing information will become more powerful.
Wikimedia already has an eye on this future, with tools such as the Objective Revision Evaluation Service (ORES), which helps editors spot potentially "damaging" edits based on previous assessments of article quality by Wikimedia editors. In a Reddit AMA from June 2017, the Wikimedia Foundation's principal research scientist, Aaron Halfaker, noted both the strengths and the dangers of this approach to flagging edits. "Predictions can affect people's judgement," he writes. "If we have an AI with a little bit of bias, that can direct people to perpetuate the bias ... So we're very cautious about training on past behavior."[6]
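For a sense of how such scores are consumed, the sketch below queries ORES's public scoring API for one revision's "damaging" probability. The revision ID is an arbitrary placeholder.

```python
# Query ORES for a "damaging" score on one revision (sketch; the
# revision ID below is an arbitrary placeholder).
import requests

REV_ID = 123456789  # placeholder revision ID on English Wikipedia

url = f"https://ores.wikimedia.org/v3/scores/enwiki/{REV_ID}/damaging"
response = requests.get(url, timeout=10)
response.raise_for_status()

score = response.json()["enwiki"]["scores"][str(REV_ID)]["damaging"]["score"]
print("Predicted damaging:", score["prediction"])
print("P(damaging): {:.2f}".format(score["probability"]["true"]))
```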
The unprecedented surge in automation of knowledge creation and analysis brings both advantages and challenges.
On the plus side, these tools are helping information producers. For example, researchers are working on methods to apply machine intelligence tools to TV broadcasts in order to more easily identify speakers and talking points.[7] Developing ways to mine this type of information could become useful for Wikimedians evaluating video content when sourcing entries. In a report on the future of augmented journalism, the Associated Press notes that developments in machine intelligence will enable reporters (and others) to "analyze data; identify patterns, trends and actionable insights from multiple sources; see things that the naked eye can't see; turn data and spoken words into text; text into audio and video; understand sentiment; analyze scenes for objects, faces, text or colors—and more."[8]
AI can also help shape learning environments, directing users to appropriate knowledge sources based on data about how they have previously interacted with similar information resources, and revealing insights about how and when these resources are valuable.[9] As a result, for example, AI might even be used to assemble various Wikipedia articles into custom textbooks on the fly. It is not difficult to imagine how Wikimedia editors could deploy such advancements to strengthen Wikimedia content.
However, the development of new tools also can lead to more misleading content that could pose challenges when sourcing entries: "At corporations and universities across [the U.S.], incipient technologies appear likely to soon obliterate the line between real and fake….[A]dvancements in audio and video technology are becoming so sophisticated that they will be able to replicate real news—real TV broadcasts, for instance, or radio interviews—in unprecedented, and truly indecipherable, ways," predicts Vanity Fair writer Nick Bilton.[10]
In response, between now and 2030, the Wikimedia movement will need to remain vigilant and develop new methods of verification that match these new technological capabilities. In turn, this means that the process for determining verifiability and reliable sources may need to evolve, or that the movement may need to build its own corresponding tools to keep up with edits from competing interests.
As Kevin Kelly observes in The Inevitable: Understanding the 12 Technological Forces That Will Shape Our Future, "Wikipedia continues to evolve its process. Each year more structure is layered in. Controversial articles can be 'frozen' by top editors so they can no longer be edited by any random person, only designated editors. There are more rules about what is permissible to write, more required formatting, more approval needed. But the quality improves too. I would guess that in 50 years a significant portion of Wikipedia articles will have controlled edits, peer review, verification locks, authentication certificates and so on."[11]
Access
Technology also presents myriad obstacles to accessing the content served up on Wikimedia platforms. For example, the browser-based, computer-accessed model for Wikipedia is already challenged; depending on where in the world you live, you are likely to be reading and editing Wikipedia entries on a mobile device rather than a computer. This trend toward mobile is well underway and increasingly global.[12] (In future briefs, we'll explore how other developments in technology, such as the increasing sophistication of wearables and the rise of audio-based virtual assistants, will affect access to Wikipedia and other Wikimedia projects.)
Influence: government and politics
Governments and political actors have the power both to suppress and distort content and to restrict access to Wikimedia platforms. For example, the Turkish Internet Regulator blocked access to all language versions of Wikipedia on April 29, 2017.
Content
Governments around the world can and do monitor and crack down on activists, journalists, academics, and other citizens who might otherwise create reliable source material for Wikimedia or create or edit Wikimedia content. Textbooks and reference content are also a target for repressive regimes, as noted in a June 2017 report from the U.S.-based political rights organization Freedom House.[13] In India, for example, model textbooks published by the National Council of Educational Research and Training have been accused of reflecting the political views of those in power.
This will continue to be a crucial challenge to realizing Wikimedia's vision over the next 15 years. The most recent analysis of press freedom worldwide from Reporters Without Borders declared that the global picture is "darkening," with press freedom decreasing both within specific nations and across the globe.[14]
Meanwhile, authoritarian governments continue to persecute and prosecute journalists. Turkey jailed 120 journalists between July and November 2016, following a failed coup against the government in power.[15] Reporters Without Borders argues that the "political rhetoric" used by U.S. President Donald Trump in the 2016 U.S. presidential election has influenced political discourse on press freedom around the globe. For example, in Tanzania, President John Magufuli warned newspapers that "their days are numbered."[16] Such government repression not only reduces source material for Wikipedia editors, but can also have an overall chilling effect on those seeking to produce or verify information.
A related trend among governments and political actors is the purposeful propagation of misinformation, disinformation, or propaganda. This sort of manipulation may weaken the overall information ecosystem, creating a culture of doubt about the reliability of online information; this, in turn, could affect the reliability of sources, and therefore of content, on Wikipedia. The current global battle to identify and control misinformation seems likely to shape the information environment over the next 15 years. And not all disinformation is the same: confusion over what, exactly, constitutes "fake news" inspired Claire Wardle, research director for First Draft News, to create a taxonomy of "mis-" and "dis-" information, ranging from satire to propaganda to all-out fabricated accounts.[17]
For example, a 2016 paper published by the RAND Corporation charges Russia with trafficking in a "firehose of falsehood," broadcasting "partial truths or outright fictions" across high numbers of communication channels, following a philosophy of quantity over quality.[18] BuzzFeed reported in January 2017 that Trump supporters in the U.S. were posing as French citizens online to manipulate the outcome of the French election.[19] Days before that election, hackers breached servers of leading candidate (and eventual victor) Emmanuel Macron, in an attack likened to the theft of emails from the U.S. Democratic National Committee.[20]
This spread of misinformation online is occurring despite recent growth in the number of organizations dedicated to fact-checking: worldwide, at least 114 "dedicated fact-checking teams" are working in 47 countries.[21]
Looking into the future, what is safe to expect? First, global freedom of expression will wax and wane depending on national and international political developments. Less clear is whether global trends toward autocracy will continue, or whether free societies will have a resurgence, grappling successfully with pressures on the press and the academy and with the recasting of facts as merely biased individual perspectives.
Second, we can expect that politically motivated disinformation and misinformation campaigns will always be with us. Indeed, the phenomenon of "fake news," misinformation, or "alternative facts" can be traced to some of the earliest recorded history, with examples from ancient times.[22]
The Wikimedia movement will need to remain nimble, and editors will need to become well-versed in the always-morphing means by which information can be misleading or falsified. It will be helpful to keep abreast of techniques developed and used by journalists and researchers when verifying information, such as those described in the Verification Handbook, available in several languages.[23]
Access
A May 2017 study commissioned by the Wikimedia Foundation from Harvard's Berkman Klein Center shows that outright censoring of the Wikipedia platform or articles is trending down globally, and was doing so even before the Wikimedia Foundation in June 2015 deployed HTTPS technology that makes blocking individual pages more difficult.[2]
With HTTPS, censors can't see which page within a website was visited, so they must block access to an entire site in order to restrict access to any single page. The Berkman Klein study used both client-side availability monitoring data collected from 40 countries and server-side data to find anomalies in requests for a set of 1.7 million articles covering 286 Wikipedia language projects.[2]
However, while the overall trend may be down, the analysis did show that certain governments, such as China, are continuing to censor, with full censorship of the zh.wikipedia.org domain through 2016. (Access appears to be allowed to other Wikimedia subdomains.) The analysis also found anomalies that could indicate censorship but have not been explained, such as Thailand's apparent blocking of Yiddish-language Wikipedia.[2]
Overall, the authors conclude: "[T]he shift to HTTPS has been a good one in terms of ensuring accessibility to knowledge." A technological change on the back end, in other words, has improved access on the front end. In general, technological solutions (HTTPS) to technological problems (outright blocking) can sometimes bring relief—until the next technological challenge emerges.[2]
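As a rough illustration of the client-side half of this methodology, the sketch below probes a few article URLs over HTTPS from the tester's own vantage point. The article list is an assumption for illustration, not the study's actual test set, and connection failures can have many benign causes.

```python
# Naive client-side availability probe (sketch). Because the requests
# use HTTPS, a network observer sees only the domain contacted, not
# which article was requested.
import requests

ARTICLES = [
    "https://en.wikipedia.org/wiki/Internet_censorship",
    # zh.wikipedia.org article on Wikipedia itself (percent-encoded title)
    "https://zh.wikipedia.org/wiki/%E7%B6%AD%E5%9F%BA%E7%99%BE%E7%A7%91",
]

for url in ARTICLES:
    try:
        r = requests.get(url, timeout=10)
        status = "HTTP {}".format(r.status_code)
    except requests.RequestException as exc:
        # Timeouts or resets from one vantage point may hint at blocking,
        # though many benign causes (outages, bad routes) exist too.
        status = "unreachable ({})".format(type(exc).__name__)
    print(url, "->", status)
```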
Influence: commerce
Commercial actors may also influence both access and content for the Wikimedia movement, which deliberately models a platform that is free and open to contributions.
Content
The rise of commercial social media platforms such as Twitter and Facebook over the last decade has been accompanied by a concurrent decline of traditional media institutions and of trust in them. This is true even within open societies that once enjoyed a competitive and productive press sector, creating concerns about new ways that misinformation is filtered, delivered online, and used in public discourse and decision-making.
There is also some overlap with other challenges noted above—for example, spurious technology claims may be disseminated in order to drive up stock prices, or misinformation campaigns aimed at affecting public policy may be run for profit-seeking reasons, as when companies or industry groups pay for research to prove a policy point. For example, an examination of industry documents over time showed that the U.S. sugar industry throughout the 1960s and 1970s sponsored research that promoted dietary fat as a bigger health risk than sugar.[24]
Online platforms have recently announced steps to address the dissemination of misinformation. Google has introduced changes to its search function, returning fact-checking organizations' content alongside results. It has also introduced ways of reporting problematic results (e.g., autocomplete suggestions questioning whether the Holocaust happened).[25] Facebook has offered fact-checks on articles based on their URLs, as well as tips on media literacy.[26]
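Under the hood, Google's fact-check labels rely on fact-checking publishers embedding schema.org ClaimReview markup in their pages. A minimal sketch of such a payload, with placeholder values throughout, might look like this:

```python
# Minimal ClaimReview payload (sketch; all values are placeholders).
# Publishers embed this as JSON-LD in a <script type="application/ld+json">
# tag so that search engines can surface the fact check.
import json

claim_review = {
    "@context": "https://schema.org",
    "@type": "ClaimReview",
    "claimReviewed": "An example claim being checked",
    "itemReviewed": {
        "@type": "CreativeWork",
        "author": {"@type": "Person", "name": "Example Speaker"},
    },
    "reviewRating": {
        "@type": "Rating",
        "ratingValue": 1,        # on this publisher's scale, 1 = false
        "bestRating": 5,
        "alternateName": "False",
    },
    "author": {"@type": "Organization", "name": "Example Fact-Checker"},
}

print(json.dumps(claim_review, indent=2))
```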
Feature and functionality changes by these large platforms in the coming years may both inform and compete with parallel approaches developed for the Wikimedia projects.
The next frontier in understanding how to combat misinformation involves developing a more sophisticated grasp on how networks help to spread it. The Europe-based Public Data Lab released the first several chapters of a "Field Guide to Fake News," which emphasizes the importance of building tools that reveal the "thick web" of how fake news spreads online—showing how a story or idea is shared across networks to help viewers put it in context.[27]
The next level of innovation may involve ubiquitous fact-checking—a solution that could potentially leverage Wikipedia content as a key source. For example, the nonprofit organization Hypothes.is continues to develop an open platform that supplies a layer on the web where users can create annotations that give context to the content.[28] In addition, "big data" can be harnessed to help provide context to public discourse, for example by tapping into data about the flow of money between political entities engaged in misinformation.[Note 2]
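Hypothes.is already exposes its annotation layer through an open search API. As a small illustration, the sketch below fetches public annotations for one page; the target URL is an arbitrary example.

```python
# Fetch public Hypothes.is annotations for a page (sketch; the target
# URL is an arbitrary example).
import requests

params = {"uri": "https://en.wikipedia.org/wiki/Fact-checking", "limit": 5}
resp = requests.get("https://api.hypothes.is/api/search", params=params, timeout=10)
resp.raise_for_status()

data = resp.json()
print(data["total"], "public annotations found")
for row in data["rows"]:
    print("-", row["user"], ":", row.get("text", "")[:80])
```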
With the infusion of philanthropic investment and widespread experimentation, there is a chance that social media networks or online information companies such as Google may successfully tweak algorithms to prevent some of the most widespread sharing of false information online. However, new technologies are likely to inspire profit-seekers and political actors to develop new ways to manipulate systems, and extend these manipulations beyond text to other media, such as data, videos, photos, and more.
Access
Concerns over how commercial companies could limit access to Wikimedia platforms over the next 15 years will be addressed in forthcoming briefs about the future of the commons, and the emergence and use of new platforms. These include but aren't limited to battles over net neutrality, the rise of proprietary apps and platforms, and corporations' willingness (or unwillingness) to provide access to Wikipedia content from within their own content properties and devices.
Concluding thoughts and questions
How might Wikimedia plan for combating misinformation and censorship in the decades to come?
- Encourage and embrace experiments in artificial intelligence and machine learning that could help enrich Wikipedia content.
- Track developments in journalism and academia for new ways to fact-check and verify information that may be used as sources on Wikimedia platforms, such as methods for evaluating video and other new media.
- Collaborate with other public interest organizations to advocate for press freedom, free speech, universal internet access, and other policy goals that ensure access and the free flow of information.
- Continue to carefully monitor access to Wikimedia platforms from around the globe, deploying technical changes where appropriate.
- Monitor the solutions being developed by commercial platforms and publications, both to see how they might improve content verification methods on Wikimedia platforms and to see whether they might offer opportunities for increasing access to that content.
Notes
- ↑ While many of the links on this page point to the English Wikipedia, reflecting the language in which this brief was originally written, similar pages and policies exist on many other Wikimedia sites. Translators are welcome to substitute links here with the equivalents on other-language Wikimedia wikis.
- ↑ See, for example, data available from the Center for Responsive Politics on campaign donors, lobbyists, and more in the U.S.; and Transparency International's data on lobbyists in the European Union as well as other data.
References
[edit]- ↑ "How will external forces hinder or help the future of the Wikimedia movement? – Wikimedia Blog". Retrieved 2017-07-13.
- ↑ a b c d e Clark, Justin, Robert Faris, and Rebekah Heacock Jones. Analyzing Accessibility of Wikipedia Projects Around the World. Cambridge: Berkman Klein Center for Internet & Society, 2017. Accessed May 25, 2017.
- ↑ Alcantara, Chris. "The most challenging job of the 2016 race: Editing the candidates' Wikipedia pages." Washington Post. October 27, 2016. Accessed May 25, 2017.
- ↑ Kiberd, Roisin. "The Brutal Edit War Over a 3D Printer's Wikipedia Page." Motherboard. March 23, 2016. Accessed June 1, 2017.
- ↑ Helbing, Dirk, Bruno S. Frey, Gerd Gigerenzer, Ernst Hafen, Michael Hagner, Yvonne Hofstetter, Jeroen van den Hoven, Roberto V. Zicari, and Andrej Zwitter, "Will Democracy Survive Big Data and Artificial Intelligence?" Scientific American. February 25, 2017. Accessed May 28, 2017. https://www.scientificamerican.com/article/will-democracy-survive-big-data-and-artificial-intelligence/.
- ↑ Halfaker, Aaron. "I'm the principal research scientist at the nonprofit behind Wikipedia. I study AIs and community dynamics. AMA!" Reddit. June 2, 2017. Accessed June 7, 2017.
- ↑ Watzman, Nancy. "Internet Archive's Trump Archive launches today." Internet Archive Blogs. January 5, 2017. Accessed May 19, 2017.
- ↑ Marconi, Francesco, Alex Siegman, and Machine Journalist. The Future of Augmented Journalism. New York: Associated Press, 2017. Accessed May 30, 2017.
- ↑ Luckin, Rose, Wayne Holmes, Mark Griffiths, and Laurie B. Forcier. Intelligence Unleashed: An Argument for AI in Education. London: Pearson, 2016. Accessed June 8, 2017.
- ↑ Bilton, Nick. "Fake news is about to get even scarier than you ever dreamed." Vanity Fair. January 26, 2017. Accessed May 30, 2017.
- ↑ Kelly, Kevin. The Inevitable: Understanding the 12 Technological Forces That Will Shape Our Future. New York: Viking, 2016.
- ↑ GSMA. "The Mobile Economy 2017." Accessed June 1, 2017.
- ↑ Puddington, Arch. Breaking Down Democracy: Goals, Strategies, and Methods of Modern Authoritarians. Washington, D.C.: Freedom House, 2017. Accessed June 8, 2017.
- ↑ Reporters Without Borders. "2017 World Press Freedom Index – tipping point?" April 26, 2017. Updated May 15, 2017. Accessed May 28, 2017.
- ↑ Nordland, Rod. "Turkey's Free Press Withers as Erdogan Jails 120 Journalists." The New York Times. November 17, 2016. Accessed June 7, 2017.
- ↑ Reporters Without Borders. "Journalism weakened by democracy's erosion." Accessed May 29, 2017.
- ↑ Wardle, Claire. "Fake News. It's Complicated." First Draft News. February 16, 2017. Accessed June 7, 2017.
- ↑ Paul, Christopher and Miriam Matthews. The Russian "Firehose of Falsehood" Propaganda Model: Why It Might Work and Options to Counter It. Santa Monica: RAND Corporation, 2016.
- ↑ Broderick, Ryan. "Trump Supporters Online Are Pretending To Be French To Manipulate France's Election." BuzzFeed. January 24, 2017. Accessed June 7, 2017.
- ↑ Tufekci, Zeynep. "Dear France: You Just Got Hacked. Don't Make The Same Mistakes We Did." BuzzFeed. May 5, 2017. Accessed June 7, 2017.
- ↑ Stencel, Mark. "International Fact-Checking Gains Ground, Duke Census Finds." Duke Reporters' Lab. February 28, 2017. Accessed June 7, 2017. https://reporterslab.org/international-fact-checking-gains-ground/.
- ↑ Darnton, Robert. "The True History of Fake News." The New York Review of Books. February 13, 2017. Accessed June 7, 2017.
- ↑ Silverman, Craig, ed. Verification Handbook: A Definitive Guide to Verifying Digital Content for Emergency Coverage. Maastricht: European Journalism Centre, 2016. Accessed May 29, 2017.
- ↑ Kearns, Cristin E., Laura A. Schmidt, and Stanton A. Glantz. "Sugar Industry and Coronary Heart Disease Research: A Historical Analysis of Internal Industry Documents." JAMA Intern Med 176, no. 11 (2016): 1680-1685. Accessed June 8, 2017. doi:10.1001/jamainternmed.2016.5394.
- ↑ Gomes, Ben. "Our latest quality improvements for search." The Keyword. Google. April 25, 2017. Accessed May 19, 2017.
- ↑ Simo, Fidji. "Introducing: the Facebook Journalism Project." Facebook Media. January 11, 2017. Accessed May 19, 2017.
- ↑ Public Data Lab and First Draft News. "A Field Guide to Fake News." Accessed May 19, 2017.
- ↑ The Hypothesis Project. "To Enable a Conversation Over the World's Knowledge: Hypothesis Mission." Accessed 22 May 2017.