Strategy/Wikimedia movement/2017/Sources/Considering 2030: Misinformation, verification, and propaganda (July 2017)

Research by Lutman & Associates and Dot Connector Studio (July 2017)
As our movement looks toward 2030, how will we find reliable sources of information?[Note 1]

As part of the Wikimedia movement strategy process, the Wikimedia Foundation has worked with a number of independent research organizations to identify the key themes facing the movement.[1] This report was prepared by Dot Connector Studio, a Philadelphia-based research firm focused on emerging media.

Building a framework for accuracy

Since Wikipedia's founding in 2001, community members have developed editorial policies under which they verify the reliability of information against a range of sources before adding it to articles. Editors change information in an article when it does not comply with Wikipedia's policies, and they state their reasons when doing so.

Of course, Wikimedia's editing process can be contentious. Politically and culturally sensitive articles can spark disputes,[2][3] as can articles that touch on commercial interests.[4] Around the world, Wikimedia projects may also come into conflict with governments and other institutions. All of this could stand in the way of creating content in the future. At the same time, many news organizations are confronting problems of their own caused by false news. Taken together, these examples show just how much influence misinformation now has in the world.

As part of the Wikimedia 2030 movement strategy process, we examined more than one hundred reports; the findings are summarized in the chart below.

The rise of misinformation

The challenges Wikimedia will face over the next 10 to 15 years, in this era of rampant misinformation, can be divided into two categories: content and access.

  • Content: this concerns the sources Wikimedians rely on when citing information, as well as the content of the projects themselves.
  • Access: this concerns how users reach and use Wikimedia platforms.

Global influences such as technology, government and politics, and commerce can affect both of these factors.

  • Influence: technology. Content: information generated by AI, bots, big data, and virtual reality. Access: new ways of exchanging information.
  • Influence: government and politics. Content: the spread of misinformation and attacks on the press and media. Access: blocking Wikipedia or other sources of information, or restricting access outright.
  • Influence: commerce. Content: paid-for research, advertising, paid content, and clickbait. Access: filter bubbles, proprietary devices, and platforms.

Influence: technology

Technology continues to evolve rapidly, presenting a continual challenge to adapt.

Content

Technology creates numerous opportunities and challenges in the creation of content. For example, bots have been part of the Wikimedia mix from the beginning, many designed to do "housekeeping tasks," such as adding links, correcting spelling errors, and undoing vandalism. A few Wikipedias maintain a bot policy requiring that bots be registered; on the English Wikipedia, they are overseen by a human board known as the Bot Approvals Group.
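
For illustration, the sketch below shows what a simple housekeeping task might look like, assuming the real pywikibot library, a configured bot account, and a hypothetical list of misspellings; on the English Wikipedia, any such bot would first need approval from the Bot Approvals Group.

```python
# A minimal sketch of a spelling-fixing housekeeping bot, assuming pywikibot
# is installed and a bot account is configured in user-config.py.
import pywikibot

# Hypothetical misspelling list; real bots use curated rule sets and
# word-boundary matching to avoid corrupting valid text.
TYPO_FIXES = {"recieve": "receive", "occured": "occurred"}

site = pywikibot.Site("en", "wikipedia")
for page in site.randompages(total=5):  # sample a handful of pages
    text = page.text
    for wrong, right in TYPO_FIXES.items():
        text = text.replace(wrong, right)
    if text != page.text:
        page.text = text
        page.save(summary="Bot: fixing common misspellings")
```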

This technology-augmented content creation may be just the beginning. Over the next ten to fifteen years, artificial intelligence tools are expected to grow significantly more sophisticated. Indeed, a recent review published in Scientific American declares, "It can be expected that supercomputers will soon surpass human capabilities in almost all areas—somewhere between 2020 and 2060."[5] Big data will also get even bigger and automated means of collecting and analyzing information will become more powerful.

Wikimedia already has an eye on this future, with tools such as the Objective Revision Evaluation Service (ORES), which helps editors spot potentially "damaging" edits based on previous assessments of article quality by Wikimedia editors. In a Reddit AMA from June 2017, Wikimedia Foundation's principal research scientist, Aaron Halfaker, noted both the strengths and the dangers of this approach to flagging edits. "Predictions can affect people's judgement," he writes. "If we have an AI with a little bit of bias, that can direct people to perpetuate the bias ... So we're very cautious about training on past behavior."[6]
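
As a concrete illustration, a tool or editor can ask ORES for a damaging-edit prediction over its public REST API. The sketch below is a minimal example assuming the v3 response layout documented in 2017; the revision ID is arbitrary.

```python
# Query ORES for the "damaging" model score of a single English Wikipedia
# revision; the revision ID below is a placeholder.
import requests

rev_id = 123456
url = f"https://ores.wikimedia.org/v3/scores/enwiki/?models=damaging&revids={rev_id}"
resp = requests.get(url, timeout=10)
resp.raise_for_status()

score = resp.json()["enwiki"]["scores"][str(rev_id)]["damaging"]["score"]
# e.g. prediction: False, probability: {'false': 0.97, 'true': 0.03}
print("damaging?", score["prediction"], score["probability"])
```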

The unprecedented surge in automation of knowledge creation and analysis brings both advantages and challenges.

On the plus side, these tools are helping information producers. For example, researchers are working on methods to apply machine intelligence tools to TV broadcasts in order to more easily identify speakers and talking points.[7] Developing ways to mine this type of information could become useful for Wikimedians evaluating video content when sourcing entries. In a report on the future of augmented journalism, the Associated Press notes that developments in machine intelligence will enable reporters (and others) to "analyze data; identify patterns, trends and actionable insights from multiple sources; see things that the naked eye can't see; turn data and spoken words into text; text into audio and video; understand sentiment; analyze scenes for objects, faces, text or colors—and more."[8]

AI can also help shape learning environments, directing users to appropriate knowledge sources based on data about how they have previously interacted with similar information resources, and revealing insights about how and when these resources are valuable.[9] As a result, for example, AI might even be used to assemble various Wikipedia articles into custom textbooks on the fly. It is not difficult to imagine how Wikimedia editors could deploy such advancements to strengthen Wikimedia content.
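
A toy version of that "custom textbook" idea can already be approximated with existing interfaces. The sketch below pulls plain-text summaries for a reading list through the Wikimedia REST API and stitches them into one document; the chapter list is hypothetical, and a real system would presumably use AI to choose and order the topics.

```python
# Assemble a rudimentary "textbook" from Wikipedia article summaries via the
# public REST API; the chapter list is an invented example.
import requests

chapters = ["Photosynthesis", "Cellular respiration", "Chlorophyll"]
sections = []
for title in chapters:
    url = f"https://en.wikipedia.org/api/rest_v1/page/summary/{title.replace(' ', '_')}"
    data = requests.get(url, timeout=10).json()
    sections.append(f"{data['title']}\n\n{data['extract']}")

print("\n\n----\n\n".join(sections))
```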

However, the development of new tools also can lead to more misleading content that could pose challenges when sourcing entries: "At corporations and universities across [the U.S.], incipient technologies appear likely to soon obliterate the line between real and fake….[A]dvancements in audio and video technology are becoming so sophisticated that they will be able to replicate real news—real TV broadcasts, for instance, or radio interviews—in unprecedented, and truly indecipherable, ways," predicts Vanity Fair writer Nick Bilton.[10]

In response, between now and 2030, the Wikimedia movement will need to remain vigilant and develop new methods of verification that match these new technological capabilities. In turn, this will mean that the process for determining verifiability and reliable sources might need to evolve—or that the movement may need to build its own corresponding tools to keep up with edits from competing interests.

As Kevin Kelly observes in The Inevitable: Understanding the 12 Technological Forces That Will Shape Our Future, "Wikipedia continues to evolve its process. Each year more structure is layered in. Controversial articles can be 'frozen' by top editors so they can no longer be edited by any random person, only designated editors. There are more rules about what is permissible to write, more required formatting, more approval needed. But the quality improves too. I would guess that in 50 years a significant portion of Wikipedia articles will have controlled edits, peer review, verification locks, authentication certificates and so on."[11]

Access

Technology also presents myriad obstacles to accessing the content served up on Wikimedia platforms. For example, the browser-based, computer-accessed model for Wikipedia is already challenged; depending on where in the world you live, you are likely to be reading and editing Wikipedia entries on a mobile device rather than a computer. This trend toward mobile is well underway and increasingly global.[12] (In future briefs, we'll explore how other developments in technology, such as the increasing sophistication of wearables and the rise of audio-based virtual assistants, will affect access to Wikipedia and other Wikimedia projects.)

Influence: government and politics

Governments and political actors have the power to both suppress and distort content, and to restrict access to Wikimedia platforms. For example, the Turkish Internet Regulator blocked access to all language versions of Wikipedia on April 29, 2017.

Content

Governments around the world can and do monitor and crack down on activists, journalists, academics, and other citizens who might otherwise create reliable source material for Wikimedia projects, or create and edit Wikimedia content themselves. Textbooks and reference content are also a target for repressive regimes, as noted in a June 2017 report from the U.S.-based political rights organization Freedom House.[13] In India, for example, model textbooks published by the National Council of Educational Research and Training have been accused of reflecting the political views of those in power.

This will continue to be a crucial challenge to realizing Wikimedia's vision over the next 15 years. The most recent analysis of press freedom worldwide from Reporters Without Borders declared that the globe is "darkening", as press freedom decreases in specific nations, and generally across the globe.[14]

Meanwhile, authoritarian governments continue to persecute and prosecute journalists. Turkey jailed 120 journalists between July and November 2016, following a failed coup against the government in power.[15] Reporters Without Borders argues that the "political rhetoric" used by U.S. President Donald Trump in the 2016 U.S. presidential elections has influenced political discourse on press freedom around the globe. For example, in Tanzania, President John Magufuli warned newspapers that "their days are numbered."[16] Such government repression not only reduces source material for Wikipedia editors, but can also have an overall chilling effect on those seeking to produce or verify information.

A related trend among governments and political actors is the purposeful propagation of misinformation, disinformation, or propaganda. This sort of manipulation may weaken the overall information ecosystem, creating a pervasive culture of doubt about the reliability of online information. This could have an impact on the reliability of sources, and therefore of content, on Wikipedia. The current global battle to identify and control misinformation seems likely to shape the information environment over the next 15 years. And not all disinformation is the same: confusion over what, exactly, constitutes "fake news" inspired Claire Wardle, research director for First Draft News, to create a taxonomy of "mis" and "dis" information, ranging from satire to propaganda to all-out fabricated accounts.[17]

For example, a 2016 paper published by RAND Corporation charges Russia with trafficking in a "firehose of falsehood," broadcasting "partial truths or outright fictions" on high numbers of communication channels, following a philosophy of quantity over quality.[18] BuzzFeed reported in January 2017 that in France, Trump supporters in the U.S. were posing as French citizens online to manipulate the French election outcomes.[19] Days before the French election, hackers breached servers for leading candidate (and eventual victor) Emmanuel Macron, in an attack likened to the theft of emails from the U.S. Democratic National Committee.[20]

This spread of misinformation online is occurring despite recent growth in the number of organizations dedicated to fact-checking: worldwide, at least 114 "dedicated fact-checking teams" are working in 47 countries.[21]

Looking into the future, what's safe to expect? First, global freedom of expression will wax and wane depending on national and international political developments. Less clear is whether global trends toward autocracy will continue—or whether free societies will have a resurgence, grappling successfully with pressures on the press and the academy, and with the politicization of facts as merely individual, biased perspectives.

Second, we can expect that politically motivated disinformation and misinformation campaigns will always be with us. Indeed, the phenomenon of "fake news," misinformation, or "alternative facts" can be traced to some of the earliest recorded history, with examples dating back to ancient times.[22]

The Wikimedia movement will need to remain nimble, and its editors will need to stay well-versed in the ever-changing means by which information can be misleading or falsified. It will be helpful to keep abreast of the verification techniques developed and used by journalists and researchers, such as those described in the Verification Handbook, available in several languages.[23]

Access

A May 2017 study commissioned by the Wikimedia Foundation from Harvard's Berkman Klein Center shows that outright censoring of the Wikipedia platform or articles is trending down globally, and was doing so even before the Wikimedia Foundation in June 2015 deployed "HTTPS" technology that makes blocking individual pages more difficult.[2]

With HTTPS, censors can't see which page within a website was visited, so they must choose whether to block access to an entire site in order to restrict access to any single page. The Berkman Klein study drew on client-side availability monitoring data collected from 40 countries, as well as server-side data used to find anomalies in requests for a set of 1.7 million articles covering 286 Wikipedia language projects.[2]
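
To make the server-side approach concrete, the toy sketch below flags days on which request counts for an article fall far below a recent baseline, the kind of anomaly that can indicate blocking. The counts are invented, and the study's actual methods are considerably more sophisticated.

```python
# Flag suspicious drops in daily request counts using a simple z-score;
# the data are fabricated for illustration.
from statistics import mean, stdev

daily_requests = [980, 1010, 995, 1020, 990, 1005, 120, 95, 110]
baseline = daily_requests[:6]          # assume the first six days are "normal"
mu, sigma = mean(baseline), stdev(baseline)

for day, count in enumerate(daily_requests):
    z = (count - mu) / sigma
    if z < -3:  # far below baseline: a possible censorship signal
        print(f"day {day}: {count} requests (z = {z:.1f}), possible blocking")
```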

However, while the overall trend may be down, the analysis did show that certain governments, such as China, are continuing to censor, with full censorship of the zh.wikipedia.org domain through 2016. (Access appears to be allowed to other Wikimedia subdomains.) The analysis also found anomalies that could indicate censorship, but have not been explained, such as Thailand's apparent blocking of Yiddish-language Wikipedia.[2]

Overall, the authors conclude: "[T]he shift to HTTPS has been a good one in terms of ensuring accessibility to knowledge." A technological change on the back end, in other words, has improved access on the front end. In general, technological solutions (HTTPS) to technological problems (outright blocking) can sometimes bring relief—until the next technological challenge emerges.[2]

Influence: Commerce

Commercial actors may also influence both access and content for the Wikimedia movement, which deliberately models a platform that is free and open to contributions.

Content

The rise of commercial social media platforms such as Twitter and Facebook over the last decade has been accompanied by a concurrent decline of, and declining trust in, traditional media institutions. This is true even within open societies that once enjoyed a competitive and productive press sector, raising concerns about the new ways misinformation is filtered, delivered online, and used in public discourse and decision-making.

There is also some overlap with other challenges noted above—for example, spurious technology claims may be disseminated in order to drive up stock prices, or misinformation campaigns aimed at affecting public policy may be run for profit-seeking reasons, as when companies or industry groups pay for research to prove a policy point. For example, an examination of industry documents over time showed that the U.S. sugar industry throughout the 1960s and 1970s sponsored research that promoted dietary fat as a bigger health risk than sugar.[24]

Online platforms have recently announced steps to address the dissemination of misinformation. Google has introduced changes to its search function, returning fact-checking organizations' content alongside results. It has also introduced ways to report problematic results (e.g., autocomplete suggestions that question whether the Holocaust happened).[25] Facebook has offered fact-checks on articles based on their URLs, along with tips on media literacy.[26]

Feature and functionality changes by these large platforms in the coming years may both inform and compete with parallel approaches developed for the Wikimedia projects.

The next frontier in understanding how to combat misinformation involves developing a more sophisticated grasp on how networks help to spread it. The Europe-based Public Data Lab released the first several chapters of a "Field Guide to Fake News," which emphasizes the importance of building tools that reveal the "thick web" of how fake news spreads online—showing how a story or idea is shared across networks to help viewers put it in context.[27]
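
In that spirit, a minimal sketch of the idea: model shares as a directed graph and trace everything downstream of a story's origin. The accounts and edges below are invented, and real analyses work at a far larger scale with messier data.

```python
# Trace the downstream "reach" of a story through a hypothetical share graph.
import networkx as nx

shares = [  # (sharer, resharer) pairs, all invented
    ("origin_site", "account_a"),
    ("account_a", "account_b"),
    ("account_a", "account_c"),
    ("account_c", "account_d"),
]
g = nx.DiGraph(shares)

print("accounts reached:", sorted(nx.descendants(g, "origin_site")))
print("biggest amplifier:", max(g.nodes, key=g.out_degree))
```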

The next level of innovation may involve ubiquitous fact-checking—a solution that could potentially leverage Wikipedia content as a key source. For example, the nonprofit organization Hypothes.is continues to develop an open platform that supplies a layer on the web where users can create annotations that give context to the content.[28] In addition, "big data" can be harnessed to help provide context to public discourse, for example by tapping into data about the flow of money between political entities engaged in misinformation.[Note 2]
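
As a small illustration of that annotation layer, the sketch below fetches public annotations for a page through the Hypothes.is search API; the target URL is an arbitrary example.

```python
# List a few public Hypothes.is annotations attached to a given web page.
import requests

params = {"uri": "https://en.wikipedia.org/wiki/Misinformation", "limit": 5}
resp = requests.get("https://api.hypothes.is/api/search", params=params, timeout=10)
resp.raise_for_status()

for row in resp.json().get("rows", []):
    print(row.get("user"), "->", (row.get("text") or "")[:80])
```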

With the infusion of philanthropic investment and widespread experimentation, there is a chance that social media networks or online information companies such as Google may successfully tweak algorithms to prevent some of the most widespread sharing of false information online. However, new technologies are likely to inspire profit-seekers and political actors to develop new ways to manipulate systems, and extend these manipulations beyond text to other media, such as data, videos, photos, and more.

Access

Concerns over how commercial companies could limit access to Wikimedia platforms over the next 15 years will be addressed in forthcoming briefs about the future of the commons, and the emergence and use of new platforms. These include but aren't limited to battles over net neutrality, the rise of proprietary apps and platforms, and corporations' willingness (or unwillingness) to provide access to Wikipedia content from within their own content properties and devices.

Concluding thoughts and questions

How might Wikimedia plan for combating misinformation and censorship in the decades to come?

  • Encourage and embrace experiments in artificial intelligence and machine learning that could help enrich Wikipedia content.
  • Track developments in journalism and academia for new ways to fact-check and verify information that may be used as sources on Wikimedia platforms, such as techniques for evaluating video and other new media, which may also prove valuable as content.
  • Collaborate with other public interest organizations to advocate for press freedom, free speech, universal internet access, and other policy goals that ensure access and the free flow of information.
  • Continue to carefully monitor access to Wikimedia platforms from around the globe, deploying technical changes where appropriate.
  • Monitor the solutions being developed by commercial platforms and publications, both to see how they might be applied to improving content verification methods on Wikimedia platforms, and to see whether they might offer opportunities for increasing access to that content.

Notes

  1. Although many of the links on this page point to the English Wikipedia, many of these policies have in fact been written on Wikipedias in other languages as well.
  2. See, for example, data available from the Center for Responsive Politics on campaign donors, lobbyists, and more in the U.S.; and Transparency International's data on lobbyists in the European Union as well as other data.

References

  1. "কিভাবে বহিঃস্থ সংস্থা বা প্রতিষ্ঠান উইকিমিডিয়া আন্দোলনকে প্রভাবিত করছে? – উইকিমিডিয়া ব্লগ". Retrieved 2017-07-13. 
  2. a b c d e Clark, Justin, Robert Faris, Rebekah Heacock Jones. Analyzing Accessibility of Wikipedia Projects Around the World. Cambridge: Berkman Klein Center for Internet & Society, 2017. Accessed May 25, 2017.
  3. Alcantara, Chris. "The most challenging job of the 2016 race: Editing the candidates' Wikipedia pages." Washington Post. October 27, 2016. Accessed May 25, 2017.
  4. Kiberd, Roison. "The Brutal Edit War Over a 3D Printer's Wikipedia Page." Motherboard. March 23, 2016. Accessed June 1, 2017.
  5. Helbing, Dirk, Bruno S. Frey, Gerd Gigerenzer, Ernst Hafen, Michael Hagner, Yvonne Hofstetter, Jeroen van den Hoven, Roberto V. Zicari, and Andrej Zwitter, "Will Democracy Survive Big Data and Artificial Intelligence?" Scientific American. February 25, 2017. Accessed May 28, 2017. https://www.scientificamerican.com/article/will-democracy-survive-big-data-and-artificial-intelligence/.
  6. Halfaker, Aaron. "I'm the principal research scientist at the nonprofit behind Wikipedia. I study AIs and community dynamics. AMA!" Reddit. June 2, 2017. Accessed June 7, 2017.
  7. Watzman, Nancy. "Internet Archive's Trump Archive launches today." Internet Archive Blogs. January 5, 2017. Accessed May 19, 2017.
  8. Marconi, Francesco, Alex Siegman, and Machine Journalist. The Future of Augmented Journalism. New York: Associated Press, 2017. Accessed May 30, 2017.
  9. Luckin, Rose, Wayne Holmes, Mark Griffiths, and Laurie B. Forcier. Intelligence Unleashed: An Argument for AI in Education. London: Pearson, 2016. Accessed June 8, 2017.
  10. Bilton, Nick. "Fake news is about to get even scarier than you ever dreamed." Vanity Fair. January 26, 2017. Accessed May 30, 2017.
  11. Kelly, Kevin. The Inevitable: Understanding the 12 Technological Forces That Will Shape Our Future. New York: Viking, 2016.
  12. GSMA. "The Mobile Economy 2017." Accessed June 1, 2017.
  13. Puddington, Arch. Breaking Down Democracy: Goals, Strategies, and Methods of Modern Authoritarians. Washington, D.C.: Freedom House, 2017. Accessed June 8, 2017.
  14. Reporters Without Borders. "2017 World Press Freedom Index – tipping point?" April 26, 2017. Updated May 15, 2017. Accessed May 28, 2017.
  15. Nordland, Rod. "Turkey's Free Press Withers as Erdogan Jails 120 Journalists." The New York Times. November 17, 2016. Accessed June 7, 2017.
  16. Reporters Without Borders. "Journalism weakened by democracy's erosion." Accessed May 29, 2017.
  17. Wardle, Claire. "Fake News. It's Complicated." First Draft News. February 16, 2017. Accessed June 7, 2017.
  18. Paul, Christopher and Miriam Matthews. The Russian "Firehose of Falsehood" Propaganda Model: Why It Might Work and Options to Counter It. Santa Monica: RAND Corporation, 2016.
  19. Broderick, Ryan. "Trump Supporters Online Are Pretending To Be French To Manipulate France's Election." BuzzFeed. January 24, 2017. Accessed June 7, 2017.
  20. Tufekci, Zeynep. "Dear France: You Just Got Hacked. Don't Make The Same Mistakes We Did." BuzzFeed. May 5, 2017. Accessed June 7, 2017.
  21. Stencel, Mark. "International Fact-Checking Gains Ground, Duke Census Finds." Duke Reporters Lab. February 28, 2017. Accessed June 7, 2017. https://reporterslab.org/international-fact-checking-gains-ground/.
  22. Darnton, Robert. "The True History of Fake News." The New York Review of Books. February 13, 2017. Accessed June 7, 2017.
  23. Silverman, Craig, ed. Verification Handbook: A Definitive Guide to Verifying Digital Content for Emergency Coverage. Maastricht: European Journalism Centre, 2016. Accessed May 29, 2017.
  24. Kearns, Cristin E., Laura A. Schmidt, and Stanton A.Glantz. "Sugar Industry and Coronary Heart Disease Research: A Historical Analysis of Internal Industry Documents." JAMA Intern Med 176, no. 11 (2016): 1680-1685. Accessed June 8, 2017. doi:10.1001/jamainternmed.2016.5394.
  25. Gomes, Ben. "Our latest quality improvements for search." The Keyword. Google. April 25, 2017. Accessed May 19, 2017.
  26. Simo, Fidji. "Introducing: the Facebook Journalism Project." Facebook Media. January 11, 2017. Accessed May 19, 2017.
  27. Public Data Lab and First Draft News. "A Field Guide to Fake News." Accessed May 19, 2017.
  28. The Hypothesis Project. "To Enable a Conversation Over the World's Knowledge: Hypothesis Mission." Accessed 22 May 2017.