Why False News Spreads Like Wildfire


How is it that “false news” is so easily dispersed, accepted, and promulgated?

Oxford University’s Computational Propaganda Project looked into the social media phenomenon of “junk news” — which they define as “the tabloidization, false content, conspiracy theories, and political propaganda assaulting us on a daily basis.”

Reason #1: Algorithms
Search and ranking algorithms are a basic element of our experience of the Internet today. Without them, we would have to sort through massive amounts of data ourselves. The fact that algorithms prioritize certain content is not a revelation: individuals and businesses have long tried to manipulate or “game” these systems for marketing purposes. What is new is that these business and marketing techniques are now being channeled into the political realm.

Social media platforms depend on algorithms to determine how news and content are disseminated and consumed. The information delivered through Facebook’s News Feed, Google’s search results, and Twitter’s trending topics is selected and prioritized by complex algorithms coded to sort, filter, and deliver content in a way designed to maximize users’ engagement and time spent on the platform.

Unfortunately, the “ethics” of how algorithms select and prioritize information have been heavily criticized: instead of promoting the free flow and transparent exchange of ideas that is necessary for a healthy democracy, the personalization of content has created filter bubbles that limit information flows and perpetuate bias.

Additionally, most of the filtering of information that takes place on social media is not the product of the conscious choices of human users. Instead, what we see on our social media feeds and in our Google search results is the product of calculations made by powerful algorithms and machine learning models. These bits of code make decisions for us and about us by personalizing content and tailoring search results to reflect our individual interests, past behaviors, and even geographic location.

Algorithmic content curation has important consequences for how individuals find the news and political information a healthy democracy depends on. Instead of human editors selecting important sources of news and information for public consumption, complex algorithmic code determines what information to deliver or exclude. Popularity and the degree to which information provokes outrage, confirms bias, or drives engagement increasingly determine its spread. The speed and scale at which content “goes viral” grow accordingly, regardless of whether or not the information it contains is true.
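
To make that mechanism concrete, here is a minimal, purely illustrative sketch of engagement-driven ranking. Every field name and weight below (predicted_clicks, outrage_score, the 0.6/0.4 split) is invented for illustration; real platform ranking systems are proprietary and vastly more complex. The point is simply that nothing in the objective function rewards, or even checks, accuracy.

```python
# Toy model of engagement-driven feed ranking. All fields and weights
# are hypothetical; real systems are proprietary and far more complex.
from dataclasses import dataclass

@dataclass
class Post:
    title: str
    predicted_clicks: float  # modeled likelihood a user clicks
    outrage_score: float     # modeled emotional provocation
    is_accurate: bool        # note: never consulted by the scorer

def engagement_score(post: Post) -> float:
    # The score rewards clicks and provocation; truth plays no part.
    return 0.6 * post.predicted_clicks + 0.4 * post.outrage_score

def rank_feed(posts: list[Post]) -> list[Post]:
    # Content is ordered by predicted engagement, not accuracy.
    return sorted(posts, key=engagement_score, reverse=True)

feed = rank_feed([
    Post("Measured policy analysis", 0.2, 0.1, True),
    Post("SHOCKING claim about a candidate!", 0.9, 0.95, False),
])
print([p.title for p in feed])  # the false, outrage-bait post ranks first
```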

Although the Internet has provided more opportunities to access information, algorithms have made it harder for individuals to find information from critical or diverse viewpoints.

Reason #2: Advertising
Social media platforms are built on collecting user data and selling it to companies, enabling those companies to better understand populations of users and to craft and deliver micro-targeted messages to them. This is why social media accounts are “free” to use: individuals who sign up for these services pay with their personal information.

This advertising model contributes to the spread of junk news in two important ways. First, the advertising model itself rewards viral content, which has given rise to clickbait. Clickbait is content designed to attract attention — often by stimulating outrage, curiosity, or both — in order to encourage visitors to click on a link to a webpage.

The economics of clickbait help explain why so many stories around the events of 2016 and 2017 were designed to provoke particular emotional responses that increase the likelihood, intensity, and duration of engagement with the content. In practice, one effective way to do this has been to play to people’s existing biases and sense of outrage when their identity or values are perceived to be threatened. This has directly fueled the rise of exaggerated, inaccurate, misleading, and polarizing content.

The second way that social media’s data-based advertising model contributes to the spread of junk news is by empowering various actors to micro-target potential voters, with very little transparency or accountability around who sponsored the advertisements or why. Instead of encouraging users to go to a certain restaurant or buy a particular brand, political campaigns and foreign operatives have used social media advertising to target voters with strategic, manipulative messages.
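
To illustrate the targeting mechanism, here is a toy sketch, again with invented data and field names, of how an audience might be narrowed to a micro-targeted slice. Real ad platforms expose far richer targeting criteria; the sketch only shows why a message delivered this way is invisible to everyone outside the slice, which is what makes sponsorship and intent so hard to audit.

```python
# Toy illustration of ad micro-targeting: select users whose profile
# attributes match an advertiser's criteria. All data is hypothetical.
users = [
    {"id": 1, "age": 52, "region": "OH", "interests": {"hunting", "trucks"}},
    {"id": 2, "age": 24, "region": "CA", "interests": {"climate", "cycling"}},
    {"id": 3, "age": 47, "region": "OH", "interests": {"hunting", "fishing"}},
]

def match_audience(users, region, min_age, interest):
    # Return only the users whose collected data fits the targeting spec.
    return [u for u in users
            if u["region"] == region
            and u["age"] >= min_age
            and interest in u["interests"]]

# Only users 1 and 3 ever see the ad; no one else can scrutinize it.
targeted = match_audience(users, region="OH", min_age=40, interest="hunting")
print([u["id"] for u in targeted])  # [1, 3]
```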

Reason #3: Exposure
While algorithms and advertisements filter and deliver information, users also select what they want to see or ignore. Scholars have emphasized the important role that individuals play in exercising their information preferences on the Internet. Online friend networks often perform a social filtering of content, which diminishes the diversity of information that users are exposed to. Academic studies have demonstrated that people are more likely to share information with their social networks that conforms to their pre-existing beliefs, deepening ideological differences between individuals and groups. As a result, voters do not get a representative, balanced, or accurate selection of news and information during an election, nor is important information distributed evenly across the voting population.

What might explain why people selectively expose themselves to political news and information? The partisanship explanation theorizes that people pay attention to political content that fits an ideological package that they already subscribe to. If they’ve already expressed a preference for a particular candidate, they will select messages that strengthen, not weaken, that preference. Essentially, this means that voters tend not to change political parties or favored candidates because they are unlikely to voluntarily or proactively acquire radically new information that challenges their perspectives and undermines their preferences.

Another explanation for selective exposure focuses on one’s “schemata” — cognitive representations of generic concepts with consistent attributes that can be applied to new relationships and new kinds of information (Fiske and Kinder 1983). While the partisanship explanation emphasizes deference to already-preferred political figures and groups, the schemata explanation emphasizes that we take cognitive shortcuts and depend on ready-made prior knowledge. Basically, humans are prone to mental shortcuts based on what we know and like.

A third possibility is that we rely on selective exposure because we don’t want to face the cognitive dissonance of confronting radically new and challenging information. There is minimal research into this explanation, but it is plausible: investigations of “context collapse” have revealed that people have very real, jarring experiences when presented with unexpected information and social anecdotes over digital media. Ever seen a rant on Facebook? People can get pretty worked up over false stories without taking the time to verify them.

This article is adapted from Samantha Bradshaw and Philip N. Howard, Why Does Junk News Spread So Quickly Across Social Media? Algorithms, Advertising and Exposure in Public Life (Knight Foundation, March 2018), and is part of a white paper series on media and democracy commissioned by the John S. and James L. Knight Foundation. Samantha Bradshaw is a D.Phil. candidate at the Oxford Internet Institute, a researcher on the Computational Propaganda Project, and a Senior Fellow at the Canadian International Council (CIC). Philip N. Howard is a professor of Internet Studies at Oxford University. This article first appeared on Medium and is published courtesy of the Knight Foundation.


EnhancedTECH is a Premier Provider of Managed IT Services and Security. If your business needs assistance, please give us a call at 714-970-9330 or contact us at sales@enhancedtech.com.

Samantha Keller

Director of Marketing and PR at EnhancedTECH
Samantha Keller (AKA Sam) is a published author, tech blogger, event planner, and mother of three fabulous humans. Samantha has worked in the IT field for the last fifteen years, intertwining a freelance writing career with technology sales, events, and marketing. She began working for EnhancedTECH ten years ago after earning her bachelor’s degree from UCLA and attending Fuller Seminary. She is a lover of kickboxing, extra-strong coffee, and Wolfpack football. Her regular blog columns feature upcoming tech trends, cybersecurity tips, and practical solutions geared toward enhancing your business through technology.
