Feature

How to Fight Fake News—Cognitive biases can make discerning truth from fiction difficult, but recent research suggests there may be hope


By Jeff Reeves

Precept 1 of the Code of Professional Conduct that all practicing U.S. actuaries must adhere to says, in part, that actuaries should “act honestly, with integrity and competence” and that they should avoid “dishonesty, fraud, deceit, or misrepresentation.”

Those are indeed key principles for the profession. But perhaps more fundamentally, if you ask a random passer-by on the street, chances are that person would agree these traits are universally important regardless of age, profession, or circumstance.

The sad reality in 2019, however, is that many people—particularly in America—appear to have a fraught relationship with the truth.

According to an estimate published[1] by the National Bureau of Economic Research, the average American was exposed to as many as 14 news stories conveying misleading or downright false information during the 2016 elections.

Furthermore, a recent Ipsos study[2] on the phenomenon of “fake news” found that 77 percent of Americans believe “the average person in the country lives in a bubble on the internet”—well above the global average of 65 percent—but amusingly only 32 percent think such intellectual myopia applies to their own personal behavior.

It’s no wonder, then, that Oxford Dictionaries selected “post-truth” as its 2016 Word of the Year,[3] in large part because of the pervasiveness of half-truths and outright lies that fueled successful political campaigns. Nor is it a surprise that Yale Law School’s list of the most notable quotes of 2017 included the proliferating use of “alternative facts.”[4]

Those who still value truth and objectivity—chief among them the scrupulous actuaries who rely on hard data and empirical evidence—may find these circumstances daunting.

But there is hope for the truth in 2019.

And the facts—if you can believe them, of course—show that many of the challenges humans have in consuming and analyzing information are neither a recent result of technology nor a signal that party politics have destroyed American democracy.

The Country That Cried ‘Fake News’

To begin with, it’s worth exploring the difference between materially false information and the presentation of opinions that just happen to differ from your personal point of view.

Consider that in a book by a French philosopher lauding the potential intellectual advancements a free press offers, then-President John Adams scrawled in the margin[5] of his copy a caustic retort: “There has been more new error propagated by the press in the last ten years than in an hundred years before 1798.” At the same time, Adams’ chief political rival, Thomas Jefferson, was equally critical of the press, saying, “Nothing can now be believed which is seen in a newspaper.”[6]

Those are rather remarkable points of view from Jefferson and Adams, considering their intimate associations with the news media of the day; Jefferson helped found the National Gazette newspaper in the 1790s as a counterpoint to the federalist Gazette of the United States, which regularly published the views of his rival John Adams.

It is logical, then, to assume that Jefferson was not characterizing the work of the newspaper he championed as error-prone, but rather the writings of those like Adams who disagreed with him in print, and vice versa.

After all, newspapers in the late 1700s were highly opinionated and made no attempt at balanced coverage. And being a political leader at the end of the 18th century, during the golden age of the party press and during a time of great change in the nascent republic, sometimes meant entire publications were organized by dissenters to dismantle the opposition and give voice to their particular point of view.

Historical archives show that this kind of coverage was simply the price of participating in the rough-and-tumble politics of early American democracy. But it certainly didn’t stop America’s leaders from labeling persistent unfavorable coverage as “fake news.”

Of course, the party press of the 1790s isn’t entirely analogous to today’s polarized media environment. And that’s less a function of tone and tactics than it is of technology.

Though Gutenberg’s movable-type press became the first vehicle for truly mass communication in Europe, and soon after in America, the reach of a newspaper in the early United States was decidedly limited. Literacy rates were much lower, access to printing presses was restricted to those with money and connections, and finished print products moved slowly around the nation.

In contrast, today social media makes anyone a de facto publisher and smartphone newsfeeds are updated every second.

As a result, fringe groups thrive in 2019. That includes groups untethered from political agendas, such as conspiracy theorists at The Flat Earth Society who believe a “planar conspiracy” exists to perpetuate the lie of a round earth and contend that “gravity as a theory is false.”[7]

It may sound absurd that such a group can persist in an age of such advanced scientific tools. But in the end, the Flat Earth Society’s justification of its views is similar to Thomas Jefferson’s: don’t believe the “fake news” you read in the newspapers.

Cognitive Biases

A digital age has certainly brought about unique challenges to publishers and media consumers alike. But our founding fathers show that the desire to reject opposing points of view out of hand is centuries old—and more recent psychological research shows it may be a trait that has plagued humans since the very beginning.

In the 1950s, an intellectual movement began to explore what would come to be known as “cognitive science.” The field grew to include the work of behaviorist B.F. Skinner, who explored the impact of positive and negative reinforcement on behavior, as well as artificial intelligence researchers such as MIT’s AI Laboratory co-founder Marvin Minsky.

At its core, cognitive science is the study of how our minds process and transform information. And over the past few decades, researchers have found that a number of glitches are hard-wired into the human brain in ways that adversely affect our ability to seek out objective truth.

These include, but are not limited to, the following areas:

  • Confirmation Bias: Confirmation bias is the phenomenon where we see what we want to see, reaching a preconceived conclusion and then fitting facts in place to support it. One famous example of confirmation bias at work involved a study of Dartmouth and Princeton students in which participants watched a particularly violent football game where both quarterbacks left the field with injuries. When asked who was most responsible for the rough play, most students predictably said the other school was to blame. The study’s authors presented the findings[8] as proof that “out of all the occurrences going on in the environment, a person selects only those that have some significance for him from his own egocentric position.”
  • Optimism Bias: Nobody wants to believe they are inferior, and as a result most humans overestimate their ability to do just about anything. However, there are countless studies illustrating just how absurdly optimistic humans as a group can be. More than 90 percent[9] of college professors said they had above-average classroom skills in one famous 1977 survey, and a 1986 study indicated that up to 80 percent of drivers rated themselves above average behind the wheel. It doesn’t take an actuary to know that the optimistic perceptions self-reported in these studies are highly unlikely to be accurate.
  • Self-Serving Bias: Just as humans like to be good at things, we also like to believe that good things have happened because of our hard work and attention—and, of course, that failures are because other people were dragging us down. In one 1986 study,[10] subjects were organized into pairs with one leader and one subordinate and then given performance feedback on tasks. When a low performance grade was recorded, the leaders as a group largely blamed the subordinate—while the subordinates as a group largely blamed the leaders.
  • Recency Bias: Humans are good at learning from history, but unfortunately we place outsized value on what has happened most recently, regardless of whether those events are likely to happen again anytime soon. One of the most common manifestations of this tendency is in investing, where folks willingly buy stocks at the top of the market simply because they have run up so much recently, then sell in a panic when it is too late and the damage has already been done. Consider a recent Gallup poll[11] that showed 52 percent of adults younger than age 35 owned stocks in the seven years leading up to the financial crisis and market crash of 2008. By 2017 and 2018, only 37 percent did, despite the fact that “older Americans, who had seen the market recover from previous shocks, have been more willing to hold on to their stocks.”
  • A Bias for Bias: Perhaps most pernicious of all is our bias to believe what we already believe—regardless of how those impressions were formed. In a 1975 study,[12] Stanford researchers gave purported suicide notes to test groups and asked them to identify which ones were real and which ones were fake. Researchers then fabricated scores, telling some groups they were incredibly accurate at identifying real notes and telling other groups they were way off. After awarding these divergent scores, researchers openly admitted the results were made up and explained that they really wanted to know whether the students still believed they were right in their analysis. Those given artificially high scores remained significantly more likely to believe they were good at discerning real notes from fake ones, while those given low scores believed they were significantly worse than their peers. In reality, the groups’ performance was not materially different.

Lies That Go Viral

When you marry a partisan media environment with our ingrained cognitive biases, it’s easy to understand why American discourse is where it is today. The typical person is eager to accept any information—regardless of its veracity—that fits their personal worldview, and the wide variety of information on the internet makes it easy to build your own echo chamber. Similarly, it’s also easy to understand why so many of us credulously disseminate that one-sided viewpoint on social media, keeping the cycle going for others.

Humans are imperfect, so it’s only natural our communication is imperfect as well. However, in 2019 we face a new wrinkle in the form of willful deceptions that take on a life of their own thanks to the speed and ease of digital communications.

Russia’s Internet Research Agency, a propaganda arm of the government, is believed to have opened up 99 accounts on photo-sharing app Instagram that lured more than 600,000 Americans into following its posts during the 2016 election cycle.[13] Furthermore, in 2018 the Justice Department charged 13 Russians and three Russian companies[14] with conspiracy, identity theft, failing to register as foreign agents, and other criminal charges related to a broad campaign targeting U.S. voters with misinformation.

It’s not just Russia, either.

Far-right conspiracy theorist Alex Jones founded his divisive website InfoWars in 1999 and has peddled outrageous fabrications for almost two decades. Some segments have been merely absurd, such as a claim[15] that “the majority of frogs in most areas of the United States are gay” because of a chemical “gay bomb” used by the Pentagon. Others have been inflammatory, such as a debunked and offensive assertion that the shooting deaths of 26 students and teachers at Sandy Hook Elementary School in 2012 were a hoax staged by “crisis actors.”

And in mid-2018, InfoWars boasted 1.4 million visits each day to its website, videos, and Facebook pages, according to a New York Times analysis.[16]

The ingrained biases of American media consumers assuredly lead them to readily accept false information put in front of them and ignore reality. In fact, 46 percent[17] of Americans polled in July did not believe Russia meddled in the 2016 election, despite clear assertions by organizations such as the Central Intelligence Agency, the Federal Bureau of Investigation, and the Justice Department that Russia indeed interfered in the process.

But it’s worth acknowledging it is not enough to simply encourage people to be skeptical when there are such active and aggressive campaigns of misinformation.

There are early signs that legislators and business leaders understand these risks to the system. Internet giants including Facebook willingly testified before lawmakers about Russia’s election interference, with CEO Mark Zuckerberg pledging, “I don’t want anyone to use our tools to undermine democracy.”[18]

There have also been efforts to ban Alex Jones and InfoWars from various outlets, with Apple’s iTunes taking down his podcast and YouTube banning his video channel. Families of Sandy Hook victims have also mounted a legal fight aimed at punishing Jones and InfoWars for their past behavior.

These are not comprehensive solutions, however. Alex Jones is still independently publishing on InfoWars.com, and one can argue the concerns over 2016 election interference are as much about how easy it is for any outside actor to influence American democracy as they are about Russia’s motivations three years ago.

Equally disturbing is that the recent scrutiny of Alex Jones and Russian misinformation campaigns has had a chilling effect on the entire media landscape. These genuine instances of “fake news” have convinced some that it is difficult to believe anything at all.

According to a Northeastern University study on media consumption,[19] almost half of the nearly 6,000 American college students surveyed said they lacked confidence in discerning real from fake news on social media. Additionally, 36 percent of them said the threat of misinformation made them trust all media less.

Those are disturbing facts that show what’s at stake in the age of fake news, and why skepticism alone may not be enough.

Will the Truth Win Out?

All this may sound like quite a challenge for American media, and perhaps even American democracy. It is certainly a challenge for all of us, actuaries included. The reality, though, is that technologies that tear down barriers have always been disruptive—and our modern digital media environment is just the latest proof of this.

But regulators and government officials are starting to catch up. Beyond the much-publicized hearings before the U.S. Congress, internet companies have been taken to task for their behavior globally, and in March 2018 the European Commission published a report[20] on misinformation in the 21st century and committed itself to both media literacy and continued research into the impact of “fake news” going forward.

Beyond public policy responses, there is also an encouraging history of various media businesses self-policing in an effort to maintain the public trust—as well as their business models. From voluntary schemes such as the Motion Picture Association of America’s film rating system to the Recording Industry Association of America’s efforts to crack down on piracy to the Poynter Institute’s fact-checking arm PolitiFact, there are many examples of industry-led efforts to clean up media without trampling on anyone’s First Amendment right to free speech.

And if employing a healthy dose of skepticism as we consume information is always a good thing, then it’s worth questioning whether misinformation is indeed as big a problem as certain headlines would have us believe. For instance, a recent study in the journal Science found that only 5 percent of political content shared on the social media platform Twitter could be defined as “fake news.” Furthermore, the study found that a mere 1 percent of users consumed a massive 80 percent of fake content on Twitter.[21]

A separate study by New York University and Princeton researchers focused on Facebook and found that less than 9 percent of links shared during the 2016 election fell under the definition of “fake news.”[22]

These two studies seem to indicate that while a small portion of the population may be quite gullible, the vast majority of Americans are reasonably discerning consumers of media.

Lastly, it’s also important to acknowledge that the typical person encounters plenty of misinformation and half-truths every day but still manages to be a functional member of society. That’s because while the human mind is hardwired for bias, that does not mean people are incapable of changing their minds or understanding a different point of view.

Consider that reliance on fake news was found to be prevalent among those who engage in “less analytic and less actively open-minded thinking,” according to 2018 research published in the Journal of Applied Research in Memory and Cognition.[23] As an important sign of hope, that research noted that simple “interventions” encouraging more analytic behaviors can reduce a person’s willingness to accept outlandish headlines as real and make them a more discerning media consumer.

Similarly, it has been shown that “those with segregated social networks are significantly more likely to believe ideologically aligned articles, perhaps because they are less likely to receive disconfirmatory information from their friends,” according to one study.[24] That means simply widening your communication network may be enough to help shut down fake news.

The challenges in today’s media environment are real and go far beyond any one politician’s casual relationship with the truth.

But the fact that the American public has at least acknowledged the pernicious phenomenon of fake news and has continually explored the topic in recent years is in itself a sign of hope that the truth may ultimately win out.

Many actuaries have seen the John Ruskin quote, “The work of science is to substitute facts for appearances, and demonstrations for impressions,” and thought it described, or should describe, what they do.

Together with others who emphasize objectivity and independence in their approach to their lives at work and elsewhere, actuaries can help us all.

 

JEFF REEVES is a financial journalist with almost two decades of newsroom and markets experience. His commentary has appeared in USA Today, U.S. News & World Report, CNBC, and the Fox Business Network.

 

References

[1] “Social Media and Fake News in the 2016 Election”; National Bureau of Economic Research; January 2017.

[2] “Fake News, Filter Bubbles, and Post-Truth Are Other People’s Problems”; Ipsos; Sept. 5, 2018.

[3] “Word of the Year 2016 is…”; Oxford Living Dictionaries; November 2016.

[4] “‘Alternative Facts’ Remark Tops 2017 List of Notable Quotes”; U.S. News; Dec. 12, 2017.

[5] James Madison and the Spirit of Republican Self-Government, by Colleen A. Sheehan; Cambridge University Press; 2009.

[6] “Memo to Donald Trump: Thomas Jefferson invented hating the media”; The Washington Post; Feb. 18, 2017.

[7] “About the Flat Earth Society: FAQ”; TheFlatEarthSociety.org; 2016.

[8] “The Hastorf and Cantril Case Study”; Explorable.com; May 2010.

[9] “Not can, but will college teaching be improved?”; New Directions for Higher Education; 1977.

[10] “Self-Serving Biases in Leadership: A Laboratory Experiment”; Journal of Management; Dec. 1, 1986.

[11] “Young Americans Still Wary of Investing in Stocks”; Gallup; May 4, 2018.

[12] “Clinging to Beliefs: A Constraint-satisfaction Model”; Department of Psychology at McGill University.

[13] “Russian trolls reached hundreds of thousands of US Instagram users before Facebook removed them on eve of midterms”; CNBC; Nov. 13, 2018.

[14] “Special counsel indicts Russian nationals for interfering with U.S. elections and political processes”; USA Today; Feb. 16, 2018.

[15] “Alex Jones’ 5 most disturbing and ridiculous conspiracy theories”; CNBC; Sept. 14, 2018.

[16] “Alex Jones Said Bans Would Strengthen Him. He Was Wrong.”; The New York Times; Sept. 4, 2018.

[17] “Poll: 60 percent of Americans say Russia meddled in 2016 election”; Politico; July 18, 2018.

[18] “Facebook is promising major ad changes to stop Russia and other foreign actors from influencing U.S. elections”; Recode; Sept. 21, 2017.

[19] “Faced with a daily barrage of news, college students find it hard to tell what’s real and what’s ‘fake news’”; News @ Northeastern; Oct. 16, 2018.

[20] “Final report of the High Level Expert Group on Fake News and Online Disinformation”; European Commission; March 12, 2018.

[21] “Fake news on Twitter during the 2016 U.S. presidential election”; Science; Jan. 25, 2019.

[22] “Less than you think: Prevalence and predictors of fake news dissemination on Facebook”; Science Advances; Jan. 9, 2019.

[23] “Belief in Fake News is Associated with Delusionality, Dogmatism, Religious Fundamentalism, and Reduced Analytic Thinking”; Journal of Applied Research in Memory and Cognition; Oct. 24, 2018.

[24] “Social Media and Fake News in the 2016 Election”; The Journal of Economic Perspectives; Spring 2017.

 

Required Reading
The Art of Thinking Clearly, by Rolf Dobelli

A compilation of some of the more common cognitive biases, the book provides bite-sized examples of common stumbling blocks on the way to making good decisions and filtering information better. Structured into 99 pithy chapters with titles like “Does Harvard Make You Smarter?” and “Don’t Accept Free Drinks,” the book relies on folksy anecdotes that are far more vivid and relatable than a parade of scientific studies.

Thinking, Fast and Slow, by Daniel Kahneman

As the title implies, there are times when a fast and effortless way of thinking is best and others when a deliberate use of logic can yield better results. But the challenge is that most people can’t keep proper perspective on which method is most appropriate. As Kahneman writes rather cuttingly, “Nothing in life is as important as you think it is, while you are thinking about it.” The 400-page book is thicker than Dobelli’s and filled with much more psychological rigor, in part because Kahneman is an academic, but it is still applicable to real-world situations, like why attractive people are deemed more competent in the workplace.

Predictably Irrational, by Dan Ariely

One of the pioneers of “behavioral economics,” Duke professor Dan Ariely focuses largely on the effect of our biases on how we behave as investors, savers, and spenders. For many years, economists relied on ideas such as supply and demand or market efficiency to explain the ups and downs of business cycles. But as this book helps illustrate, any economic theory is incomplete unless it accounts for the emotional and psychological hang-ups that make real-world money decisions so downright irrational.

Nudge, by Richard Thaler and Cass Sunstein

An interesting mashup of psychological insights compiled by an economist and a law professor, the book raises “serious questions about the rationality of many judgments and decisions that people make” and then aims to offer public policy solutions for the betterment of society. These ideas are meant to “nudge” individuals into making decisions that have long-term impact for large numbers of people, which is no easy task when we are hardwired to think about ourselves and to focus on the short term. For instance, simply enrolling employees in a retirement savings plan like a 401(k) by default rather than requiring workers to think beyond short-term desires and voluntarily sign up could drastically increase savings rates, the authors posit.

The Art of Choosing, by Sheena Iyengar

Decisions like whom to hire or where to live have huge influence over the direction of our lives. But sadly, some of us make bad decisions with lasting impact. This book explores how and why humans typically choose the things they do, even when it is against their best interests, thanks to a combination of biological and cultural influences. Iyengar’s research centers on business and economic issues, making the book less a self-help guide to making good day-to-day choices and more a treatise on the logic behind decision-making.
