Why Your Brain Falls for Misinformation – And How to Avoid It
This article is part of the magazine, "The Future of Science In America: The Election Issue," co-published by LeapsMag, the Aspen Institute Science & Society Program, and GOOD.
Whenever you hear something repeated, it feels more true. In other words, repetition makes any statement seem more accurate. So anything you hear again will resonate more each time it's said.
Do you see what I did there? Each of the three sentences above conveyed the same message. Yet each time you read the next sentence, it felt more and more true. Cognitive neuroscientists and behavioral economists like myself call this the "illusory truth effect."
Go back and recall your experience reading the first sentence. It probably felt strange and disconcerting, perhaps with a note of resistance, as in "I don't believe things more if they're repeated!"
Reading the second sentence did not inspire such a strong reaction. Your reaction to the third sentence was tame by comparison.
Why? Because of a phenomenon called "cognitive fluency," meaning how easily we process information. Much of our vulnerability to deception in all areas of life—including to fake news and misinformation—revolves around cognitive fluency in one way or another. And unfortunately, such misinformation can swing major elections.
The Lazy Brain
Our brains are lazy. The more effort it takes to process information, the more uncomfortable we feel about it and the more we dislike and distrust it.
By contrast, the more we like certain data and are comfortable with it, the more we feel that it's accurate. This intuitive feeling in our gut is what we use to judge what's true and false.
Yet no matter how often you've heard that you should trust your gut and follow your intuition, that advice is wrong. You should not trust your gut when evaluating information in areas where you lack expert-level knowledge, at least if you don't want to screw up. Structured information-gathering and decision-making processes help us avoid the numerous errors we make when we follow our intuition. And even experts can make serious errors when they don't rely on such decision aids.
These mistakes happen due to mental errors that scholars call "cognitive biases." The illusory truth effect is one of these mental blindspots; there are over 100 altogether. These mental blindspots impact all areas of our life, from health and politics to relationships and even shopping.
The Maladapted Brain
Why do we have so many cognitive biases? It turns out that our intuitive judgments—our gut reactions, our instincts, whatever you call them—aren't adapted for the modern environment. They evolved in the ancestral savanna environment, when we lived in small tribes of 15–150 people and spent our time hunting and foraging.
It's not a surprise, when you think about it. Evolution works on time scales of many thousands of years; our modern informational environment has been around for only a couple of decades, with the rise of the internet and social media.
Unfortunately, that means we're using brains adapted for the primitive conditions of hunting and foraging to judge information and make decisions in a very different world. In the ancestral environment, we had to make quick snap judgments in order to survive, thrive, and reproduce; we're the descendants of those who did so most effectively.
In the modern environment, we can take our time and make much better judgments by using structured evaluation processes that protect us from cognitive biases. We have to train our minds to go against our intuitions if we want to figure out the truth and avoid falling for misinformation.
Yet it feels very counterintuitive to do so. Again, not a surprise: by definition, you have to go against your intuitions. It's not easy, but it's truly the only path if you don't want to be vulnerable to fake news.
The Danger of Cognitive Fluency and Illusory Truth
We already make plenty of mistakes by ourselves, without outside intervention. It's especially difficult to protect ourselves against those who know how to manipulate us. Unfortunately, the purveyors of misinformation excel at exploiting our cognitive biases to get us to buy into fake news.
Consider the illusory truth effect. Our vulnerability to it stems from how our brain processes novel stimuli. The first time we hear something new, it's difficult to process mentally. The new information has to integrate with our existing knowledge, and our brain has to build new neural pathways to make that happen. Doing so feels uncomfortable for our lazy brain, so the statement we heard seems hard to swallow.
The next time we hear that same thing, our mind doesn't have to build new pathways; it just travels down the ones it built earlier. Granted, those pathways are little more than trails, newly laid down and barely used. Traveling down that newly established neural path is still hard, but much easier than blazing the trail in the first place. As a result, the statement is somewhat easier to swallow.
Each repetition widens and deepens the trail. Each time you hear the same thing, it feels more true, comfortable, and intuitive.
Does it work for information that seems very unlikely? Science says yes! Researchers found that the illusory truth effect applies strongly to implausible as well as plausible statements.
What if you know better? Surely prior knowledge prevents the illusory truth effect! Unfortunately not: research shows that even if you know better, you're still vulnerable to this cognitive bias, though less so than those without prior knowledge.
Sadly, people who are predisposed to more elaborate and sophisticated thinking—likely you, if you're reading the article—are more likely to fall for the illusory truth effect. And guess what: more sophisticated thinkers are also likelier than less sophisticated ones to fall for the cognitive bias known as the bias blind spot, where you ignore your own cognitive biases. So if you think that cognitive biases such as the illusory truth effect don't apply to you, you're likely deluding yourself.
That's why the purveyors of misinformation rely on repeating the same thing over and over and over and over again. They know that despite fact-checking, their repetition will sway people, even some of those who think they're invulnerable. In fact, believing that you're invulnerable will make you more likely to fall for this and other cognitive biases, since you won't be taking the steps necessary to address them.
Other Important Cognitive Biases
What are some other cognitive biases you need to beware of? If you've heard of any cognitive biases, you've likely heard of the "confirmation bias." That refers to our tendency to look for and interpret information in ways that conform to our prior beliefs, intuitions, feelings, desires, and preferences, as opposed to the facts.
Again, cognitive fluency deserves the blame. It's much easier to build neural pathways to information that we already possess, especially information we have strong emotions about; it's much more difficult to break well-established neural pathways when we need to change our minds based on new information. Consequently, we look for information that's easy to accept, the kind that fits our prior beliefs. In turn, we ignore and even actively reject information that doesn't fit our beliefs.
Moreover, the more educated we are, the more likely we are to engage in such active rejection. After all, our smarts give us more ways of arguing against new information that counters our beliefs. That's why research demonstrates that the more educated you are, the more polarized your beliefs will be around scientific issues that have religious or political value overtones, such as stem cell research, human evolution, and climate change. Where might you be letting your smarts get in the way of the facts?
Our minds like to interpret the world through stories, meaning explanatory narratives that link cause and effect in a clear and simple manner. Such stories are a balm to our cognitive fluency, as our mind constantly looks for patterns that explain the world around us in an easy-to-process manner. That leads to the "narrative fallacy," where we fall for convincing-sounding narratives regardless of the facts, especially if the story fits our predispositions and our emotions.
You ever wonder why politicians tell so many stories? What about the advertisements you see on TV or video advertisements on websites, which tell very quick visual stories? How about salespeople or fundraisers? Sure, sometimes they cite statistics and scientific reports, but they spend much, much more time telling stories: simple, clear, compelling narratives that seem to make sense and tug at our heartstrings.
Now, here's something that's actually true: the world doesn't make sense. The world is not simple, clear, and compelling. The world is complex, confusing, and contradictory. Beware of simple stories! Look for complex, confusing, and contradictory scientific reports and high-quality statistics: they're much more likely to contain the truth than the easy-to-process stories.
Another big problem that comes from cognitive fluency: the "attentional bias." We pay the most attention to whatever we find most emotionally salient in our environment, as that's the information easiest for us to process. Most often, such stimuli are negative; we feel a lesser but real attentional bias to positive information.
That's why fear, anger, and resentment represent such powerful tools of misinformers. They know that people will focus on and feel more swayed by emotionally salient negative stimuli, so be suspicious of negative, emotionally laden data.
You should be especially wary of such information when it comes in the form of stories framed to fit your preconceptions and repeated over and over, because cognitive biases build on top of each other. Learn about the ones most dangerous to evaluating reality clearly and making wise decisions, and watch out for them when you consume news and in other areas of life where you don't want to make poor choices.
Fixing Our Brains
Unfortunately, knowledge only weakly protects us from cognitive biases; it's important, but far from sufficient, as the study I cited earlier on the illusory truth effect reveals.
What can we do?
The easiest decision aid is a personal commitment to twelve truth-oriented behaviors called the Pro-Truth Pledge, which you can make by signing the pledge at ProTruthPledge.org. All of these behaviors stem from cognitive neuroscience and behavioral economics research in the field called debiasing, which refers to counterintuitive, uncomfortable, but effective strategies to protect yourself from cognitive biases.
What are these behaviors? The first four, under the category "share truth," relate to being truthful yourself. They're the most important for avoiding falling for cognitive biases when you share information:
Share truth
- Verify: fact-check information to confirm it is true before accepting and sharing it
- Balance: share the whole truth, even if some aspects do not support my opinion
- Cite: share my sources so that others can verify my information
- Clarify: distinguish between my opinion and the facts
The second set of four are about how you can best "honor truth" to protect yourself from cognitive biases in discussions with others:
Honor truth
- Acknowledge: acknowledge when others share true information, even when we disagree otherwise
- Reevaluate: if my information is challenged, retract it if I cannot verify it
- Defend: defend others when they come under attack for sharing true information, even when we disagree otherwise
- Align: align my opinions and my actions with true information
The last four, under the category "encourage truth," promote broader patterns of truth-telling in our society by providing incentives for truth-telling and disincentives for deception:
Encourage truth
- Fix: ask people to retract information that reliable sources have disproved even if they are my allies
- Educate: compassionately inform those around me to stop using unreliable sources even if these sources support my opinion
- Defer: recognize the opinions of experts as more likely to be accurate when the facts are disputed
- Celebrate: celebrate those who retract incorrect statements and update their beliefs toward the truth
Peer-reviewed research has shown that taking the Pro-Truth Pledge is effective for changing people's behavior to be more truthful, both in their own statements and in interactions with others. I hope you choose to join the many thousands of ordinary citizens—and over 1,000 politicians and officials—who committed to this decision aid, as opposed to going with their gut.
[Adapted from: Dr. Gleb Tsipursky and Tim Ward, Pro Truth: A Practical Plan for Putting Truth Back Into Politics (Changemakers Books, 2020).]
Two Conservative Icons Gave Opposite Advice on COVID-19. Those Misinformed Died in Higher Numbers, New Study Reports.
The news sources that you consume can kill you - or save you. That's the fundamental insight of a powerful new study about the impact of watching either Sean Hannity's news show Hannity or Tucker Carlson's Tucker Carlson Tonight. One saved lives and the other resulted in more deaths, due to how each host covered COVID-19.
This research illustrates the danger of falling for health-related misinformation due to judgment errors known as cognitive biases. These dangerous mental blind spots stem from the fact that our gut reactions evolved for the ancient savanna environment, not the modern world; yet the vast majority of decision-making advice is to "go with your gut," despite the many disastrous outcomes that result. These mental blind spots impact all areas of our life, from health to politics and even shopping, as a survey by a comparison-shopping website reveals. We need to be wary of cognitive biases in order to survive and thrive during this pandemic.
Sean Hannity vs. Tucker Carlson Coverage of COVID-19
Hannity and Tucker Carlson Tonight are the top two U.S. cable news shows, both on Fox News. Hannity and Carlson share very similar ideological profiles and have similar viewership demographics: older adults who lean conservative.
One notable difference, however, relates to how both approached coverage of COVID-19, especially in February and early March 2020. Researchers at the Becker Friedman Institute for Economics at the University of Chicago decided to study the health consequences of this difference.
Carlson took the threat of COVID-19 seriously early on, more so than most media figures on the right or left. Already on January 28, way earlier than most, Carlson spent a significant part of his show highlighting the serious dangers of a global pandemic. He continued his warnings throughout February. On February 25, Carlson told his viewers: "In this country, more than a million would die."
By contrast, Hannity was one of the Fox News hosts who took a more extreme position in downplaying COVID-19, frequently comparing it to the flu. On February 27, he said "And today, thankfully, zero people in the United States of America have died from the coronavirus. Zero. Now, let's put this in perspective. In 2017, 61,000 people in this country died from influenza, the flu. Common flu." Moreover, Hannity explicitly politicized COVID-19, claiming that "[Democrats] are now using the natural fear of a virus as a political weapon. And we have all the evidence to prove it, a shameful politicizing, weaponizing of, yes, the coronavirus."
However, after President Donald Trump declared COVID-19 a national emergency in mid-March, Hannity -- and other Fox News hosts -- changed their tune to align more with Carlson's, acknowledging the serious dangers of the virus.
The Behavior and Health Consequences
The Becker Friedman Institute researchers investigated whether the difference in coverage impacted behaviors. They conducted a nationally representative survey of over 1,000 people who watch Fox News at least once a week, evaluating both viewership and behavior changes in response to the pandemic, such as social distancing and improving hygiene.
Next, the study compared people's behavior changes to their viewing patterns. The researchers found that "viewers of Hannity changed their behavior five days later than viewers of other shows, while viewers of Tucker Carlson Tonight changed their behavior three days earlier than viewers of other shows." The difference was statistically significant; in other words, the probability that it arose by chance was negligible.
Did these behavior changes lead to grave consequences? Indeed.
The paper compared the popularity of each show in specific counties to data on COVID-19 infections and deaths. Controlling for a wide variety of potential confounding variables, the study found that areas of the country where Hannity is more popular had more cases and deaths two weeks later, the time that it would take for the virus to start manifesting itself. By March 21st, the researchers found, there were 11 percent more deaths among Hannity's viewership than among Carlson's, again with a high degree of statistical significance.
The study's authors concluded: "Our findings indicate that provision of misinformation in the early stages of a pandemic can have important consequences for health outcomes."
Cognitive Biases and COVID-19 Misinformation
It's critically important to recognize that the study's authors did not seek to score any ideological points, given the broadly similar ideological profiles of the two hosts. The researchers simply explored the impact of accurate and inaccurate information about COVID-19 on the viewership. Clearly, the false information had deadly consequences.
Such outcomes stem from the excessive trust that our minds tend to give those we see as having authority, even if they don't possess expertise in the relevant subject area, such as the media figures we follow. This excessive trust, and the consequent obedience, is called the "authority bias."
A related mental pattern is called "emotional contagion," in which we are unwittingly infected with the emotions of those we see as leaders. Emotions can motivate action even in the absence of formal authority, and are particularly important for those with informal authority, including thought leaders like Carlson and Hannity.
Thus, when Hannity told his audience that Democrats were using anxiety about the virus as a political weapon, he led them to reject fears of COVID-19, even though those fears, and the behavioral changes they prompt, were the right response. Carlson's emphasis on the deadly nature of this illness motivated his audience to take appropriate precautions.
Authority bias and emotional contagion facilitate the spread of misinformation and its dangers, at least when we don't take the steps necessary to figure out the facts. Such steps can range from following best fact-checking practices to getting your information from news sources that commit publicly to being held accountable for truthfulness. Remember, the more important and impactful such information may be for your life, the more important it is to take the time to evaluate it accurately to help you make the best decisions.
Today's growing distrust of science is not an academic problem. It can be a matter of life and death.
Take, for example, the tragic incident in 2016 when at least 10 U.S. children died and over 400 were sickened after being given homeopathic teething medicine laced with a poisonous herb called "deadly nightshade." Carried by CVS, Walgreens, and other major American pharmacies, the pills contained this poison based on the alternative-medicine principle of homeopathy: treating medical conditions with tiny doses of natural substances that produce symptoms of the disease.
Such "alternative medicines" take advantage of the lack of government regulation and people's increasing hostility toward science.
These children did not have to die. Numerous research studies show that homeopathy does not work. Despite this research, homeopathy is a quickly-growing multi-billion dollar business.
Such "alternative medicines" take advantage of the lack of government regulation and people's increasing hostility toward science. Polling shows that the number of people who believe that science has "made life more difficult" increased by 50 percent from 2009 to 2015. According to a 2017 survey, only 35 percent of respondents have "a lot" of trust in scientists; the number of people who do "not at all" trust scientists increased by over 50 percent from a similar poll conducted in December 2013.
Children dying from deadly nightshade is only one consequence of this crisis of trust. For another example, consider the false claim that vaccines cause autism. This belief has spread widely across the US, and led to a host of problems. For instance, measles was practically eliminated in the US by 2000. However, in recent years outbreaks of measles have been on the rise, driven by parents failing to vaccinate their children in a number of communities.
The Internet Is for… Misinformation
The rise of the Internet, and more recently social media, is key to explaining the declining public confidence in science.
Before the Internet, the information accessible to the general public about any given topic usually came from experts. For instance, researchers on autism were invited to talk on mainstream media, they wrote encyclopedia articles, and they authored books distributed by large publishers.
The Internet has enabled anyone to be a publisher of content, connecting people around the world with any and all sources of information. On the one hand, this freedom is empowering and liberating, with Wikipedia a great example of a highly-curated and accurate source on the vast majority of subjects. On the other, anyone can publish a blog piece making false claims about links between vaccines and autism or the effectiveness of homeopathic medicine. If they are skilled at search engine optimization, or have money to invest in advertising, they can get their message spread widely. Russia has done so extensively to influence elections outside of its borders, whether in the E.U. or the U.S.
Unfortunately, research shows that people lack the skills to differentiate misinformation from true information. This lack of skills has clear real-world effects: U.S. adults believed 75 percent of the fake news stories they saw about the 2016 U.S. presidential election. The more often someone sees a piece of misinformation, the more likely they are to believe it.
Blogs with falsehoods are bad enough, but the rise of social media has made the situation even worse. Most people re-share news stories without reading the actual article, judging the quality of the story by the headline and image alone. No wonder research has indicated that misinformation spreads as much as 10 times faster and further on social media than true information. After all, creators of fake news are free to devise the most appealing headline and image, while credible sources of information have to stick to factual headlines and images.
To make matters worse, we all suffer from a series of thinking errors such as the confirmation bias, our tendency to look for and interpret information in ways that conform to our intuitions and preferences, as opposed to the facts. Our inherent thinking errors, combined with the amplifying power of the Internet, have caused the prevalence of misinformation to explode.
So it's no wonder we see troubling gaps between what scientists and the public believe about issues like climate change, evolution, genetically modified organisms, and vaccination.
What Can We Do?
Fortunately, there are proactive steps we can take to address the crisis of trust in science and academia. The Pro-Truth Pledge, founded by a group of behavioral science experts (including myself) and concerned citizens, calls on public figures, organizations, and private citizens to commit to 12 behaviors listed on the pledge website that research in behavioral science shows correlate with truthfulness.
Signers are held accountable through a crowdsourced reporting and evaluation mechanism while getting reputational rewards because of their commitment. The scientific consensus serves as a key measure of credibility, and the pledge encourages pledge-takers to recognize the opinions of experts - especially scientists - as more likely to be true when the facts are disputed.
The pledge "really does seem to change one's habits," encouraging signers to have attitudes "of honesty and moral sincerity."
Launched in December 2016, the pledge has gained surprising traction. Over 6,200 private citizens have taken the pledge. So have more than 500 politicians, including members of US state legislatures Eric Nelson (PA), James White (TX), and Ogden Driskell (WY), and national politicians such as members of U.S. Congress Beto O'Rourke (TX), Matt Cartwright (PA), and Marcia Fudge (OH). Over 700 other public figures, such as globally known public intellectuals Peter Singer, Steven Pinker, Michael Shermer, and Jonathan Haidt, have taken the pledge, as well as 70 organizations such as Media Bias/Fact Check, Fugitive Watch, Earth Organization for Sustainability, and One America Movement.
The pledge is effective in changing behavior. Michael Smith, a candidate for Congress, took the Pro-Truth Pledge. He later posted on his Facebook wall a screenshot of a tweet by Donald Trump criticizing minority and disabled children. After being called out because the tweet appeared to be fake, he searched Trump's feed. He could not find the original tweet, and while Trump may have deleted it, the candidate edited his own Facebook post to say, "Due to a Truth Pledge I have taken, I have to say I have not been able to verify this post." He indicated that he would be more careful with future postings.
U.S. Army veteran and pledge-taker John Kirbow described how the pledge "really does seem to change one's habits," helping push him both to correct his own mistakes with an "attitude of humility and skepticism, and of honesty and moral sincerity," and also to encourage "friends and peers to do so as well."
His experience is confirmed by research on the pledge. Two research studies at Ohio State University demonstrated the pledge's effectiveness in changing pledge-takers' behavior to be more truthful, with strong statistical significance.
Taking the pledge yourself, and encouraging people you know and your elected representatives to do the same, is an easy and effective way to fight misinformation and to promote a culture that values the truth.