Why Your Brain Falls for Misinformation – And How to Avoid It
This article is part of the magazine, "The Future of Science In America: The Election Issue," co-published by LeapsMag, the Aspen Institute Science & Society Program, and GOOD.
Whenever you hear something repeated, it feels more true. In other words, repetition makes any statement seem more accurate. So anything you hear again will resonate more each time it's said.
Do you see what I did there? Each of the three sentences above conveyed the same message. Yet each time you read the next sentence, it felt more and more true. Cognitive neuroscientists and behavioral economists like myself call this the "illusory truth effect."
Go back and recall your experience reading the first sentence. It probably felt strange and disconcerting, perhaps with a note of resistance, as in "I don't believe things more if they're repeated!"
Reading the second sentence did not inspire such a strong reaction. Your reaction to the third sentence was tame by comparison.
Why? Because of a phenomenon called "cognitive fluency," meaning how easily we process information. Much of our vulnerability to deception in all areas of life—including to fake news and misinformation—revolves around cognitive fluency in one way or another. And unfortunately, such misinformation can swing major elections.
The Lazy Brain
Our brains are lazy. The more effort it takes to process information, the more uncomfortable we feel about it and the more we dislike and distrust it.
By contrast, the more we like certain data and are comfortable with it, the more we feel that it's accurate. This intuitive feeling in our gut is what we use to judge what's true and false.
Yet no matter how often you've heard that you should trust your gut and follow your intuition, that advice is wrong. You should not trust your gut when evaluating information in areas where you don't have expert-level knowledge, at least when you don't want to screw up. Structured information gathering and decision-making processes help us avoid the numerous errors we make when we follow our intuition. And even experts can make serious errors when they don't rely on such decision aids.
These mistakes happen due to mental errors that scholars call "cognitive biases." The illusory truth effect is one of these mental blindspots; there are over 100 altogether. These mental blindspots impact all areas of our life, from health and politics to relationships and even shopping.
The Maladapted Brain
Why do we have so many cognitive biases? It turns out that our intuitive judgments—our gut reactions, our instincts, whatever you call them—aren't adapted for the modern environment. They evolved from the ancestral savanna environment, when we lived in small tribes of 15–150 people and spent our time hunting and foraging.
It's not a surprise, when you think about it. Evolution works on time scales of many thousands of years; our modern informational environment has been around for only a couple of decades, with the rise of the internet and social media.
Unfortunately, that means we're using brains adapted for the primitive conditions of hunting and foraging to judge information and make decisions in a very different world. In the ancestral environment, we had to make quick snap judgments in order to survive, thrive, and reproduce; we're the descendants of those who did so most effectively.
In the modern environment, we can take our time and make much better judgments by using structured evaluation processes to protect ourselves from cognitive biases. We have to train our minds to go against our intuitions if we want to figure out the truth and avoid falling for misinformation.
Yet it feels very counterintuitive to do so. Again, not a surprise: by definition, you have to go against your intuitions. It's not easy, but it's truly the only path if you don't want to be vulnerable to fake news.
The Danger of Cognitive Fluency and Illusory Truth
We already make plenty of mistakes by ourselves, without outside intervention. It's especially difficult to protect ourselves against those who know how to manipulate us. Unfortunately, the purveyors of misinformation excel at exploiting our cognitive biases to get us to buy into fake news.
Consider the illusory truth effect. Our vulnerability to it stems from how our brain processes novel stimuli. The first time we hear something new, it's difficult to process mentally. The new information has to integrate with our existing knowledge framework, and we have to build new neural pathways to make that happen. Doing so feels uncomfortable for our lazy brain, so the statement we just heard seems hard to swallow.
The next time we hear that same thing, our mind doesn't have to build new pathways. It just has to go down the same ones it built earlier. Granted, those pathways are little more than trails, newly laid down and barely used. It's hard to travel down that newly established neural path, but much easier than when your brain had to lay down that trail. As a result, the statement is somewhat easier to swallow.
Each repetition widens and deepens the trail. Each time you hear the same thing, it feels more true, comfortable, and intuitive.
Does it work for information that seems very unlikely? Science says yes! Researchers found that the illusory truth effect applies strongly to implausible as well as plausible statements.
What about if you know better? Surely prior knowledge prevents this illusory truth! Unfortunately not: even if you know better, research shows you're still vulnerable to this cognitive bias, though less than those who don't have prior knowledge.
Sadly, people who are predisposed to more elaborate and sophisticated thinking—likely you, if you're reading the article—are more likely to fall for the illusory truth effect. And guess what: more sophisticated thinkers are also likelier than less sophisticated ones to fall for the cognitive bias known as the bias blind spot, where you ignore your own cognitive biases. So if you think that cognitive biases such as the illusory truth effect don't apply to you, you're likely deluding yourself.
That's why the purveyors of misinformation rely on repeating the same thing over and over and over and over again. They know that despite fact-checking, their repetition will sway people, even some of those who think they're invulnerable. In fact, believing that you're invulnerable will make you more likely to fall for this and other cognitive biases, since you won't be taking the steps necessary to address them.
Other Important Cognitive Biases
What are some other cognitive biases you need to beware of? If you've heard of any cognitive biases, you've likely heard of the "confirmation bias." That refers to our tendency to look for and interpret information in ways that conform to our prior beliefs, intuitions, feelings, desires, and preferences, as opposed to the facts.
Again, cognitive fluency deserves blame. It's much easier to build neural pathways to information that we already possess, especially that around which we have strong emotions; it's much more difficult to break well-established neural pathways if we need to change our mind based on new information. Consequently, we instead look for information that's easy to accept, that which fits our prior beliefs. In turn, we ignore and even actively reject information that doesn't fit our beliefs.
Moreover, the more educated we are, the more likely we are to engage in such active rejection. After all, our smarts give us more ways of arguing against new information that counters our beliefs. That's why research demonstrates that the more educated you are, the more polarized your beliefs will be around scientific issues that have religious or political value overtones, such as stem cell research, human evolution, and climate change. Where might you be letting your smarts get in the way of the facts?
Our minds like to interpret the world through stories, meaning explanatory narratives that link cause and effect in a clear and simple manner. Such stories are a balm to our cognitive fluency, as our mind constantly looks for patterns that explain the world around us in an easy-to-process manner. That leads to the "narrative fallacy," where we fall for convincing-sounding narratives regardless of the facts, especially if the story fits our predispositions and our emotions.
You ever wonder why politicians tell so many stories? What about the advertisements you see on TV or video advertisements on websites, which tell very quick visual stories? How about salespeople or fundraisers? Sure, sometimes they cite statistics and scientific reports, but they spend much, much more time telling stories: simple, clear, compelling narratives that seem to make sense and tug at our heartstrings.
Now, here's something that's actually true: the world doesn't make sense. The world is not simple, clear, and compelling. The world is complex, confusing, and contradictory. Beware of simple stories! Look for complex, confusing, and contradictory scientific reports and high-quality statistics: they're much more likely to contain the truth than the easy-to-process stories.
Another big problem that comes from cognitive fluency: the "attentional bias." We pay the most attention to whatever we find most emotionally salient in our environment, as that's the information easiest for us to process. Most often, such stimuli are negative; we feel a lesser but real attentional bias to positive information.
That's why fear, anger, and resentment represent such powerful tools of misinformers. They know that people will focus on and feel more swayed by emotionally salient negative stimuli, so be suspicious of negative, emotionally laden data.
You should be especially wary of such information when it arrives in the form of stories framed to fit your preconceptions and repeated over and over. That's because cognitive biases build on top of each other. You need to learn about the ones most dangerous for evaluating reality clearly and making wise decisions, and watch out for them when you consume the news and in other areas of life where you don't want to make poor choices.
Fixing Our Brains
Unfortunately, knowledge only weakly protects us from cognitive biases; it's important, but far from sufficient, as the study I cited earlier on the illusory truth effect reveals.
What can we do?
The easiest decision aid is a personal commitment to twelve truth-oriented behaviors called the Pro-Truth Pledge, which you can make by signing the pledge at ProTruthPledge.org. All of these behaviors stem from cognitive neuroscience and behavioral economics research in the field called debiasing, which refers to counterintuitive, uncomfortable, but effective strategies to protect yourself from cognitive biases.
What are these behaviors? The first four relate to you being truthful yourself, under the category "share truth." They're the most important for avoiding falling for cognitive biases when you share information:
Share truth
- Verify: fact-check information to confirm it is true before accepting and sharing it
- Balance: share the whole truth, even if some aspects do not support my opinion
- Cite: share my sources so that others can verify my information
- Clarify: distinguish between my opinion and the facts
The second set of four are about how you can best "honor truth" to protect yourself from cognitive biases in discussions with others:
Honor truth
- Acknowledge: when others share true information, even when we disagree otherwise
- Reevaluate: if my information is challenged, retract it if I cannot verify it
- Defend: defend others when they come under attack for sharing true information, even when we disagree otherwise
- Align: align my opinions and my actions with true information
The last four, under the category "encourage truth," promote broader patterns of truth-telling in our society by providing incentives for truth-telling and disincentives for deception:
Encourage truth
- Fix: ask people to retract information that reliable sources have disproved even if they are my allies
- Educate: compassionately inform those around me to stop using unreliable sources even if these sources support my opinion
- Defer: recognize the opinions of experts as more likely to be accurate when the facts are disputed
- Celebrate: those who retract incorrect statements and update their beliefs toward the truth
Peer-reviewed research has shown that taking the Pro-Truth Pledge is effective for changing people's behavior to be more truthful, both in their own statements and in interactions with others. I hope you choose to join the many thousands of ordinary citizens—and over 1,000 politicians and officials—who committed to this decision aid, as opposed to going with their gut.
[Adapted from: Dr. Gleb Tsipursky and Tim Ward, Pro Truth: A Practical Plan for Putting Truth Back Into Politics (Changemakers Books, 2020).]
[Editor's Note: To read other articles in this special magazine issue, visit the beautifully designed e-reader version.]
This man spent over 70 years in an iron lung. What he was able to accomplish is amazing.
It's a sight we don't normally see these days: a man lying in a big metal tube with his head sticking out of one end. But it wasn't so long ago that this sight was unfortunately much more common.
In the first half of the 20th century, tens of thousands of people each year were infected by polio—a highly contagious virus that attacks nerves in the spinal cord and brainstem. Many people survived polio, but a small percentage were left permanently paralyzed by the virus and needed support to help them breathe. This support, known as an "iron lung," mechanically pulled air in and out of a person's lungs by changing the pressure inside the machine.
Paul Alexander was one of several thousand who were infected and paralyzed by polio in 1952. That year, a polio epidemic swept the United States, forcing businesses to close and polio wards in hospitals all over the country to fill up with sick children. When Paul caught polio in the summer of 1952, doctors urged his parents to let him rest and recover at home, since the hospital in his home suburb of Dallas, Texas was already overrun with polio patients.
Paul rested in bed for a few days with aching limbs and a fever. But his condition quickly got worse. Within a week, Paul could no longer speak or swallow, and his parents rushed him to the local hospital where the doctors performed an emergency procedure to help him breathe. Paul woke from the surgery three days later, and found himself unable to move and lying inside an iron lung in the polio ward, surrounded by rows of other paralyzed children.
Hospitals were commonly filled with polio patients who had been paralyzed by the virus before a vaccine became widely available in 1955. (Photo: Associated Press)
Paul struggled inside the polio ward for the next 18 months, bored and restless and needing to hold his breath when the nurses opened the iron lung to help him bathe. The doctors on the ward frequently told his parents that Paul was going to die. But against all odds, Paul lived. And with help from a physical therapist, Paul was able to thrive—sometimes for small periods outside the iron lung.
The way Paul did this was to practice glossopharyngeal breathing (or as Paul called it, “frog breathing”), where he would trap air in his mouth and force it down his throat and into his lungs by flattening his tongue. This breathing technique, taught to him by his physical therapist, would allow Paul to leave the iron lung for increasing periods of time.
With help from his iron lung (and for small periods of time without it), Paul managed to live a full, happy, and sometimes record-breaking life. At 21, Paul became the first person in Dallas, Texas, to graduate from high school without attending class in person, owing his success to memorization rather than note-taking. After high school, Paul received a scholarship to Southern Methodist University, pursued his dream of becoming a trial lawyer, and went on to successfully represent clients in court.
Paul Alexander, pictured here in his early 20s, mastered a breathing technique that allowed him to spend short amounts of time outside his iron lung. (Photo: Paul Alexander)
Paul practiced law in North Texas for more than 30 years, using a modified wheelchair that held his body upright. During his career, Paul even represented members of the biker gang Hells Angels—and became so close with them that he was named an honorary member. Throughout his long life, Paul was also able to fly on a plane, visit the beach, adopt a dog, fall in love, and write a memoir using a plastic stick to tap out a draft on a keyboard. In recent years, Paul joined TikTok and became a viral sensation with more than 330,000 followers. In one of his first videos, Paul advocated for vaccination and warned against another polio epidemic.
Paul was reportedly hospitalized with COVID-19 at the end of February and died on March 11, 2024. He holds the Guinness World Record for the longest survival inside an iron lung—71 years.
Polio thankfully no longer circulates in the United States, or in most of the world, thanks to vaccines. But Paul continues to serve as a reminder of the importance of vaccination—and the power of the human spirit.
"I've got some big dreams. I'm not going to accept from anybody their limitations," he said in a 2022 interview with CNN. "My life is incredible."
When doctors couldn’t stop her daughter’s seizures, this mom earned a PhD and found a treatment herself.
Twenty-eight years ago, Tracy Dixon-Salazar woke to the sound of her daughter, two-year-old Savannah, in the midst of a medical emergency.
“I entered [Savannah’s room] to see her tiny little body jerking about violently in her bed,” Tracy said in an interview. “I thought she was choking.” When she and her husband frantically called 911, the paramedic told them it was likely that Savannah had had a seizure—a term neither Tracy nor her husband had ever heard before.
Over the next several years, Savannah’s seizures continued and worsened. By age five Savannah was having seizures dozens of times each day, and her parents noticed significant developmental delays. Savannah was unable to use the restroom and functioned more like a toddler than a five-year-old.
Doctors were mystified: Tracy and her husband had no family history of seizures, and there was no event—such as an injury or infection—that could have caused them. Doctors were also confused as to why Savannah's seizures kept happening so frequently despite treatment with a variety of anti-seizure medications.
Doctors eventually diagnosed Savannah with Lennox-Gastaut Syndrome, or LGS, an epilepsy disorder with no cure and a poor prognosis. People with LGS are often resistant to several kinds of anti-seizure medications, and often suffer from developmental delays and behavioral problems. People with LGS also have a higher chance of injury as well as a higher chance of sudden unexpected death in epilepsy (SUDEP) due to the frequent seizures. In about 70 percent of cases, LGS has an identifiable cause such as a brain injury or genetic syndrome. In about 30 percent of cases, however, the cause is unknown.
Watching her daughter struggle through repeated seizures was devastating to Tracy and the rest of the family.
“This disease, it comes into your life. It’s uninvited. It’s unannounced and it takes over every aspect of your daily life,” said Tracy in an interview with Today.com. “Plus it’s attacking the thing that is most precious to you—your kid.”
Desperate to find some answers, Tracy began combing the medical literature for information about epilepsy and LGS. She enrolled in college courses to better understand the papers she was reading.
"Ironically, I thought I needed to go to college to take English classes to understand these papers—but soon learned it wasn't English classes I needed, it was science," Tracy said. When she took her first college science course, Tracy says, she "fell in love with the subject."
Tracy was now both a caregiver to Savannah, who continued to have hundreds of seizures a month, and a full-time student, studying late into the night and while her kids were at school, using classwork as "an outlet for the pain."
“I couldn’t help my daughter,” Tracy said. “Studying was something I could do.”
Twelve years later, Tracy had earned a PhD in neurobiology.
After her post-doctoral training, Tracy started working at a lab that explored the genetics of epilepsy. Savannah’s doctors hadn’t found a genetic cause for her seizures, so Tracy decided to sequence her genome again to check for other abnormalities—and what she found was life-changing.
Tracy discovered that Savannah had a calcium channel mutation, meaning that too much calcium was passing through Savannah's neural pathways, leading to seizures. The information made sense to Tracy: anti-seizure medications often leach calcium from a person's bones. When doctors had prescribed Savannah calcium supplements in the past to counteract these effects, her seizures had gotten worse every time she took them. Tracy took her discovery to Savannah's doctor, who agreed to prescribe her a calcium blocker.
The change in Savannah was almost immediate.
Within two weeks, Savannah’s seizures had decreased by 95 percent. Once on a daily seven-drug regimen, she was soon weaned to just four, and then three. Amazingly, Tracy started to notice changes in Savannah’s personality and development, too.
“She just exploded in her personality and her talking and her walking and her potty training and oh my gosh she is just so sassy,” Tracy said in an interview.
Since starting the calcium blocker eleven years ago, Savannah has continued to make enormous strides. Though still unable to read or write, Savannah enjoys puzzles and social media. She's "obsessed" with boys, says Tracy. And while Tracy suspects she'll never be able to live independently, she and her daughter can now share more "normal" moments—something she never anticipated at the start of Savannah's journey with LGS. While Tracy was preparing for an event, Savannah helped her get ready.
“We picked out a dress and it was the first time in our lives that we did something normal as a mother and a daughter,” she said. “It was pretty cool.”