Why Your Brain Falls for Misinformation – And How to Avoid It
This article is part of the magazine, "The Future of Science In America: The Election Issue," co-published by LeapsMag, the Aspen Institute Science & Society Program, and GOOD.
Whenever you hear something repeated, it feels more true. In other words, repetition makes any statement seem more accurate. So anything you hear again will resonate more each time it's said.
Do you see what I did there? Each of the three sentences above conveyed the same message. Yet each time you read the next sentence, it felt more and more true. Cognitive neuroscientists and behavioral economists like myself call this the "illusory truth effect."
Go back and recall your experience reading the first sentence. It probably felt strange and disconcerting, perhaps with a note of resistance, as in "I don't believe things more if they're repeated!"
Reading the second sentence did not inspire such a strong reaction. Your reaction to the third sentence was tame by comparison.
Why? Because of a phenomenon called "cognitive fluency," meaning how easily we process information. Much of our vulnerability to deception in all areas of life—including to fake news and misinformation—revolves around cognitive fluency in one way or another. And unfortunately, such misinformation can swing major elections.
The Lazy Brain
Our brains are lazy. The more effort it takes to process information, the more uncomfortable we feel about it and the more we dislike and distrust it.
By contrast, the more we like certain data and are comfortable with it, the more we feel that it's accurate. This intuitive feeling in our gut is what we use to judge what's true and false.
Yet no matter how often you've heard that you should trust your gut and follow your intuition, that advice is wrong. You should not trust your gut when evaluating information in areas where you lack expert-level knowledge, at least not if you don't want to screw up. Structured information-gathering and decision-making processes help us avoid the numerous errors we make when we follow our intuition. And even experts can make serious errors when they don't rely on such decision aids.
These mistakes happen due to mental errors that scholars call "cognitive biases." The illusory truth effect is one of these mental blindspots; there are over 100 altogether. These mental blindspots impact all areas of our life, from health and politics to relationships and even shopping.
The Maladapted Brain
Why do we have so many cognitive biases? It turns out that our intuitive judgments—our gut reactions, our instincts, whatever you call them—aren't adapted for the modern environment. They evolved from the ancestral savanna environment, when we lived in small tribes of 15–150 people and spent our time hunting and foraging.
It's not a surprise, when you think about it. Evolution works on time scales of many thousands of years; our modern informational environment has been around for only a couple of decades, with the rise of the internet and social media.
Unfortunately, that means we're using brains adapted for the primitive conditions of hunting and foraging to judge information and make decisions in a very different world. In the ancestral environment, we had to make quick snap judgments in order to survive, thrive, and reproduce; we're the descendants of those who did so most effectively.
In the modern environment, we can take our time and make much better judgments by using structured evaluation processes to protect ourselves from cognitive biases. We have to train our minds to go against our intuitions if we want to figure out the truth and avoid falling for misinformation.
Yet it feels very counterintuitive to do so. Again, not a surprise: by definition, you have to go against your intuitions. It's not easy, but it's truly the only path if you don't want to be vulnerable to fake news.
The Danger of Cognitive Fluency and Illusory Truth
We already make plenty of mistakes by ourselves, without outside intervention. It's especially difficult to protect ourselves against those who know how to manipulate us. Unfortunately, the purveyors of misinformation excel at exploiting our cognitive biases to get us to buy into fake news.
Consider the illusory truth effect. Our vulnerability to it stems from how our brain processes novel stimuli. The first time we hear something new, it's difficult to process mentally. The new information has to integrate with our existing knowledge framework, and we have to build new neural pathways to make that happen. Doing so feels uncomfortable for our lazy brain, so the statement we just heard seems hard to swallow.
The next time we hear that same thing, our mind doesn't have to build new pathways. It just has to go down the ones it built earlier. Granted, those pathways are little more than trails, newly laid down and barely used. Traveling down that newly established neural path is still hard, but much easier than laying the trail in the first place. As a result, the statement is somewhat easier to swallow.
Each repetition widens and deepens the trail. Each time you hear the same thing, it feels more true, comfortable, and intuitive.
Does it work for information that seems very unlikely? Science says yes! Researchers found that the illusory truth effect applies strongly to implausible as well as plausible statements.
What if you know better? Surely prior knowledge prevents this illusory truth! Unfortunately not: even if you know better, research shows you're still vulnerable to this cognitive bias, though less so than those who don't have prior knowledge.
Sadly, people who are predisposed to more elaborate and sophisticated thinking—likely you, if you're reading the article—are more likely to fall for the illusory truth effect. And guess what: more sophisticated thinkers are also likelier than less sophisticated ones to fall for the cognitive bias known as the bias blind spot, where you ignore your own cognitive biases. So if you think that cognitive biases such as the illusory truth effect don't apply to you, you're likely deluding yourself.
That's why the purveyors of misinformation rely on repeating the same thing over and over and over and over again. They know that despite fact-checking, their repetition will sway people, even some of those who think they're invulnerable. In fact, believing that you're invulnerable will make you more likely to fall for this and other cognitive biases, since you won't be taking the steps necessary to address them.
Other Important Cognitive Biases
What are some other cognitive biases you need to beware of? If you've heard of any cognitive biases, you've likely heard of the "confirmation bias." That refers to our tendency to look for and interpret information in ways that conform to our prior beliefs, intuitions, feelings, desires, and preferences, as opposed to the facts.
Again, cognitive fluency deserves blame. It's much easier to build neural pathways to information that we already possess, especially information we have strong emotions about; it's much more difficult to break well-established neural pathways when we need to change our mind based on new information. Consequently, we look instead for information that's easy to accept: information that fits our prior beliefs. In turn, we ignore and even actively reject information that doesn't fit our beliefs.
Moreover, the more educated we are, the more likely we are to engage in such active rejection. After all, our smarts give us more ways of arguing against new information that counters our beliefs. That's why research demonstrates that the more educated you are, the more polarized your beliefs will be around scientific issues that have religious or political value overtones, such as stem cell research, human evolution, and climate change. Where might you be letting your smarts get in the way of the facts?
Our minds like to interpret the world through stories, meaning explanatory narratives that link cause and effect in a clear and simple manner. Such stories are a balm to our cognitive fluency, as our mind constantly looks for patterns that explain the world around us in an easy-to-process manner. That leads to the "narrative fallacy," where we fall for convincing-sounding narratives regardless of the facts, especially if the story fits our predispositions and our emotions.
You ever wonder why politicians tell so many stories? What about the advertisements you see on TV or video advertisements on websites, which tell very quick visual stories? How about salespeople or fundraisers? Sure, sometimes they cite statistics and scientific reports, but they spend much, much more time telling stories: simple, clear, compelling narratives that seem to make sense and tug at our heartstrings.
Now, here's something that's actually true: the world doesn't make sense. The world is not simple, clear, and compelling. The world is complex, confusing, and contradictory. Beware of simple stories! Look for complex, confusing, and contradictory scientific reports and high-quality statistics: they're much more likely to contain the truth than the easy-to-process stories.
Another big problem that comes from cognitive fluency: the "attentional bias." We pay the most attention to whatever we find most emotionally salient in our environment, as that's the information easiest for us to process. Most often, such stimuli are negative; we feel a lesser but real attentional bias to positive information.
That's why fear, anger, and resentment represent such powerful tools of misinformers. They know that people will focus on and feel more swayed by emotionally salient negative stimuli, so be suspicious of negative, emotionally laden data.
You should be especially wary of such information when it arrives as stories framed to fit your preconceptions and repeated again and again. That's because cognitive biases build on top of each other. To evaluate reality clearly and make wise decisions, you need to learn about the most dangerous biases and watch out for them when you consume news, as well as in other areas of life where you don't want to make poor choices.
Fixing Our Brains
Unfortunately, knowledge only weakly protects us from cognitive biases; it's important, but far from sufficient, as the study I cited earlier on the illusory truth effect reveals.
What can we do?
The easiest decision aid is a personal commitment to twelve truth-oriented behaviors called the Pro-Truth Pledge, which you can make by signing the pledge at ProTruthPledge.org. All of these behaviors stem from cognitive neuroscience and behavioral economics research in the field called debiasing, which refers to counterintuitive, uncomfortable, but effective strategies to protect yourself from cognitive biases.
What are these behaviors? The first four relate to you being truthful yourself, under the category "share truth." They're the most important for avoiding falling for cognitive biases when you share information:
Share truth
- Verify: fact-check information to confirm it is true before accepting and sharing it
- Balance: share the whole truth, even if some aspects do not support my opinion
- Cite: share my sources so that others can verify my information
- Clarify: distinguish between my opinion and the facts
The second set of four are about how you can best "honor truth" to protect yourself from cognitive biases in discussions with others:
Honor truth
- Acknowledge: recognize when others share true information, even when we disagree otherwise
- Reevaluate: if my information is challenged, retract it if I cannot verify it
- Defend: defend others when they come under attack for sharing true information, even when we disagree otherwise
- Align: align my opinions and my actions with true information
The last four, under the category "encourage truth," promote broader patterns of truth-telling in our society by providing incentives for truth-telling and disincentives for deception:
Encourage truth
- Fix: ask people to retract information that reliable sources have disproved even if they are my allies
- Educate: compassionately inform those around me to stop using unreliable sources even if these sources support my opinion
- Defer: recognize the opinions of experts as more likely to be accurate when the facts are disputed
- Celebrate: praise those who retract incorrect statements and update their beliefs toward the truth
Peer-reviewed research has shown that taking the Pro-Truth Pledge is effective for changing people's behavior to be more truthful, both in their own statements and in interactions with others. I hope you choose to join the many thousands of ordinary citizens—and over 1,000 politicians and officials—who committed to this decision aid, as opposed to going with their gut.
[Adapted from: Dr. Gleb Tsipursky and Tim Ward, Pro Truth: A Practical Plan for Putting Truth Back Into Politics (Changemakers Books, 2020).]
[Editor's Note: To read other articles in this special magazine issue, visit the beautifully designed e-reader version.]
Indigenous wisdom plus honeypot ants could provide new antibiotics
For generations, the Indigenous Tjupan people of Australia enjoyed the sweet treat of honey made by honeypot ants. As a favorite pastime, entire families would go searching for the underground colonies, first spotting a worker ant and then tracing it to its home. The ants, which belong to the species Camponotus inflatus, usually build their subterranean homes near mulga trees, Acacia aneura. Having traced an ant to its tree, the women would carefully dig a pit next to the colony, cautious not to destroy the entire structure. Once the ant chambers were exposed, the women would harvest a small amount to avoid devastating the colony’s stocks, and the family would share the treat.
The Tjupan people also knew that the honey had antimicrobial properties. “You could use it for a sore throat,” says Danny Ulrich, a member of the Tjupan nation. “You could also use it topically, on cuts and things like that.”
These hunts have become rarer, as many of the Tjupan people have moved away, and until recently the exact antimicrobial properties of the ant honey remained unknown. Then scientists Andrew Dong and Kenya Fernandes from the University of Sydney joined Ulrich, who runs the Honeypot Ants tours in Kalgoorlie, a city in Western Australia, on a honey-gathering expedition. Afterwards, they ran a series of experiments analyzing the honey’s antimicrobial activity and confirmed that the Indigenous wisdom was true. The honey was effective against Staphylococcus aureus, a common pathogen responsible for sore throats, skin infections like boils and sores, and also sepsis, which can result in death. Moreover, the honey also worked against two species of fungi, Cryptococcus and Aspergillus, which can be pathogenic to humans, especially those with suppressed immune systems.
In the era of growing antibiotic resistance and the rising threat of pathogenic fungi, these findings may help scientists identify and make new antimicrobial compounds. “Natural products have been honed over thousands and millions of years by nature and evolution,” says Fernandes. “And some of them have complex and intricate properties that make them really important as potential new antibiotics.”
Bee honey is also known for its antimicrobial properties, but bees produce it very differently than the ants do. Bees collect nectar from flowers, which they regurgitate at the hive and pack into the hexagonal honeycombs they build for storage. As they do so, they also add into the mix an enzyme called glucose oxidase, produced by their glands. The enzyme oxidizes glucose in the nectar, producing hydrogen peroxide, a reactive molecule that destroys bacteria and acts as a natural preservative. After the bees pack the honey into the honeycombs, they fan it with their wings to evaporate the water. Once a honeycomb is full, the bees cap it with beeswax, and the honey stays well-preserved, thanks to the enzymatic action, until the bees need it.
Less is known about the chemistry of the ants’ honey-making. Similarly to bees, they collect nectar. They also collect the sweet sap of the mulga tree. Additionally, they “milk” aphids, small sap-sucking insects that live on the tree: when the ants tickle the aphids with their antennae, the aphids release a sweet substance, which the ants also carry back to their colonies. That’s where the honey management difference becomes really pronounced. The ants don’t build any kind of structure to store their honey. Instead, they store it in themselves.
The workers feed their harvest to fellow ants called repletes, stuffing them up to the point that their swollen bellies outgrow the ants themselves and come to look like amber-colored honeypots, hence the name. Because of their size, repletes don’t move but hang from the chamber’s ceiling, acting as living food stores. When food becomes scarce, they regurgitate their reserves to their colony’s brethren. It’s not clear whether the repletes die afterwards or can be refilled. “That’s a good question,” Dong says. “After they’ve been stretched, they can’t really return to exactly the same shape.”
These replete ants are the “treat” the Tjupan women dug for. Once they saw the round-belly ants inside the chambers, they would reach in carefully and get a few scoops of them. “You see a lot of honeypot ants just hanging on the roof of the little openings,” says Ulrich’s mother, Edie Ulrich. The women would share the ants with family members who would eat them one by one. “They're very delicate,” shares Edie Ulrich—you have to take them out carefully, so they don’t accidentally pop and become a wasted resource. “Because you’d lose all this precious honey.”
Dong stumbled upon the honeypot ant phenomenon because he was interested in Indigenous foods and went on Ulrich’s tour. He quickly became fascinated with the insects and their role in Indigenous culture. “The honeypot ants are culturally revered by the Indigenous people,” he says. Eventually he decided to test the honey’s medicinal qualities.
To do this, the two scientists first diluted the ant honey with water. “We used something called doubling dilutions, which means that we made 32 percent dilutions, and then we halve that to 16 percent and then we halve that to eight percent,” explains Fernandes. The goal was to obtain as many results as possible from the meager amount of honey they had. “We had very, very little of the honeypot ant honey so we wanted to maximize the spectrum of results we can get without wasting too much of the sample.”
After that, the researchers grew different microbes inside a nutrient-rich broth. They added the broth to the different honey dilutions and incubated the mixtures for a day or two at a temperature favorable to the germs’ growth. If the resulting solution turned turbid, it was a sign that the bugs had proliferated. If it stayed clear, it meant that the honey had destroyed them. The researchers were surprised to see that even the smallest, eight percent concentration of honey was able to arrest the growth of S. aureus. “It was really quite amazing,” Fernandes says. “Eight milliliters of honey in 92 milliliters of water is a really tiny amount of honey compared to the amount of water.”
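For readers who want to see the arithmetic, here is a minimal sketch in Python of the doubling-dilution series and the turbidity readout described above. The starting concentration and the example readout are illustrative only, not the researchers’ actual lab protocol or data.

```python
# Minimal sketch of a doubling-dilution assay readout.
# Values and results are hypothetical illustrations, not the study's data.

def doubling_dilutions(start_percent: float, steps: int) -> list[float]:
    """Return a series of honey concentrations, halving at each step."""
    series = []
    concentration = start_percent
    for _ in range(steps):
        series.append(concentration)
        concentration /= 2  # halve the concentration for the next dilution
    return series

def interpret_wells(turbidity: dict[float, bool]) -> dict[float, str]:
    """Map each concentration to a readout: turbid means growth, clear means inhibited."""
    return {
        conc: "growth (turbid)" if is_turbid else "inhibited (clear)"
        for conc, is_turbid in turbidity.items()
    }

if __name__ == "__main__":
    series = doubling_dilutions(32.0, 3)  # [32.0, 16.0, 8.0] percent honey
    # Hypothetical readout: S. aureus failed to grow even at 8 percent honey.
    readout = {32.0: False, 16.0: False, 8.0: False}
    print(series)
    print(interpret_wells(readout))
```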
Similar to bee honey, the ants’ honey exhibited some peroxide antimicrobial activity, researchers found, but given how little peroxide was in the solution, they think the honey also kills germs by a different mechanism. “When we measured, we found that [the solution] did have some hydrogen peroxide, but it didn't have as much of it as we would expect based on how active it was,” Fernandes says. “Whether this hydrogen peroxide also comes from glucose oxidase or whether it's produced by another source, we don't really know,” she adds. The research team does have some hypotheses about the identity of this other germ-killing agent. “We think it is most likely some kind of antimicrobial peptide that is actually coming from the ant itself.”
The honey also showed very strong activity against the two types of fungi, Cryptococcus and Aspergillus. Both fungi are associated with trees and decaying leaves, as well as with the soils where the ants live, so the insects have likely evolved some natural defense compounds, which end up in the honey.
It wouldn’t be the first time that modern medicines have taken their origin from the natural world or from Indigenous peoples’ knowledge. The bark of the cinchona tree native to South America contains quinine, a substance that treats malaria. The Indigenous people of the Andes used the bark to quell fever and chills for generations, and when Europeans began to fall ill with malaria in the Amazon rainforest, they learned to use that medicine from the Andean people.
The wonder drug aspirin similarly takes its origin from the bark of a tree, in this case the willow.
Even some anticancer compounds originated in nature. The chemotherapy drug paclitaxel was originally extracted from the Pacific yew tree, Taxus brevifolia. Samples of Pacific yew bark were first collected in 1962 by researchers from the United States Department of Agriculture who were looking for natural compounds that might have anti-tumor activity. In December 1992, the FDA approved paclitaxel (brand name Taxol) for the treatment of ovarian cancer, and two years later for breast cancer.
In an era when the world is struggling to find new medicines fast enough to avert a fungal or bacterial pandemic, these discoveries could pave the way to new therapeutics. “I think it's really important to listen to Indigenous cultures and to take their knowledge because they have been using these sources for a really, really long time,” Fernandes says. Now we know the honey works, so science can elucidate the molecular mechanisms behind it, she adds. “And maybe it can even provide a lead for us to develop some kind of new treatments in the future.”
Lina Zeldovich has written about science, medicine and technology for Popular Science, Smithsonian, National Geographic, Scientific American, Reader’s Digest, the New York Times and other major national and international publications. A Columbia J-School alumna, she has won several awards for her stories, including the ASJA Crisis Coverage Award for Covid reporting, and has been a contributing editor at Nautilus Magazine. In 2021, Zeldovich released her first book, The Other Dark Matter, published by the University of Chicago Press, about the science and business of turning waste into wealth and health. You can find her on http://linazeldovich.com/ and @linazeldovich.
Blood Test Can Detect Lymphoma Cells Before a Tumor Grows Back
When David M. Kurtz was doing his clinical fellowship at Stanford University Medical Center in 2009, specializing in lymphoma treatments, he found himself grappling with a question no one could answer. A typical regimen for these blood cancers prescribed six cycles of chemotherapy, but no one knew why. "The number seemed to be drawn out of a hat," Kurtz says. Some patients felt much better after just two doses, but had to endure the toxic effects of the entire course. For some elderly patients, the side effects of chemo are so harsh, they alone can kill. Others appeared to be cancer-free on CT scans after the requisite six cycles but then succumbed to the disease months later.
"Anecdotally, one patient decided to stop therapy after one dose because he felt it was so toxic that he opted for hospice instead," says Kurtz, now an oncologist at the center. "Five years down the road, he was alive and well. For him, just one dose was enough." Others would return for their one-year check up and find that their tumors grew back. Kurtz felt that while CT scans and MRIs were powerful tools, they weren't perfect ones. They couldn't tell him if there were any cancer cells left, stealthily waiting to germinate again. The scans only showed the tumor once it was back.
Blood cancers claim about 68,000 people a year, with a new diagnosis made about every three minutes, according to the Leukemia Research Foundation. For patients with B-cell lymphoma, which Kurtz focuses on, the survival chances are better than for some others. About 60 percent are cured, but the remaining 40 percent will relapse, possibly because they have a negative CT scan yet still harbor malignant cells. "You can't see this on imaging," says Michael Green, who also treats blood cancers at the University of Texas MD Anderson Cancer Center.
Kurtz wanted a better diagnostic tool, so he started working on a blood test that could capture the circulating tumor DNA, or ctDNA. For that, he needed to identify the specific mutations typical of B-cell lymphomas. Working together with Jake Chabon, then a PhD student, Kurtz finally zeroed in on the tumor's genetic "appearance" in 2017: a pair of specific mutations sitting in close proximity to each other, a rare and telling sign. The human genome contains about 3 billion base pairs of nucleotides, the molecules that compose genes, and in the case of B-cell lymphoma cells these two mutations were only a few base pairs apart. "That was the moment when the light bulb went on," Kurtz says.
The duo formed a company named Foresight Diagnostics, focused on taking the blood test to the clinic. But knowing the tumor's mutational signature was only half the process. The other half was fishing the tumor's DNA out of the patient's bloodstream, which contains millions of other DNA molecules, explains Chabon, now Foresight's CEO. It would be like looking for an escaped criminal in a large crowd. Kurtz and Chabon solved the problem by taking the tumor's "mug shot" first. Doctors would take a biopsy before treatment and sequence the tumor, as if taking the criminal's photo. After treatment, they would match the "mug shot" against all the DNA molecules derived from the patient's blood sample to see if any molecular criminals had managed to escape the chemo.
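To make the "mug shot" analogy concrete, here is a minimal sketch of the matching idea, assuming the tumor's signature is a pair of nearby single-base mutations and that each sequencing read from the blood sample spans both positions. The positions, bases, and reads below are invented for illustration; this is not Foresight's actual pipeline.

```python
# Minimal sketch of "mug shot" matching: count a read as tumor-derived only if
# it carries BOTH nearby mutations found in the pre-treatment biopsy.
# All positions, bases, and reads are hypothetical illustrations.

from dataclasses import dataclass

@dataclass
class Mutation:
    position: int  # 0-based position within the read/reference window
    alt_base: str  # the tumor-specific base at that position

def read_matches_signature(read: str, signature: tuple[Mutation, Mutation]) -> bool:
    """True if the read carries both tumor-specific bases of the signature."""
    return all(
        m.position < len(read) and read[m.position] == m.alt_base
        for m in signature
    )

def count_tumor_reads(reads: list[str], signature: tuple[Mutation, Mutation]) -> int:
    """Count reads from a blood sample that match the paired-mutation signature."""
    return sum(read_matches_signature(r, signature) for r in reads)

if __name__ == "__main__":
    # Hypothetical signature: two mutations only a few bases apart.
    signature = (Mutation(position=3, alt_base="T"), Mutation(position=7, alt_base="A"))
    reads = [
        "ACGTACGATG",  # carries both mutations -> counted as tumor-derived
        "ACGAACGCTG",  # carries neither
        "ACGTACGCTG",  # carries only one -> not counted
    ]
    print(count_tumor_reads(reads, signature))  # prints 1
```

The idea is that requiring both mutations on the same DNA molecule makes a chance match far less likely than matching any single mutation, which is what makes the paired signature such a rare and telling sign.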
Foresight isn't the only company working on blood-based tumor detection tests, dubbed liquid biopsies; other companies such as Natera and ArcherDx have developed their own. But in a recent study, the Foresight team showed that their method is significantly more sensitive in "fishing out" the cancer molecules than existing tests. Chabon says that this test can detect circulating tumor DNA in concentrations nearly 100 times lower than other methods can. Put another way, it's sensitive enough to spot one cancerous perpetrator amongst one million other DNA molecules.
"It increases the sensitivity of detection and really catches most patients who are going to progress," says Green, the University of Texas oncologist who wasn't involved in the study, but is familiar with the method. It would also allow monitoring patients during treatment and making better-informed decisions about which therapy regimens would be most effective. "It's a minimally invasive test," Green says, and "it gives you a very high confidence about what's going on."
Having shown that the test works well, Kurtz and Chabon are planning a new trial in which oncologists would rely on their method to decide when to stop or continue chemo. They also aim to extend their test to detect other malignancies such as lung, breast or colorectal cancers. Using the latest genome sequencing technologies, researchers have sequenced and catalogued over 2,500 different tumor specimens, and the Foresight team is analyzing this data, says Chabon, which gives the team the opportunity to create more molecular "mug shots."
The team hopes that their blood cancer test will become available to patients within about five years, making doctors' jobs easier, and not only at the biological level. "When I tell patients, 'Good news, your cancer is in remission,' they ask me, 'Does it mean I'm cured?'" Kurtz says. "Right now I can't answer this question because I don't know—but I would like to." His company's test, he hopes, will enable him to reply with certainty. He'd very much like to have the power of that foresight.
This article is republished from our archives to coincide with Blood Cancer Awareness Month, which highlights progress in cancer diagnostics and treatment.
Lina Zeldovich has written about science, medicine and technology for Popular Science, Smithsonian, National Geographic, Scientific American, Reader’s Digest, the New York Times and other major national and international publications. A Columbia J-School alumna, she has won several awards for her stories, including the ASJA Crisis Coverage Award for Covid reporting, and has been a contributing editor at Nautilus Magazine. In 2021, Zeldovich released her first book, The Other Dark Matter, published by the University of Chicago Press, about the science and business of turning waste into wealth and health. You can find her on http://linazeldovich.com/ and @linazeldovich.