To Make Science Engaging, We Need a Sesame Street for Adults
This article is part of the magazine, "The Future of Science In America: The Election Issue," co-published by LeapsMag, the Aspen Institute Science & Society Program, and GOOD.
In the mid-1960s, a documentary producer in New York City wondered if the addictive jingles, clever visuals, slogans, and repetition of television ads—the ones that were captivating young children of the time—could be harnessed for good. Over the course of three months, she interviewed educators, psychologists, and artists, and the result was a bonanza of ideas.
Perhaps a new TV show could teach children letters and numbers in short animated sequences? Perhaps adults and children could read together with puppets providing comic relief and prompting interaction from the audience? And because it would be broadcast through a device already in almost every home, perhaps this show could reach across socioeconomic divides and close an early education gap?
Soon after Joan Ganz Cooney shared her landmark report, "The Potential Uses of Television in Preschool Education," in 1966, she was prototyping show ideas, attracting funding from the Carnegie Corporation, the Ford Foundation, and the Corporation for Public Broadcasting, and co-founding the Children's Television Workshop with psychologist Lloyd Morrisett. And then, on November 10, 1969, informal learning was transformed forever with the premiere of Sesame Street on public television.
For its first season, Sesame Street won three Emmy Awards and a Peabody Award. Its star, Big Bird, landed on the cover of Time Magazine, which called the show "TV's gift to children." Fifty years later, it's hard to imagine an approach to informal preschool learning that isn't Sesame Street.
And that approach can be boiled down to one word: Entertainment.
Despite decades of evidence from Sesame Street—one of the most studied television shows of all time—and a growing body of research from social science, psychology, and media communications, we haven't taken Ganz Cooney's concepts to heart in educating adults. Adults have news programs, documentaries, and educational YouTube channels, but no Sesame Street. So why not make one? Here's how we can design a new kind of television to make science engaging and accessible for a public that is all too often intimidated by it.
We have to start from the realization that America is a nation of high-school graduates. By the end of high school, many students have decided to abandon science because they think it's too difficult, and as a nation, we've made it acceptable for any one of us to say "I'm not good at science" and offload thinking to those who might be. So is it surprising that so many Americans believe in conspiracy theories, like the 25% who believe the release of COVID-19 was planned, the one in ten who believe the Moon landing was a hoax, or the 30–40% who think the condensation trails of planes are actually nefarious chemtrails? If we're meeting people where they are, the aim can't be to take the audience from an A to an A+, but from an F to a D, without judging where they start.
There's also a natural compulsion for a well-meaning educator to fill a literacy gap with a barrage of information, but this is what I call "factsplaining," and we know it doesn't work. Worse, it can backfire. In one 2014 study, parents were given factual information about vaccine safety, and the group that was already most averse to vaccines became even more averse.
Why? Our social identities and cognitive biases are stubborn gatekeepers when it comes to processing new information. We filter ideas through pre-existing beliefs—our values, our religions, our political ideologies. Incongruent ideas are rejected. Congruent ideas, no matter how absurd, are allowed through. We hear what we want to hear, and then our brains justify the input by creating narratives that preserve our identities. Even when we have all the facts, we can use them to support any worldview.
But social science has revealed many mechanisms for hijacking these processes through narrative storytelling, and this can form the foundation of a new kind of educational television.
As media creators, we can reject factsplaining and instead construct entertaining narratives that disrupt these cognitive processes. Research going back two decades tells us that when people are immersed in entertaining fiction narratives, they loosen their defenses, opening a path for new information to take hold, attitudes to shift, and new behaviors to emerge. Where news about hot-button issues like climate change or vaccination might trigger resistance or a backfire effect, fiction can be crafted to be absorbing and, as a result, persuasive.
But the narratives can't be stuffed with information; they must be simplified. If this feels like the opposite of what an educator should do, consider "exemplification," a framing device that reduces the complexity of information, without oversimplifying it, by telling the stories of individuals in specific circumstances, stories that speak to the greater issue without needing to explain it all. It's a technique you've seen used in biopics. The Discovery Channel true-crime miniseries Manhunt: Unabomber does many things well from a science storytelling perspective, including exemplifying the virtues of the scientific method through a character who argues for a new field of science, forensic linguistics, to catch one of the most notorious domestic terrorists in U.S. history.
We must also appeal to the audience's curiosity. We know curiosity is such a strong driver of human behavior that it can even counteract the biases put up by one's political ideology around subjects like climate change. If we treat science information like a product—and we should—advertising research tells us we can maximize curiosity through a Goldilocks effect. If the information is too complex, your show might as well be a PowerPoint presentation. If it's too simple, it's Sesame Street. There's a sweet spot for creating intrigue about new information when there's a moderate cognitive gap.
The science of "identification" tells us that the more endearing a main character is to a viewer, the more likely the viewer is to adopt the character's worldview and journey of change. This insight gives us a further incentive to craft characters who reflect our audiences. If we accept our biases for what they are, we can understand why the messenger becomes more important than the message: without an appropriate messenger, the message becomes faint and ineffective. And research confirms that the stereotype-busting doctor-skeptic Dana Scully of The X-Files, a popular science-fiction series, was an inspiration for a generation of women who pursued science careers.
With these directions, we can start making a new kind of television. But is television itself still the right delivery medium? Americans do spend six hours per day—a quarter of their lives—watching video. And even with the rise of social media and apps, science-themed television shows remain popular, with four out of five adults reporting that they watch shows about science at least sometimes. CBS's The Big Bang Theory was the most-watched show on television in the 2017–2018 season, and Cartoon Network's Rick and Morty is the most popular comedy series among millennials. And medical and forensic dramas continue to be broadcast staples. So yes, it's as true today as it was in the 1980s, when George Gerbner, the "cultivation theory" researcher who studied the long-term impacts of television images, wrote, "a single episode on primetime television can reach more people than all science and technology promotional efforts put together."
We know from cultivation theory that media images can shape our views of scientists. Quick, picture a scientist! Was it an old, white man with wild hair in a lab coat? Since most Americans never encounter research science firsthand, media dictates how we perceive science and scientists. Characters like Sheldon Cooper and Rick Sanchez become the model. But we can correct that by representing professionals more accurately on-screen and writing characters more like Dana Scully.
Could new television series establish the baseline narratives for novel science like gene editing, quantum computing, or artificial intelligence? Or could new series counter the misinfodemics surrounding COVID-19 and vaccines through more compelling, corrective narratives? Social science has given us a blueprint suggesting they could. Binge-watching a show like the surreal NBC sitcom The Good Place doesn't replace a Ph.D. in philosophy, but its use of humor plants the seed of continued interest in a new subject. The goal of persuasive entertainment isn't to replace formal education, but it can inspire, shift attitudes, increase confidence in the knowledge of complex issues, and otherwise prime viewers for continued learning.
The Friday Five: A new blood test to detect Alzheimer's
The Friday Five covers five stories in research that you may have missed this week. There are plenty of controversies and troubling ethical issues in science, and we get into many of them in our online magazine, but this news roundup focuses on scientific creativity and progress to give you a therapeutic dose of inspiration headed into the weekend.
Here are the promising studies covered in this week's Friday Five:
- A blood test to detect Alzheimer's
- War vets can take their psychologist wherever they go
- Does intermittent fasting affect circadian rhythms?
- A new year's resolution for living longer
- 3-D printed eyes?
Staying well in the 21st century is like playing a game of chess
This article originally appeared in One Health/One Planet, a single-issue magazine that explores how climate change and other environmental shifts are increasing vulnerabilities to infectious diseases by land and by sea. The magazine probes how scientists are making progress with leaders in other fields toward solutions that embrace diverse perspectives and the interconnectedness of all lifeforms and the planet.
On July 30, 1999, the Centers for Disease Control and Prevention published a report comparing data on the control of infectious disease from the beginning of the 20th century to the end. The data showed that deaths from infectious diseases had declined markedly. In the early 1900s, pneumonia, tuberculosis and diarrheal diseases were the three leading killers, accounting for one-third of total deaths in the U.S.—with 40 percent of those deaths occurring among children under five.
Mass vaccinations, the discovery of antibiotics and overall sanitation and hygiene measures eventually eradicated smallpox, beat down polio, curbed cholera, nearly rid the world of tuberculosis and extended U.S. life expectancy by 25 years. By 1997, population health in the U.S. had shifted: cancer, diabetes and heart disease were now the leading causes of death.
The control of infectious diseases is considered to be one of the “10 Great Public Health Achievements.” Yet on the brink of the 21st century, new trouble was already brewing. Hospitals were seeing periodic cases of antibiotic-resistant infections. Novel viruses, or those that previously didn’t afflict humans, began to emerge, causing outbreaks of West Nile, SARS, MERS or swine flu. In the years that followed, tuberculosis made a comeback, at least in certain parts of the world. What we didn’t take into account was the very concept of evolution: as we built better protections, our enemies eventually boosted their attacking prowess, so soon enough we found ourselves on the defensive once again.
At the same time, new, previously unknown or extremely rare disorders began to rise, such as autoimmune or genetic conditions. Two decades later, scientists began thinking about health differently—not as a static achievement guaranteed to last, but as something dynamic and constantly changing—and sometimes, for the worse.
What emerged since then is a different paradigm, one that makes our interactions with the microbial world more like a biological chess match, says Victoria McGovern, a biochemist and program officer for the Burroughs Wellcome Fund’s Infectious Disease and Population Sciences Program. In this chess game, humans may make a clever strategic move, such as creating a new vaccine or a potent antibiotic, but that advantage is fleeting. At some point, the organisms we are up against respond with a move of their own, such as developing resistance to a medication or acquiring genetic mutations that make them better at attacking our bodies. Simply eradicating the “opponent,” the pathogenic microbes, as efficiently as possible isn’t enough to keep humans healthy long-term.
Instead, scientists should focus on studying the complexity of interactions between humans and their pathogens. “We need to better understand the lifestyles of things that afflict us,” McGovern says. “The solutions are going to be in understanding various parts of their biology so we can influence how they behave around our systems.”
What is being proposed will require a pivot to basic biology and other disciplines that have suffered from a lack of research funding in recent years. Yet, according to McGovern, the teams whose proposals win funding are those answering bigger questions. “We look for people exploring questions about hosts and pathogens, and what happens when they touch, but we’re also looking for people with big ideas,” she says. For example, if one specific infection causes a chain of pathological events in the body, can other infections cause them too? And if we find a way to break that chain for one pathogen, can we play the same trick on another? “We really want to see people thinking of not just one experiment but about big implications of their work,” McGovern says.
Jonah Cool, a cell biologist, geneticist and science officer at the Chan Zuckerberg Initiative, says that it’s necessary to define what constitutes a healthy organism and how it overcomes infections or environmental assaults, such as pollution from forest fires or toxins from industrial smokestacks. An organism that catches a disease isn’t necessarily an unhealthy one, as long as it fights it off successfully—an ability that arises from the complex interplay of its genes, the immune system, age, stress levels and other factors. Modern science allows many of these factors to be measured, recorded and compared. “We need a data-driven, deep-phenotyping approach to defining healthy biological systems and their responses to insults—which can be infectious disease or environmental exposures—and their ability to navigate their way through that space,” Cool says.
Genetics and cell biology, combined with imaging techniques that allow one to see tissues and individual cells in action, will enable scientists to define and quantify what it means to be healthy at the molecular level. “As a geneticist and cell biologist, I believe in all these molecular underpinnings and how they arise in phenotypic differences in cells, genes, proteins—and how their combinations form complex cellular states,” Cool says.
Julie Graves, a physician, public health consultant and former adjunct professor of management, policy and community health at the University of Texas Health Science Center in Houston, stresses the necessity of nutritious diets. According to the Rockefeller Food Initiative, “poor diet is the leading risk factor for disease, disability and premature death in the majority of countries around the world.” Adequate nutrition is critical for maintaining human health and life. Yet, Western diets are often low in essential nutrients, high in calories and heavy on processed foods. Overconsumption of these foods has contributed to high rates of obesity and chronic disease in the U.S. In fact, according to the 2018 National Health Interview Survey, more than half of American adults have at least one chronic disease, and 27 percent have more than one—conditions that also increase vulnerability to COVID-19.
Further, the contamination of our food supply with various agricultural and industrial toxins—petrochemicals, pesticides, PFAS and others—has implications for morbidity, mortality, and overall quality of life. “These chemicals are insidiously in everything, including our bodies,” Graves says—and they are interfering with our normal biological functions. “We need to stop how we manufacture food,” she adds, and rid our sustenance of these contaminants.
According to the Humane Society of the United States, factory farms account for nearly 40 percent of methane emissions. Concentrated animal feeding operations, or CAFOs, may serve as breeding grounds for pandemics, scientists warn, so humans should research better ways to raise and treat livestock. Diego Rose, a professor of food and nutrition policy at Tulane University School of Public Health & Tropical Medicine, and his colleagues found that “20 percent of Americans’ diets account for about 45 percent of the environmental impacts [that come from food].” A subsequent study explored the impacts of specific foods and found that replacing beef with chicken lowers an individual’s carbon footprint by nearly 50 percent and decreases water usage by 30 percent. Notably, eating too much red meat has also been associated with a variety of illnesses.
In some communities, the option to swap food types is limited or impossible. For example, “many populations live in relative food deserts where there’s not a local grocery store that has any fresh produce,” says Louis Muglia, the president and CEO of Burroughs Wellcome. Individuals in these communities suffer from an insufficient intake of beneficial nutrients, and they’re “probably being exposed to phenols and other toxins that are in the packaging.” An equitable, sustainable and nutritious food supply will be vital to humanity’s wellbeing in the era of climate change, unpredictable weather and spillover events.
A recent report by See Change Institute and the Climate Mental Health Network showed that people experiencing socioeconomic inequalities, including many people of color, contribute the least to climate change, yet are impacted the most. For example, people in low-income communities are disproportionately exposed to vehicle emissions, Muglia says. Through its Climate Change and Human Health Seed Grants program, Burroughs Wellcome funds research that aims to understand how various factors related to climate change and environmental chemicals contribute to premature births, which are associated with health vulnerabilities over the course of a person’s life, and to map such hot spots.
“It’s very complex, the combinations of socio-economic environment, race, ethnicity and environmental exposure, whether that’s heat or toxic chemicals,” Muglia explains. “Disentangling those things really requires a very sophisticated, multidisciplinary team. That’s what we’ve put together to describe where these hotspots are and see how they correlate with different toxin exposure levels.”
In addition to mapping the risks, researchers are developing novel therapeutics that will be crucial additions to our arsenal, but we will have to be smarter about designing and using them. We will need more potent, better-working monoclonal antibodies. Instead of directly attacking a pathogen, we may have to learn to stimulate the immune system, training it to fight the disease-causing microbes on its own. And rather than indiscriminately killing all bacteria with broad-spectrum drugs, we will need more targeted medications. “Instead of wiping out the entire gut flora, we will need to come up with ways that kill harmful bacteria but not healthy ones,” Graves says. Training our immune systems to recognize and react to pathogens by way of vaccination will keep us ahead of our biological opponents, too. “Continued development of vaccines against infectious diseases is critical,” says Graves.
With all of the unpredictable events that lie ahead, it is difficult to foresee what achievements in public health will be reported at the end of the 21st century. Yet technological advances, better modeling and the pursuit of bigger questions in science, along with education and close work with communities, will help overcome the challenges. The Chan Zuckerberg Initiative displays an optimistic message on its website: “Is it possible to cure, prevent, or manage all diseases by the end of this century? We think so.” Cool shares the view of his employer—and believes that science can get us there. Just give it some time and a chance. “It’s a big, bold statement,” he says, “but the end of the century is a long way away.”

Lina Zeldovich has written about science, medicine and technology for Popular Science, Smithsonian, National Geographic, Scientific American, Reader’s Digest, the New York Times and other major national and international publications. A Columbia J-School alumna, she has won several awards for her stories, including the ASJA Crisis Coverage Award for Covid reporting, and has been a contributing editor at Nautilus Magazine. In 2021, Zeldovich released her first book, The Other Dark Matter, published by the University of Chicago Press, about the science and business of turning waste into wealth and health. You can find her at http://linazeldovich.com/ and @linazeldovich.