To Make Science Engaging, We Need a Sesame Street for Adults
This article is part of the magazine, "The Future of Science In America: The Election Issue," co-published by LeapsMag, the Aspen Institute Science & Society Program, and GOOD.
In the mid-1960s, a documentary producer in New York City wondered if the addictive jingles, clever visuals, slogans, and repetition of television ads—the ones that were captivating young children of the time—could be harnessed for good. Over the course of three months, she interviewed educators, psychologists, and artists, and the result was a bonanza of ideas.
Perhaps a new TV show could teach children letters and numbers in short animated sequences? Perhaps adults and children could read together with puppets providing comic relief and prompting interaction from the audience? And because it would be broadcast through a device already in almost every home, perhaps this show could reach across socioeconomic divides and close an early education gap?
Soon after Joan Ganz Cooney shared her landmark report, "The Potential Uses of Television in Preschool Education," in 1966, she was prototyping show ideas, attracting funding from the Carnegie Corporation, the Ford Foundation, and the Corporation for Public Broadcasting, and co-founding the Children's Television Workshop with psychologist Lloyd Morrisett. And then, on November 10, 1969, informal learning was transformed forever with the premiere of Sesame Street on public television.
For its first season, Sesame Street won three Emmy Awards and a Peabody Award. Its star, Big Bird, landed on the cover of Time magazine, which called the show "TV's gift to children." Fifty years later, it's hard to imagine an approach to informal preschool learning that isn't shaped by Sesame Street.
And that approach can be boiled down to one word: Entertainment.
Despite decades of evidence from Sesame Street—one of the most studied television shows of all time—and further research from social science, psychology, and media communications, we haven't yet taken Ganz Cooney's concepts to heart in educating adults. Adults have news programs, documentaries, and educational YouTube channels, but no Sesame Street. So why not make one? Here's how we can design a new kind of television to make science engaging and accessible for a public that is all too often intimidated by it.
We have to start from the realization that America is a nation of high-school graduates. By the end of high school, many students have abandoned science because they think it's too difficult, and as a nation, we've made it acceptable for any one of us to say "I'm not good at science" and offload that thinking to the ones who might be. Is it surprising, then, that so many Americans believe in conspiracy theories: the 25 percent who believe the release of COVID-19 was planned, the one in ten who believe the Moon landing was a hoax, or the 30 to 40 percent who think the condensation trails of planes are actually nefarious chemtrails? If we're meeting people where they are, the aim can't be to take the audience from an A to an A+, but from an F to a D, and without judgment of where they started.
There's also a natural compulsion for a well-meaning educator to fill a literacy gap with a barrage of information, but this is what I call "factsplaining," and we know it doesn't work. Worse, it can backfire. In one study from 2014, parents were given factual information about vaccine safety, and the group that was already most averse to vaccines became even more averse.
Why? Our social identities and cognitive biases are stubborn gatekeepers when it comes to processing new information. We filter ideas through pre-existing beliefs—our values, our religions, our political ideologies. Incongruent ideas are rejected. Congruent ideas, no matter how absurd, are allowed through. We hear what we want to hear, and then our brains justify the input by creating narratives that preserve our identities. Even when we have all the facts, we can use them to support any worldview.
But social science has revealed many mechanisms for hijacking these processes through narrative storytelling, and this can form the foundation of a new kind of educational television.
As media creators, we can reject factsplaining and instead construct entertaining narratives that disrupt these cognitive processes. Two decades of research tell us that when people are immersed in entertaining fictional narratives, they lower their defenses, opening a path for new information to take hold, attitudes to shift, and new behaviors to emerge. Where news about hot-button issues like climate change or vaccination might trigger resistance or a backfire effect, fiction can be crafted to be absorbing and, as a result, persuasive.
But the narratives can't be stuffed with information; they must be simplified. If this feels like the opposite of what an educator should be doing, consider "exemplification": a framing device that reduces the complexity of information, without oversimplifying it, by telling the stories of individuals in specific circumstances that speak to the greater issue without needing to explain it all. It's a technique you've seen used in biopics. The Discovery Channel true-crime miniseries Manhunt: Unabomber does many things well from a science-storytelling perspective, including exemplifying the virtues of the scientific method through a character who argues for a new field of science, forensic linguistics, to catch one of the most notorious domestic terrorists in U.S. history.
We must also appeal to the audience's curiosity. Curiosity is such a strong driver of human behavior that it can even counteract the biases raised by one's political ideology around subjects like climate change. If we treat science information like a product—and we should—advertising research tells us we can maximize curiosity through a Goldilocks effect. If the information is too complex, your show might as well be a PowerPoint presentation. If it's too simple, it's Sesame Street. The sweet spot for creating intrigue about new information is a moderate cognitive gap.
The science of "identification" tells us that the more endearing a main character is to a viewer, the more likely the viewer is to adopt the character's worldview and journey of change. This insight gives us further incentive to craft characters who reflect our audiences. If we accept our biases for what they are, we can understand why the messenger becomes more important than the message: without an appropriate messenger, the message becomes faint and ineffective. And research confirms that the stereotype-busting doctor-skeptic Dana Scully of The X-Files, the popular science-fiction series, inspired a generation of women to pursue science careers.
With these directions, we can start making a new kind of television. But is television itself still the right delivery medium? Americans do spend six hours per day—a quarter of their lives—watching video. And even with the rise of social media and apps, science-themed television shows remain popular, with four out of five adults reporting that they watch shows about science at least sometimes. CBS's The Big Bang Theory was the most-watched show on television in the 2017–2018 season, and Cartoon Network's Rick and Morty is the most popular comedy series among millennials. And medical and forensic dramas continue to be broadcast staples. So yes, it's as true today as it was in the 1980s, when George Gerbner, the "cultivation theory" researcher who studied the long-term impacts of television images, wrote, "a single episode on primetime television can reach more people than all science and technology promotional efforts put together."
We know from cultivation theory that media images can shape our views of scientists. Quick, picture a scientist! Was it an old, white man with wild hair in a lab coat? If most Americans don't encounter research science firsthand, it's media that dictates how we perceive science and scientists. Characters like Sheldon Cooper and Rick Sanchez become the model. But we can correct that by representing professionals more accurately on-screen and writing characters more like Dana Scully.
Could new television series establish the baseline narratives for novel science like gene editing, quantum computing, or artificial intelligence? Or could new series counter the misinfodemics surrounding COVID-19 and vaccines through more compelling, corrective narratives? Social science has given us a blueprint suggesting they could. Binge-watching a show like the surreal NBC sitcom The Good Place doesn't replace a Ph.D. in philosophy, but its use of humor plants the seed of continued interest in a new subject. The goal of persuasive entertainment isn't to replace formal education, but it can inspire, shift attitudes, increase confidence in the knowledge of complex issues, and otherwise prime viewers for continued learning.
Like any life-threatening medical condition that affects children, food allergies can traumatize more than just the patient. My wife and I learned this one summer afternoon when our daughter was three years old.
At an ice cream parlor, I gave Samantha a lick of my pistachio cone; within seconds, red blotches erupted on her skin, her lips began to swell, and she complained that her throat felt funny. We rushed her to the nearest emergency room, where a doctor injected her with epinephrine. Explaining that the reaction, known as anaphylaxis, could have been fatal if left unchecked, he advised us to have her tested for nut allergies—and to start carrying an injector of our own.
After an allergist confirmed Sam's vulnerability to tree nuts and peanuts, we figured that keeping her safe would be relatively simple. But food allergies often come in bunches. Over the next year, she wound up back in the ER after eating bread with sesame seeds at an Italian restaurant, and again after slurping buckwheat noodles at our neighborhood Japanese restaurant. She hated eggs, so we discovered that (less severe) allergy only when she vomited after eating a variety of products containing them.
In recent years, a growing number of families have had to grapple with such challenges. An estimated 32 million Americans have food allergies, or nearly 10 percent of the population—10 times the prevalence reported 35 years ago. The severity of symptoms seems to be increasing, too. According to a study released in January by Food Allergy Research & Education (FARE), a Virginia-based nonprofit, insurance claims for anaphylactic food reactions rose 377 percent in the U.S. from 2007 to 2016.
Because food allergies most commonly emerge in childhood, these trends are largely driven by the young. An insurance-industry study found that emergency room visits for anaphylaxis in children more than doubled from 2010 to 2016. Peanut allergies, once rare, tripled in kids between 1997 and 2008. "The first year, it was 1 in 250," says Scott Sicherer, chief of pediatric allergy and immunology at New York City's Mount Sinai Hospital, who led that study. "When we did the next round of research, in 2002, it was 1 in 125. I thought there must be a mistake. But by 2008, it was 1 in 70."
The forces behind these dire statistics—as well as similar numbers throughout the developed world—have yet to be positively identified. But the leading suspects are elements of our modern lifestyle that can throw the immune system out of whack, prompting potentially deadly overreactions to harmless proteins. Although parents can take a few steps that might lessen their children's risk, societal changes may be needed to brighten the larger epidemiological picture.
Meanwhile, scientists are racing to develop therapies that can induce patients' hyped-up immune defenses to chill. And lately, they've made some big strides toward that goal.
A Variety of Culprits
In the United States, about 90 percent of allergic reactions come from eight foods: milk, eggs, peanuts, tree nuts, soy, wheat, fish, and shellfish. The list varies from country to country, depending on dietary customs, but what the trigger foods all have in common is proteins that can survive breakdown in the stomach and enter the bloodstream more or less intact.
"When we were kids, we played in the dirt. Today, children tend to be on their screens, inside sealed buildings."
A food allergy results from a chain of biochemical misunderstandings. The first time the immune system encounters an allergen (as a protein that triggers an allergy is known), it mistakes the substance for a hostile invader—perhaps a parasite with a similar molecular profile. In response, it produces an antibody called immunoglobulin E (IgE), which is designed to bind to a specific protein and flag it for attack. These antibodies circulate through the bloodstream and attach to immune-system foot soldiers known as mast cells and basophils, which congregate in the nose, throat, lungs, skin, and gastrointestinal tract.
The next time the person is exposed to the allergen, the IgE antibodies signal the warrior cells to blast the intruder with histamines and other chemical weapons. Tissues in the affected areas swell and leak fluid; blood pressure may fall. Depending on the strength of the reaction, collateral damage to the patient can range from unpleasant—itching, runny nose, nausea—to catastrophic.
This kind of immunological glitchiness runs in families. Genome-wide association studies have identified a dozen genes linked to allergies of all types, and twin studies suggest that about 80 percent of the risk of food allergies is heritable. But why one family member shows symptoms while another doesn't remains unknown. Nor can genetics explain why food allergy rates have skyrocketed in such a brief period. For that, we must turn to the environment.
First, it's important to note that rates of all allergies are rising—including skin and respiratory afflictions—though none as rapidly or with as much risk of anaphylaxis as those involving food. The takeoff was already underway in the late 1980s, when British epidemiologist David P. Strachan found that children in larger households had fewer instances of hay fever. The reason, he suggested, was that their immune systems were strengthened by exposure to their siblings' germs. Since then, other researchers have discerned more evidence for Strachan's "hygiene hypothesis": higher rates of allergy (as well as autoimmune disorders) in cities versus rural areas, in industrialized countries versus developing ones, in lab animals raised under sterile conditions versus those exposed to germs.
Fending off a variety of pathogens, experts theorize, helps train the immune system to better distinguish friend from foe, and to respond to threats in a more nuanced manner. In an era of increasing urbanization, shrinking family sizes, and more sheltered lifestyles, such conditioning may be harder to come by. "When we were kids, we played in the dirt," observes Cathryn R. Nagler, a professor and food allergy researcher at the University of Chicago. "Today, children tend to be on their screens, inside sealed buildings."
But other factors may be driving the allergy epidemic as well. More time indoors, for example, means less exposure to sunlight, which can lead to a deficiency in vitamin D—a nutrient crucial to immune system regulation. The growing popularity of processed foods filled with refined fats and sugars may play a role, along with rising rates of obesity, by promoting tissue inflammation that could increase some people's risk of immunological mayhem. And the surge in allergies also correlates with several trends that may be altering the human microbiome, the community of microbes (including bacteria, viruses, and fungi, among others) that inhabits our guts, skin, and bodily orifices.
The microbiome connection may be particularly relevant to food allergies. In 2014, a team led by Nagler published a landmark study showing that Clostridia, a common class of gut bacteria, protects against these allergies. When the researchers fed peanut allergens to germ-free mice (born and raised in sterile conditions) and to mice treated with antibiotics as newborns (reducing their gut bacteria), the animals showed a strong immunological response. This sensitization could be reversed, however, by reintroducing Clostridia—but not another class of bacteria, Bacteroides—into the mice. Further experiments revealed that Clostridia caused immune cells to produce high levels of interleukin-22 (IL-22), a signaling molecule known to decrease the permeability of the intestinal lining.
"In simple terms," Nagler says, "what we found is that these bacteria prevent food allergens from gaining access to the blood in an intact form that elicits an allergic reaction."
A growing body of evidence suggests that our eating habits are throwing our gut microbiota off-balance, in part by depriving helpful species of the dietary fiber they feed on. Our increasing exposure to antibiotics and antimicrobial compounds may be harming our beneficial bugs as well. These depletions could affect kids from the moment they enter the world: Because babies are seeded with their mothers' microbiota as they pass through the birth canal, they may be inheriting a less diverse microbiome than did previous generations. And the rising rate of caesarian deliveries may be further depriving our children of the bugs they need.
So which culprit is most responsible for the food allergy upsurge? "The illnesses that we're measuring are complex," says Sicherer. "There are multiple genetic inputs, which interact with one another, and there are multiple environmental inputs, which interact with each other and with the genes. There's not one single thing that's causing this. It's a conglomeration."
What Parents Can Do
For anyone hoping to reduce their child's or their own odds of developing a food allergy (rates of adult onset are also increasing), the current state of science offers few guideposts. As with many other areas of health research, it's hard to know when the data is solid enough to warrant a particular course of action. A case in point: the American Academy of Pediatrics once recommended that children at risk of allergy to peanuts (as evidenced by family history, other food allergies, or eczema) wait to eat them until age three; now, the AAP advises those parents to start their babies at four months, citing epidemiological evidence that early exposure may prevent peanut allergies.
And it's all too easy for a layperson to draw mistaken conclusions from media coverage of such research—inferring, for instance, that taking commercially available probiotics might have a protective effect. Unfortunately, says Nagler, none of those products even contain the relevant kind of bacteria.
Although, as a research scientist, she refrains from giving medical advice, Nagler does suggest (based on a large body of academic literature) that two measures are worth a try: increasing consumption of fiber, and reducing use of antimicrobial agents, from antibacterial cleaners to antibiotics. Yet she acknowledges that it's not always possible to avoid the suspected risk factors for food allergies. Sometimes an antibiotic is a lifesaving necessity, for example—and it's tough to avoid exposure to such drugs altogether, due to their use in animal feed and their consequent presence in many foods and in the water supply. If these chemicals are contributing to the food allergy epidemic, protecting ourselves will require action from farmers, doctors, manufacturers, and policymakers.
My family's experience illustrates the limits of healthy lifestyle choices in mitigating allergy risk. Neither my daughter nor my son was delivered by C-section, and both were breastfed, receiving maximum microbial seeding from their mother. As a family, we eat exemplary diets, and no one could describe our home as excessively clean. Yet one child can't taste nuts, sesame, or buckwheat without becoming dangerously ill. "You can do everything right and still have allergies," says Ian A. Myles, a staff clinician at the National Institute of Allergy and Infectious Diseases. "You can do everything wrong and not have allergies. The two groups overlap."
The Latest Science Shows Promise
But while preventing all food allergies is clearly unrealistic, researchers are making remarkable progress in developing better treatments—therapies that, instead of combating symptoms after they've started (like epinephrine or antihistamines), aim to make patients less sensitive to allergens in the first place. One promising approach is oral immunotherapy (OIT), in which patients consume small but slowly increasing amounts of an allergen, gradually reducing their sensitivity. A study published last year in the New England Journal of Medicine showed that an experimental OIT called AR101, consisting of a standardized peanut powder mixed into food, enabled 67 percent of participants to tolerate a dose equivalent to two peanut kernels—a potential lifesaver if they were accidentally exposed to the real thing.
Because OIT itself can trigger troublesome reactions in some patients, however, it's not for everyone. Another experimental treatment, sublingual immunotherapy (SLIT), uses an allergen solution or dissolving tablet placed beneath the tongue; although its results are less robust than OIT's, it seems to generate milder side effects. Epicutaneous immunotherapy (EPIT) avoids the mouth entirely, using a technology similar to a nicotine patch to deliver allergens through the skin. Researchers are also exploring the use of medications known as biologics, aiming to speed up the action of immunotherapies by suppressing IgE or targeting other immune-system molecules.
One downside of the immunotherapy approach is that in most cases the allergen must be taken indefinitely to maintain desensitization. To provide a potentially permanent fix, scientists are working on vaccines that use DNA or peptides (protein fragments) from allergens to reset patients' immune systems.
Nagler is attacking the problem from a different angle—one that starts with the microbiome. In a recent study, a follow-up to her peanut-allergy investigation, she and her colleagues found that Clostridia bacteria protect mice against milk allergy as well; they also identified a particular species responsible, known as Anaerostipes caccae. The bugs, the team determined, produce a short-chain fatty acid called butyrate, which modulates many immune activities crucial to maintaining a well-sealed gut.
These findings suggest that drugs based on microbial metabolites could help protect vulnerable individuals against a wide range of allergies. Nagler has launched a company, ClostraBio, to develop biotherapeutics based on this notion; she expects its first product, using synthetic butyrate, to be ready for clinical trials within the next two years.
My daughter could well be a candidate for such a medication. Sam, now 15, is a vibrant, resilient kid who handles her allergies with confidence and humor. Thanks to vigilance and luck (on her part as well as her parents'), she hasn't had another food-related ER visit in more than a decade; she's never had to use her EpiPen. Still, she says, she would welcome the arrival of a pill that could reduce the danger. "I've learned how to watch out for myself," she says. "But it would be nice not to have to be so careful."
Imagine eating a slice of cake for breakfast. It's deliciously indulgent, but instead of your blood sugar spiking, your body processes all that sweetness as a healthy high-protein meal. It may sound like sci-fi, but this scenario is not necessarily far off.
"People with diabetes could especially benefit because sweet proteins don't trigger a need for insulin."
The Lowdown
An award-winning agtech startup called Amai is developing "sweet proteins" modeled on the molecular structure of proteins that occur naturally in exotic fruits. These new sugar substitutes could potentially replace artificial sweeteners and help people who are trying to curb their sugar intake. People with diabetes could especially benefit, because sweet proteins don't trigger a need for insulin.
There is one sweet protein on the market today, thaumatin, but it's expensive, has a short shelf life, and is lacking in the taste department. Amai's proteins, by contrast, taste 70 to 100 percent identical to the sweet proteins found in nature. Once their molecular structure is designed on a sophisticated computing platform, the proteins are produced through fermentation, a process akin to brewing beer. They are non-GMO and more than 10,000 times sweeter than sugar, which means far less needs to be produced and used; by that math, a single gram of protein could deliver the sweetness of roughly 10 kilograms of sugar.
Diseases like diabetes and heart disease, which are often linked to sugar overconsumption, have been on a major upswing over the last few decades, especially in the United States. According to the CDC, 100 million adults in the United States are now living with diabetes or prediabetes, a condition that, if left untreated, often progresses to type 2 diabetes within five years. By 2030, scientists predict, cases of diabetes in the U.S. will increase by 54 percent. If sweet proteins like the ones Amai is creating become widely available, these numbers could begin to decrease.
Next Up
Amai's sweet proteins are still in the research and development stage, but the Israeli startup is raising significant funding that should help expedite the process. They're also substantially increasing their production capacity by expanding their facilities.
And in March, the USDA and FDA announced plans to regulate cell-cultured foods, the category into which these sweet proteins would fall, so Amai's researchers are hopeful they'll have an easier path to approval once their product is market-ready.
Open Questions
All this progress may sound promising, but Amai still has a long way to go before healthy cake becomes a reality. Some questions to consider: Will consumers be comfortable ingesting a lab-designed food product? Will it taste enough like real sugar?
And if some products and brands begin to adopt it, will it ever overtake the real thing in popularity and make a dent in diseases like diabetes and obesity? Only time, more research, and a lot more money will tell, but in the meantime, feel free to daydream about eating entire pints of ice cream without needing to hit the gym.