To Make Science Engaging, We Need a Sesame Street for Adults
This article is part of the magazine, "The Future of Science In America: The Election Issue," co-published by LeapsMag, the Aspen Institute Science & Society Program, and GOOD.
In the mid-1960s, a documentary producer in New York City wondered if the addictive jingles, clever visuals, slogans, and repetition of television ads—the ones that were captivating young children of the time—could be harnessed for good. Over the course of three months, she interviewed educators, psychologists, and artists, and the result was a bonanza of ideas.
Perhaps a new TV show could teach children letters and numbers in short animated sequences? Perhaps adults and children could read together with puppets providing comic relief and prompting interaction from the audience? And because it would be broadcast through a device already in almost every home, perhaps this show could reach across socioeconomic divides and close an early education gap?
Soon after Joan Ganz Cooney shared her landmark report, "The Potential Uses of Television in Preschool Education," in 1966, she was prototyping show ideas, attracting funding from The Carnegie Corporation, The Ford Foundation, and The Corporation for Public Broadcasting, and co-founding the Children's Television Workshop with psychologist Lloyd Morrisett. And then, on November 10, 1969, informal learning was transformed forever with the premiere of Sesame Street on public television.
For its first season, Sesame Street won three Emmy Awards and a Peabody Award. Its star, Big Bird, landed on the cover of Time Magazine, which called the show "TV's gift to children." Fifty years later, it's hard to imagine an approach to informal preschool learning that isn't modeled on Sesame Street.
And that approach can be boiled down to one word: Entertainment.
Despite decades of evidence from Sesame Street—one of the most studied television shows of all time—and more research from social science, psychology, and media communications, we haven't yet taken Ganz Cooney's concepts to heart in educating adults. Adults have news programs and documentaries and educational YouTube channels, but no Sesame Street. So why not make one? Here's how we can design a new kind of television to make science engaging and accessible for a public that is all too often intimidated by it.
We have to start from the realization that America is a nation of high-school graduates. By the end of high school, many students have decided to abandon science because they think it's too difficult, and as a nation, we've made it acceptable for any one of us to say "I'm not good at science" and offload thinking to the ones who might be. So is it surprising that so many Americans believe in conspiracy theories, like the 25% who believe the release of COVID-19 was planned, the one in ten who believe the Moon landing was a hoax, or the 30–40% who think the condensation trails of planes are actually nefarious chemtrails? If we're meeting people where they are, the aim can't be to get the audience from an A to an A+, but from an F to a D, and without judgment of where they are starting from.
There's also a natural compulsion for a well-meaning educator to fill a literacy gap with a barrage of information, but this is what I call "factsplaining," and we know it doesn't work. Worse, it can backfire. In one 2014 study, parents were provided with factual information about vaccine safety, and the group that was already most averse to vaccines became even more averse.
Why? Our social identities and cognitive biases are stubborn gatekeepers when it comes to processing new information. We filter ideas through pre-existing beliefs—our values, our religions, our political ideologies. Incongruent ideas are rejected. Congruent ideas, no matter how absurd, are allowed through. We hear what we want to hear, and then our brains justify the input by creating narratives that preserve our identities. Even when we have all the facts, we can use them to support any worldview.
But social science has revealed many mechanisms for hijacking these processes through narrative storytelling, and this can form the foundation of a new kind of educational television.
Could new television series establish the baseline narratives for novel science like gene editing, quantum computing, or artificial intelligence?
As media creators, we can reject factsplaining and instead construct entertaining narratives that disrupt these cognitive processes. Two decades of research tell us that when people are immersed in entertaining fictional narratives, they loosen their defenses, opening a path for new information, shifting attitudes, and inspiring new behavior. Where news about hot-button issues like climate change or vaccination might trigger resistance or a backfire effect, fiction can be crafted to be absorbing and, as a result, persuasive.
But the narratives can't be stuffed with information. They must be simplified. If this feels like the opposite of what an educator should be doing, consider "exemplification," a framing device that reduces the complexity of information, without oversimplifying it, by telling the stories of individuals in specific circumstances that speak to the greater issue without needing to explain it all. It's a technique you've seen used in biopics. The Discovery Channel true-crime miniseries Manhunt: Unabomber does many things well from a science storytelling perspective, including exemplifying the virtues of the scientific method through a character who argues for a new field of science, forensic linguistics, to catch one of the most notorious domestic terrorists in U.S. history.
We must also appeal to the audience's curiosity. We know curiosity is such a strong driver of human behavior that it can even counteract the biases put up by one's political ideology around subjects like climate change. If we treat science information like a product—and we should—advertising research tells us we can maximize curiosity through a Goldilocks effect. If the information is too complex, your show might as well be a PowerPoint presentation. If it's too simple, it's Sesame Street. There's a sweet spot for creating intrigue about new information when there's a moderate cognitive gap.
The science of "identification" tells us that the more endearing a main character is to a viewer, the more likely the viewer is to adopt the character's worldview and journey of change. This insight gives us further incentive to craft characters reflective of our audiences. If we accept our biases for what they are, we can understand why the messenger becomes more important than the message: without an appropriate messenger, the message becomes faint and ineffective. And research confirms that the stereotype-busting doctor-skeptic Dana Scully of The X-Files, a popular science-fiction series, was an inspiration for a generation of women who pursued science careers.
With these directions, we can start making a new kind of television. But is television itself still the right delivery medium? Americans do spend six hours per day—a quarter of their lives—watching video. And even with the rise of social media and apps, science-themed television shows remain popular, with four out of five adults reporting that they watch shows about science at least sometimes. CBS's The Big Bang Theory was the most-watched show on television in the 2017–2018 season, and Cartoon Network's Rick & Morty is the most popular comedy series among millennials. And medical and forensic dramas continue to be broadcast staples. So yes, it's as true today as it was in the 1980s when George Gerbner, the "cultivation theory" researcher who studied the long-term impacts of television images, wrote, "a single episode on primetime television can reach more people than all science and technology promotional efforts put together."
We know from cultivation theory that media images can shape our views of scientists. Quick, picture a scientist! Was it an old, white man with wild hair in a lab coat? If most Americans don't encounter research science firsthand, it's media that dictates how we perceive science and scientists. Characters like Sheldon Cooper and Rick Sanchez become the model. But we can correct that by representing professionals more accurately on-screen and writing characters more like Dana Scully.
Could new television series establish the baseline narratives for novel science like gene editing, quantum computing, or artificial intelligence? Or could new series counter the misinfodemics surrounding COVID-19 and vaccines through more compelling, corrective narratives? Social science has given us a blueprint suggesting they could. Binge-watching a show like the surreal NBC sitcom The Good Place doesn't replace a Ph.D. in philosophy, but its use of humor plants the seed of continued interest in a new subject. The goal of persuasive entertainment isn't to replace formal education, but it can inspire, shift attitudes, increase confidence in the knowledge of complex issues, and otherwise prime viewers for continued learning.
In different countries' national dietary guidelines, red meats (beef, pork, and lamb) are often confined to a very small corner. Swedish officials, for example, advise the population to "eat less red and processed meat". Experts in Greece recommend consuming no more than four servings of red meat — not per week, but per month.
"Humans 100% rely on the microbes to digest this food."
Yet somehow, the matter is far from settled. Quibbles over the scientific evidence emerge on a regular basis — as in a recent BMJ article titled, "No need to cut red meat, say new guidelines." News headlines lately have declared that limiting red meat may be "bad advice," while carnivore diet enthusiasts boast about the weight loss and good health they've achieved on an all-meat diet. The wildly successful plant-based burgers? To them, a gimmick. The burger wars are on.
Nutrition science would seem the best place to look for answers on the health effects of specific foods. And on one hand, the science is rather clear: in large populations, people who eat more red meat tend to have more health problems, including cardiovascular disease, colorectal cancer, and other conditions. But this sort of correlational evidence fails to settle the matter once and for all; many who look closely at these studies cite methodological shortcomings and a low certainty of evidence.
Some scientists, meanwhile, are trying to cut through the noise by increasing their focus on the mechanisms: exactly how red meat is digested and the step-by-step of how this affects human health. And curiously, as these lines of evidence emerge, several of them center around gut microbes as active participants in red meat's ultimate effects on human health.
Dr. Stanley Hazen, researcher and medical director of preventive cardiology at Cleveland Clinic, was one of the first to zero in on gut microorganisms as possible contributors to the health effects of red meat. In looking for chemical compounds in the blood that could predict the future development of cardiovascular disease, his lab identified a molecule called trimethylamine-N-oxide (TMAO). Little by little, he and his colleagues began to gather both human and animal evidence that TMAO played a role in causing heart disease.
Naturally, they tried to figure out where the TMAO came from. Hazen says, "We found that animal products, and especially red meat, were a dietary source that, [along with] gut microbes, would generate this product that leads to heart disease development." They observed that the gut microbes were essential for making TMAO from dietary sources (like red meat) that contained its precursor, trimethylamine (TMA).
So in linking red meat to cardiovascular disease through TMAO, the surprising conclusion, says Hazen, was that, "Without a doubt, [the microbes] are the most important aspect of the whole pathway."
"I think it's just a matter of time [before] we will have therapeutic interventions that actually target our gut microbes, just like the way we take drugs that lower cholesterol levels."
Other researchers have taken an interest in different red-meat-associated health problems, like colorectal cancer and the inflammation that accompanies it. This was the mechanistic link tackled by the lab of professor Karsten Zengler of the UC San Diego Departments of Pediatrics and Bioengineering—and it also led straight back to the gut microbes.
Zengler and colleagues recently published a paper in Nature Microbiology that focused on the effects of a red meat carbohydrate (or sugar) called Neu5Gc.
He explains, "If you eat animal proteins in your diet… the bound sugars in your diet are cleaved off in your gut and they get recycled. Your own cells will not recognize between the foreign sugars and your own sugars, because they look almost identical." The unsuspecting human cells then take up these foreign sugars — spurring antibody production and creating inflammation.
Zengler showed, however, that gut bacteria use enzymes to cleave off the sugar during digestion, stopping the inflammation and rendering the sugar harmless. "There's no enzyme in the human body that can cleave this [sugar] off. Humans 100% rely on the microbes to digest this food," he says.
Both researchers are quick to caution that the health effects of diet are complex. Other work indicates, for example, that while intake of red meat can affect TMAO levels, so can intake of fish and seafood. But these new lines of evidence could help explain why some people, ironically, seem to be in perfect health despite eating a lot of red meat: their ideal frequency of meat consumption may depend on their existing community of gut microbes.
"It helps explain what accounts for inter-person variability," Hazen says.
These emerging mechanisms reinforce why it's prudent to limit red meat, just as the nutritional guidelines advised in the first place. But both Hazen and Zengler predict that interventions to buffer the effects of too many ribeyes may be just around the corner.
Zengler says, "Our idea is that you basically can help your own digestive system detoxify these inflammatory compounds in meat, if you continue eating red meat or you want to eat a high amount of red meat." A possible strategy, he says, is to use specific pre- or probiotics to cultivate an inflammation-reducing gut microbial community.
Hazen foresees the emergence of drugs that act not on the human, but on the human's gut microorganisms. "I think it's just a matter of time [before] we will have therapeutic interventions that actually target our gut microbes, just like the way we take drugs that lower cholesterol levels."
He adds, "It's a matter of 'stay tuned', I think."
New Device Can Detect Peanut Allergens on a Plate in 30 Seconds
People with life-threatening allergies live in constant fear of coming into contact with deadly allergens. Researchers estimate that about 32 million Americans have food allergies, with the most common triggers being milk, egg, peanut, tree nuts, wheat, soy, fish, and shellfish.
"It is important to understand that just several years ago, this would not have been possible."
Every three minutes, a food allergy reaction sends someone to the emergency room, and 200,000 people in the U.S. require emergency medical care each year for allergic reactions, according to Food Allergy Research and Education.
But what if there was a way you could easily detect if something you were about to eat contains any harmful allergens? Thanks to Israeli scientists, this will soon be the case — at least for peanuts. The team has been working to develop a handheld device called Allerguard, which analyzes the vapors in your meal and can detect allergens in 30 seconds.
Leapsmag spoke with the founder and CTO of Allerguard, Guy Ayal, about the groundbreaking technology, how it works, and when it will be available to purchase.
What prompted you to create this device? Do you have a personal connection with severe food allergies?
Guy Ayal: My eldest daughter's best friend suffers from a severe food allergy, and I experienced first-hand the effect it has on the person and their immediate surroundings. Most notable for me was the effect on the quality of life – the experience of living in constant fear. Everything we do at Allerguard is basically to alleviate some of that fear.
How exactly does the device work?
The device is built on two main pillars. The first is the nano-chemical stage, in which we developed specially attuned nanoparticles that selectively adhere only to the specific molecules that we are looking for. Those molecules, once bound to the nanoparticles, induce a change in their electrical behavior, which is measured and analyzed by the second main pillar: highly advanced machine learning algorithms, which can surmise which molecules were collected, and thus whether or not peanuts (or, in the future, other allergens) were detected.
It is important to understand that just several years ago, this would not have been possible, because both the nano-chemistry, and especially the entire world of machine learning, big data, and what is commonly known as AI only started to exist in the '90s, and reached applicability for handheld devices only in the past few years.
Where are you at in the development process and when will the device be available to consumers?
We have concluded the proof of concept and proof of capability phase, when we demonstrated successful detection of the minimal known clinical amount that may cause the slightest effect in the most severely allergic person – less than 1 mg of peanut (actually it is 0.7 mg). Over the next 18 months will be productization, qualification, and validation of our device, which should be ready to market in the latter half of 2021. The sensor will be available in the U.S., and after a year in Europe and Canada.
The Allerguard was made possible through recent advances in machine learning, big data, and AI.
How much will it cost?
Our target price is about $200 for the device, with a disposable SenseCard that will run for at least a full day and cost about $1. That card is for a specific allergen and will work for multiple scans in a day, not just one time.
[At a later stage, the company will have sensors for other allergens like tree nuts, eggs, and milk, and they'll develop a multi-SenseCard that works for a few allergens at once.]
Are there any other devices on the market that do something similar to Allerguard?
No other devices are even close to supplying the level of service that we promise. All known methods for allergen detection rely on sampling the food, which is a viable solution for homogeneous foodstuffs, such as a factory testing its raw ingredients, but not for something as heterogeneous as an actual dish – especially not for solid allergens such as peanuts, tree nuts, or sesame.
If there is a single peanut on your plate, and you sample from anywhere on that plate other than where that peanut is located, you will find that your sample is perfectly clean – because it is. But the dish is not. That dish is a death trap for an allergic person. Allerguard is the only proposed solution that could indeed detect that peanut, no matter where on that plate it is hiding.
Anything else readers should know?
Our first-generation product will be for peanuts only. You have to understand, we are still a start-up company, and if we don't concentrate our limited resources to one specific goal, we will not be able to achieve anything at all. Once we are ready to market our first device, the peanut detector, we will be able to start the R&D for the 2nd product, which will be for another allergen – most likely tree nuts and/or sesame, but that will probably be in debate until we actually start it.