To Make Science Engaging, We Need a Sesame Street for Adults
This article is part of the magazine, "The Future of Science In America: The Election Issue," co-published by LeapsMag, the Aspen Institute Science & Society Program, and GOOD.
In the mid-1960s, a documentary producer in New York City wondered if the addictive jingles, clever visuals, slogans, and repetition of television ads—the ones that were captivating young children of the time—could be harnessed for good. Over the course of three months, she interviewed educators, psychologists, and artists, and the result was a bonanza of ideas.
Perhaps a new TV show could teach children letters and numbers in short animated sequences? Perhaps adults and children could read together with puppets providing comic relief and prompting interaction from the audience? And because it would be broadcast through a device already in almost every home, perhaps this show could reach across socioeconomic divides and close an early education gap?
Soon after Joan Ganz Cooney shared her landmark report, "The Potential Uses of Television in Preschool Education," in 1966, she was prototyping show ideas, attracting funding from The Carnegie Corporation, The Ford Foundation, and The Corporation for Public Broadcasting, and co-founding the Children's Television Workshop with psychologist Lloyd Morrisett. And then, on November 10, 1969, informal learning was transformed forever with the premiere of Sesame Street on public television.
For its first season, Sesame Street won three Emmy Awards and a Peabody Award. Its star, Big Bird, landed on the cover of Time Magazine, which called the show "TV's gift to children." Fifty years later, it's hard to imagine an approach to informal preschool learning that isn't Sesame Street.
And that approach can be boiled down to one word: Entertainment.
Despite decades of evidence from Sesame Street—one of the most studied television shows of all time—and more research from social science, psychology, and media communications, we haven't yet taken Ganz Cooney's concepts to heart in educating adults. Adults have news programs and documentaries and educational YouTube channels, but no Sesame Street. So why not build one? Here's how we can design a new kind of television to make science engaging and accessible for a public that is all too often intimidated by it.
We have to start from the realization that America is a nation of high-school graduates. By the end of high school, many students have decided to abandon science because they think it's too difficult, and as a nation, we've made it acceptable for any one of us to say "I'm not good at science" and offload thinking to the ones who might be. So is it surprising that so many Americans believe in conspiracy theories: the 25% who believe the release of COVID-19 was planned, the one in ten who believe the Moon landing was a hoax, or the 30–40% who think the condensation trails of planes are actually nefarious chemtrails? If we're meeting people where they are, the aim can't be to get the audience from an A to an A+, but from an F to a D, and without judging where they start from.
There's also a natural compulsion for a well-meaning educator to fill a literacy gap with a barrage of information, but this is what I call "factsplaining," and we know it doesn't work. Worse, it can backfire. In one study from 2014, parents were given factual information about vaccine safety, and the group that was already most averse to vaccines became even more averse.
Why? Our social identities and cognitive biases are stubborn gatekeepers when it comes to processing new information. We filter ideas through pre-existing beliefs—our values, our religions, our political ideologies. Incongruent ideas are rejected. Congruent ideas, no matter how absurd, are allowed through. We hear what we want to hear, and then our brains justify the input by creating narratives that preserve our identities. Even when we have all the facts, we can use them to support any worldview.
But social science has revealed many mechanisms for hijacking these processes through narrative storytelling, and this can form the foundation of a new kind of educational television.
As media creators, we can reject factsplaining and instead construct entertaining narratives that disrupt these cognitive processes. Two decades of research tell us that when people are immersed in entertaining fictional narratives, they lower their defenses, opening a path for new information to land, attitudes to shift, and new behaviors to take hold. Where news about hot-button issues like climate change or vaccination might trigger resistance or a backfire effect, fiction can be crafted to be absorbing and, as a result, persuasive.
But the narratives can't be stuffed with information. They must be simplified. If this feels like the opposite of what an educator should be doing, there is a way to reduce the complexity of information without oversimplifying it: "exemplification," a framing device that tells the stories of individuals in specific circumstances, stories that speak to the greater issue without needing to explain it all. It's a technique you've seen used in biopics. The Discovery Channel true-crime miniseries Manhunt: Unabomber does many things well from a science storytelling perspective, including exemplifying the virtues of the scientific method through a character who argues for a new field of science, forensic linguistics, to catch one of the most notorious domestic terrorists in U.S. history.
We must also appeal to the audience's curiosity. We know curiosity is such a strong driver of human behavior that it can even counteract the biases put up by one's political ideology around subjects like climate change. If we treat science information like a product—and we should—advertising research tells us we can maximize curiosity through a Goldilocks effect. If the information is too complex, your show might as well be a PowerPoint presentation. If it's too simple, it's Sesame Street. There's a sweet spot for creating intrigue about new information when there's a moderate cognitive gap.
The science of "identification" tells us that the more the main character is endearing to a viewer, the more likely the viewer will adopt the character's worldview and journey of change. This insight further provides incentives to craft characters reflective of our audiences. If we accept our biases for what they are, we can understand why the messenger becomes more important than the message, because, without an appropriate messenger, the message becomes faint and ineffective. And research confirms that the stereotype-busting doctor-skeptic Dana Scully of The X-Files, a popular science-fiction series, was an inspiration for a generation of women who pursued science careers.
With these directions, we can start making a new kind of television. But is television itself still the right delivery medium? Americans do spend six hours per day—a quarter of their lives—watching video. And even with the rise of social media and apps, science-themed television shows remain popular, with four out of five adults reporting that they watch shows about science at least sometimes. CBS's The Big Bang Theory was the most-watched show on television in the 2017–2018 season, and Cartoon Network's Rick & Morty is the most popular comedy series among millennials. And medical and forensic dramas continue to be broadcast staples. So yes, it's as true today as it was in the 1980s when George Gerbner, the "cultivation theory" researcher who studied the long-term impacts of television images, wrote, "a single episode on primetime television can reach more people than all science and technology promotional efforts put together."
We know from cultivation theory that media images can shape our views of scientists. Quick, picture a scientist! Was it an old, white man with wild hair in a lab coat? If most Americans don't encounter research science firsthand, it's media that dictates how we perceive science and scientists. Characters like Sheldon Cooper and Rick Sanchez become the model. But we can correct that by representing professionals more accurately on-screen and writing characters more like Dana Scully.
Could new television series establish the baseline narratives for novel science like gene editing, quantum computing, or artificial intelligence? Or could new series counter the misinfodemics surrounding COVID-19 and vaccines through more compelling, corrective narratives? Social science has given us a blueprint suggesting they could. Binge-watching a show like the surreal NBC sitcom The Good Place doesn't replace a Ph.D. in philosophy, but its use of humor plants the seed of continued interest in a new subject. The goal of persuasive entertainment isn't to replace formal education, but it can inspire, shift attitudes, increase confidence in the knowledge of complex issues, and otherwise prime viewers for continued learning.
Sarah Mancoll was 22 years old when she noticed a bald spot on the back of her head. A dermatologist confirmed that it was alopecia areata, an autoimmune disorder that causes hair loss.
She successfully treated the condition with corticosteroid shots for nearly 10 years. Then Mancoll and her husband began thinking about starting a family. Would the shots be safe for her while pregnant? For the fetus? What about breastfeeding?
Mancoll consulted her primary care physician, her dermatologist, even a pediatrician. Without clinical data, no one could give her a definitive answer, so she stopped treatment to be "on the safe side." By the time her son was born, she'd lost at least half her hair. She returned to her Washington, D.C., public policy job two months later entirely bald—and without either eyebrows or eyelashes.
After having two more children in quick succession, Mancoll recently resumed the shots but didn't forget her experience. Today, she is an advocate for including more pregnant and lactating women in clinical studies so they can have more information about therapies than she did.
"I live a very privileged life, and I'll do just fine with or without hair, but it's not just about me," Mancoll said. "It's about a huge population of women who are being disenfranchised…They're invisible."
About 4 million women give birth each year in the United States, and many face medical conditions, from hypertension and diabetes to psychiatric disorders. A 2011 study of data from 1976 to 2008 found that most women reported taking at least one medication while pregnant. But for decades, pregnant and lactating women have been largely excluded from the clinical drug studies that rigorously test medications for safety and effectiveness.
An estimated 98 percent of government-approved drug treatments between 2000 and 2010 had insufficient data to determine risk to the fetus, and close to 75 percent had no human pregnancy data at all. All told, of 213 new pharmaceuticals approved from 2003 to 2012, only five percent included any data from pregnant women.
But recent developments suggest that could be changing. Amid widespread concerns about increased maternal mortality rates, women's health advocates, physicians, and researchers are sensing and encouraging a cultural shift toward protecting women through responsible research instead of from research.
"The question is not whether to do research with pregnant women, but how," Anne Drapkin Lyerly, professor and associate director of the Center for Bioethics at the University of North Carolina at Chapel Hill, wrote last year in an op-ed. "These advances are essential. It is well past time—and it is morally imperative—for research to benefit pregnant women."
"In excluding pregnant women from drug trials to protect them from experimentation, we subject them to uncontrolled experimentation."
To that end, the American College of Obstetricians and Gynecologists' Committee on Ethics acknowledged that research trials need to be better designed so they don't "inappropriately constrain the reproductive choices of study participants or unnecessarily exclude pregnant women." A federal task force also called for significantly expanded research and the removal of regulatory barriers that make it difficult for pregnant and lactating women to participate in research.
Several months ago, a government change to a regulation known as the Common Rule took effect, removing pregnant women as a "vulnerable population" in need of special protections, a designation that had made it more difficult to enroll them in clinical drug studies. And just last week, the U.S. Food and Drug Administration (FDA) issued new draft guidances for industry on when and how to include pregnant and lactating women in clinical trials.
Including pregnant women in research is better than having no data about their treatment, said Catherine Spong, former chair of the federal task force.
"It's a paradox," said Spong, professor of obstetrics and gynecology and chief of maternal fetal medicine at University of Texas Southwestern Medical Center. "There is a desire to protect women and fetuses from harm, which is translated to a reluctance to include them in research. By excluding them, the evidence for their care is limited."
Jacqueline Wolf, a professor of the history of medicine at Ohio University, agreed.
"In excluding pregnant women from drug trials to protect them from experimentation, we subject them to uncontrolled experimentation," she said. "We give them the medication without doing any research, and that's dangerous."
Women, of course, don't stop getting sick or having chronic medical conditions just because they are pregnant or breastfeeding, and conditions during pregnancy can affect a baby's health later in life. Evidence from rigorous research is important for other reasons, too.
Pregnancy can dramatically change a woman's physiology, affecting how drugs act on her body and how her body acts on, or reacts to, drugs. For instance, pregnant bodies can clear some medications more quickly, among them glyburide, which is used to stabilize the high blood-sugar levels of diabetes in pregnancy, levels that can be toxic to the fetus and harmful to the woman. That means a regular dose of the drug may not be enough to control blood sugar and prevent poor outcomes.
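To make the clearance point concrete, here is a minimal, hypothetical sketch of the standard steady-state pharmacokinetic relationship (average drug exposure is roughly the dosing rate divided by clearance). It illustrates the general logic only; it is not glyburide data, and every number below is an invented placeholder.

```python
# Illustrative only: a generic steady-state approximation, not clinical guidance.
# All values are invented placeholders, not glyburide pharmacology.

def average_steady_state_concentration(dose_mg, interval_h, clearance_l_per_h):
    """C_ss,avg = dosing rate / clearance (standard steady-state relationship)."""
    return (dose_mg / interval_h) / clearance_l_per_h

dose_mg = 5.0        # hypothetical fixed dose
interval_h = 24.0    # once-daily dosing
baseline_cl = 3.0    # hypothetical clearance outside pregnancy (L/h)
pregnant_cl = 4.5    # hypothetical 50% higher clearance during pregnancy (L/h)

c_baseline = average_steady_state_concentration(dose_mg, interval_h, baseline_cl)
c_pregnant = average_steady_state_concentration(dose_mg, interval_h, pregnant_cl)

print(f"Exposure outside pregnancy: {c_baseline:.3f} mg/L")
print(f"Exposure during pregnancy:  {c_pregnant:.3f} mg/L")
print(f"Relative drop:              {1 - c_pregnant / c_baseline:.0%}")
```

In this toy scenario, the same fixed dose yields roughly a third less drug exposure during pregnancy, which is the shape of the problem clinicians face when no pregnancy-specific dosing data exist.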
Pregnant patients also may be reluctant to take needed drugs for underlying conditions (and doctors may be hesitant to prescribe them), which in turn can cause more harm to the woman and fetus than treatment would. For example, women who have severe asthma attacks while pregnant are at higher risk of having low-birthweight babies, and pregnant women with uncontrolled diabetes in early pregnancy have more than four times the risk of birth defects.
For Kate O'Brien, taking medication during her pregnancy was a matter of life and death. A freelance video producer who lives in New Jersey, O'Brien was diagnosed with tuberculosis in 2015 after she became pregnant with her second child, a boy. Even as she signed hospital consent forms, she had no idea if the treatment would harm him.
"It's a really awful experience," said O'Brien, who now is active with We are TB, an advocacy and support network. "All they had to tell me about the medication was just that women have been taking it for a really long time all over the world. That was the best they could do."
More and more doctors, researchers, women's health organizations, and advocates are calling that unacceptable.
By indicating that filling current knowledge gaps is "a critical public health need," the FDA is signaling its support for advancing research with pregnant women, said Lyerly, also co-founder of the Second Wave Initiative, which promotes fair representation of the health interests of pregnant women in biomedical research and policies. "It's a very important shift."
Research with pregnant women can be done ethically, Lyerly said, whether by systematically collecting data from those already taking medications or enrolling pregnant women in studies of drugs or vaccines in development.
Current clinical trials involving pregnant women are assessing treatments for obstructive sleep apnea, postpartum hemorrhage, lupus, and diabetes. Notable trials in development target malaria and HIV prevention in pregnancy.
"It clearly is doable to do this research, and test trials are important to provide evidence for treatment," Spong said. "If we don't have that evidence, we aren't making the best educated decisions for women."
The news last November that a rogue Chinese scientist had genetically altered the embryos of a pair of Chinese twins shocked the world. But although this use of advanced technology to change the human gene pool was premature, it was a harbinger of how genetic science will alter our healthcare, the way we make babies, the nature of the babies we make, and, ultimately, our sense of who and what we are as a species.
But while the genetics revolution has already begun, we aren't prepared to handle these Promethean technologies responsibly.
By identifying the structure of DNA in the 1950s, Watson, Crick, Wilkins, and Franklin showed that the book of life was written in the DNA double helix. When the human genome project was completed in 2003, we saw how this book of human life could be transcribed. Painstaking research paired with advanced computational algorithms then showed what increasing numbers of genes do and how the genetic book of life can be read.
Now, with the advent of precision gene-editing tools like CRISPR, we are seeing that the book of life, and all of biology with it, can be rewritten. Biology is being recognized as another form of readable, writable, and hackable information technology, with us humans as the coders.
The impact of this transformation is being felt first in our healthcare. Gene therapies, including those that extract a person's own cells, re-engineer them into cancer-fighting supercells, and reintroduce them, are already performing miracles in clinical trials. Thousands of applications have already been submitted to regulators across the globe for trials using gene therapies to address a host of other diseases.
Recently, the first gene editing of cells inside a person's body was deployed to treat Hunter syndrome, a relatively simple genetic metabolic disorder, with many more applications to come. These new approaches are only the first steps in our shift from today's generalized medicine, based on population averages, to precision medicine, based on each patient's individual biology, and on to predictive medicine, based on AI-generated estimates of a person's future health.
Jamie Metzl's groundbreaking new book, Hacking Darwin: Genetic Engineering and the Future of Humanity, explores how the genetic revolution is transforming our healthcare, the way we make babies, and the nature of the babies we make; what this means for each of us; and what we must all do now to prepare for what's coming.
This shift in our healthcare will ensure that millions and then billions of people will have their genomes sequenced as the foundation of their treatment. Big data analytics will then be used to compare at scale people's genotypes (what their genes say) to their phenotypes (how those genes are expressed over the course of their lives).
These massive datasets of genetic and life information will then make it possible to go far beyond the simple genetic analysis of today and to understand far more complex human diseases and traits influenced by hundreds or thousands of genes. Our understanding of this complex genetic system within the vaster ecosystem of our bodies and the environment around us will transform healthcare for the better and help us cure terrible diseases that have plagued our ancestors for millennia.
But as revolutionary as this change will be for medicine, the healthcare applications of the genetics revolution are merely stations along the way to the ultimate destination: a deep and fundamental transformation of our evolutionary trajectory as a species.
A first inkling of where we are heading can be seen in the direct-to-consumer genetic testing industry. Many people around the world have now sent their cheek swabs to companies like 23andMe for analysis. The information that comes back can tell people a lot about relatively simple genetic traits like carrier status for single-gene mutation diseases, eye color, or whether they hate the taste of cilantro, but the information some of these companies now share about complex traits like athletic predisposition, intelligence, or personality style is wildly misleading.
This will not always be the case. As the genetic and health data pools grow, analysis of large numbers of sequenced genomes will make it possible to apply big data analytics to predict some very complex genetic disease risks and the genetic components of traits like height, IQ, temperament, and personality style with increasing accuracy. This process, called "polygenic scoring," is already being offered in beta stage by a few companies and will become an ever bigger part of our lives going forward.
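For readers curious about what polygenic scoring actually computes, the simplest version is an additive, weighted sum: each genetic variant a person carries is multiplied by an effect size estimated from large genotype and phenotype datasets, and the products are added up. The sketch below shows only that additive core; the variant IDs and weights are invented placeholders, not values from any real study or company product.

```python
# Minimal sketch of an additive polygenic score. Effect sizes are assumed to
# have been estimated elsewhere from large genotype-phenotype studies.
# Variant IDs and weights are invented placeholders, not real findings.

effect_sizes = {   # variant -> estimated effect per copy of the risk allele
    "rs0000001": 0.12,
    "rs0000002": -0.05,
    "rs0000003": 0.31,
}

genotype = {       # variant -> number of risk-allele copies carried (0, 1, or 2)
    "rs0000001": 2,
    "rs0000002": 1,
    "rs0000003": 0,
}

def polygenic_score(genotype, effect_sizes):
    """Additive score: sum of allele count times effect size over shared variants."""
    return sum(
        genotype[variant] * weight
        for variant, weight in effect_sizes.items()
        if variant in genotype
    )

print(f"Polygenic score: {polygenic_score(genotype, effect_sizes):.2f}")
```

Real scores aggregate hundreds of thousands to millions of variants and are calibrated against reference populations before any risk estimate is reported; the point of the sketch is simply that the prediction is statistical, built from population-scale data rather than from reading off a single gene.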
The most profound application of all this will be in our baby-making. Before making a decision about which of the fertilized eggs to implant, women undergoing in vitro fertilization can today elect to have a small number of cells extracted from their pre-implanted embryos and sequenced. With current technology, this can be used to screen for single-gene mutation diseases and other relatively simple disorders. Polygenic scoring, however, will soon make it possible to screen these early stage pre-implanted embryos to assess their risk of complex genetic diseases and even to make predictions about the heritable parts of complex human traits. The most intimate elements of being human will start feeling like high-pressure choices needing to be made by parents.
Adult stem cell technologies will then likely make it possible to generate hundreds or thousands of a woman's own eggs from her blood sample or skin graft. This would blow open the doors of reproductive possibility and allow parents to choose embryos with exceptional potential capabilities from a much larger set of options.
The complexity of human biology will place some limits on the extent of the gene edits that might be made to these embryos, but all of biology, including our own, is extremely flexible. How else could all the diversity of life have emerged from a single cell nearly four billion years ago? The limit of our imagination will become the most significant barrier to our recasting biology.
But while we humans are gaining the powers of the gods, we aren't at all ready to use them.
The same tools that will help cure our worst afflictions, save our children, and help us live longer, healthier, more robust lives will also open the door to potential abuses. Prospective parents with the best of intentions, or governments with lax regulatory structures or aggressive ideas about using population-wide genetic engineering to enhance national competitiveness or achieve some other goal, could propel us into a genetic arms race, one that could undermine our essential diversity, dangerously divide societies, spark destabilizing and potentially even deadly conflicts between us, and threaten our very humanity.
But while the advance of genetic technologies is inevitable, how it plays out is anything but. If we don't want the genetic revolution to undermine our species or lead to grave conflicts between genetic haves and have-nots, or between societies opting in and those opting out, now is the time to make smart decisions based on our individual and collective best values. Although the technology driving the genetic revolution is new, the value systems we will need to optimize the benefits and minimize the harms of this massive transformation are ones we have been developing for thousands of years.
And while some very smart and well-intentioned scientists have been meeting to explore what comes next, it won't be enough for a few of even our wisest prophets to make decisions about the future of our species that will impact everyone. We'll also need smart regulations on both the national and international levels.
Every country will need to have its own regulatory guidelines for human genetic engineering based on both international best practices and the country's unique traditions and values. Because we are all one species, however, we will also ultimately need to develop guidelines that can apply to all of us.
As a first step toward making this possible, we must urgently launch a global, species-wide education effort and inclusive dialogue on the future of human genetic engineering that can eventually inform global norms that will need to underpin international regulations. This process will not be easy, but the alternative of an unregulated genetic arms race would be far worse.
The overlapping genomics and AI revolutions may seem like distant science fiction but are closer than you think. Far sooner than most people recognize, the inherent benefits of these technologies and competition between us will spark rapid adoption. Before that spark ignites, we have a brief moment to come together as a species like we never have before to articulate and translate into action the future we jointly envision. The north star of our best shared values can help us navigate the almost unimaginable opportunities and very real challenges that lie ahead.