To Make Science Engaging, We Need a Sesame Street for Adults
This article is part of the magazine, "The Future of Science In America: The Election Issue," co-published by LeapsMag, the Aspen Institute Science & Society Program, and GOOD.
In the mid-1960s, a documentary producer in New York City wondered if the addictive jingles, clever visuals, slogans, and repetition of television ads—the ones that were captivating young children of the time—could be harnessed for good. Over the course of three months, she interviewed educators, psychologists, and artists, and the result was a bonanza of ideas.
Perhaps a new TV show could teach children letters and numbers in short animated sequences? Perhaps adults and children could read together with puppets providing comic relief and prompting interaction from the audience? And because it would be broadcast through a device already in almost every home, perhaps this show could reach across socioeconomic divides and close an early education gap?
Soon after Joan Ganz Cooney shared her landmark report, "The Potential Uses of Television in Preschool Education," in 1966, she was prototyping show ideas, attracting funding from The Carnegie Corporation, The Ford Foundation, and The Corporation for Public Broadcasting, and co-founding the Children's Television Workshop with psychologist Lloyd Morrisett. And then, on November 10, 1969, informal learning was transformed forever with the premiere of Sesame Street on public television.
For its first season, Sesame Street won three Emmy Awards and a Peabody Award. Its star, Big Bird, landed on the cover of Time Magazine, which called the show "TV's gift to children." Fifty years later, it's hard to imagine an approach to informal preschool learning that isn't shaped by Sesame Street.
And that approach can be boiled down to one word: Entertainment.
Despite decades of evidence from Sesame Street—one of the most studied television shows of all time—and further research from social science, psychology, and media communications, we haven't yet taken Ganz Cooney's concepts to heart in educating adults. Adults have news programs, documentaries, and educational YouTube channels, but no Sesame Street. So why not build one? Here's how we can design a new kind of television to make science engaging and accessible for a public that is all too often intimidated by it.
We have to start from the realization that America is a nation of high-school graduates. By the end of high school, many students have decided to abandon science because they think it's too difficult, and as a nation, we've made it acceptable for any one of us to say "I'm not good at science" and offload thinking to the ones who might be. So is it surprising that so many Americans believe in conspiracy theories, like the 25% who believe the release of COVID-19 was planned, the one in ten who believe the Moon landing was a hoax, or the 30–40% who think the condensation trails of planes are actually nefarious chemtrails? If we're meeting people where they are, the aim can't be to get the audience from an A to an A+, but from an F to a D, and without judgment of where they are starting from.
There's also a natural compulsion for a well-meaning educator to fill a literacy gap with a barrage of information, but this is what I call "factsplaining," and we know it doesn't work. Worse, it can backfire. In one study from 2014, parents were provided with factual information about vaccine safety, and the group that was already most averse to vaccines became even more averse.
Why? Our social identities and cognitive biases are stubborn gatekeepers when it comes to processing new information. We filter ideas through pre-existing beliefs—our values, our religions, our political ideologies. Incongruent ideas are rejected. Congruent ideas, no matter how absurd, are allowed through. We hear what we want to hear, and then our brains justify the input by creating narratives that preserve our identities. Even when we have all the facts, we can use them to support any worldview.
But social science has revealed many mechanisms for hijacking these processes through narrative storytelling, and this can form the foundation of a new kind of educational television.
As media creators, we can reject factsplaining and instead construct entertaining narratives that disrupt these cognitive processes. Two decades of research tell us that when people are immersed in entertaining fictional narratives, they loosen their defenses, opening a path for new information, shifting attitudes, and inspiring new behavior. Where news about hot-button issues like climate change or vaccination might trigger resistance or a backfire effect, fiction can be crafted to be absorbing and, as a result, persuasive.
But the narratives can't be stuffed with information. They must be simplified. If this feels like the opposite of what an educator should be doing, consider "exemplification": a framing device that reduces the complexity of information, without oversimplifying it, by telling the stories of individuals in specific circumstances that speak to the greater issue without needing to explain it all. It's a technique you've seen used in biopics. The Discovery Channel true-crime miniseries Manhunt: Unabomber does many things well from a science-storytelling perspective, including exemplifying the virtues of the scientific method through a character who argues for a new field of science, forensic linguistics, to catch one of the most notorious domestic terrorists in U.S. history.
We must also appeal to the audience's curiosity. We know curiosity is such a strong driver of human behavior that it can even counteract the biases put up by one's political ideology around subjects like climate change. If we treat science information like a product—and we should—advertising research tells us we can maximize curiosity through a Goldilocks effect. If the information is too complex, your show might as well be a PowerPoint presentation. If it's too simple, it's Sesame Street. There's a sweet spot for creating intrigue about new information when there's a moderate cognitive gap.
The science of "identification" tells us that the more endearing the main character is to a viewer, the more likely the viewer is to adopt the character's worldview and journey of change. This insight gives us further incentive to craft characters reflective of our audiences. If we accept our biases for what they are, we can understand why the messenger becomes more important than the message: without an appropriate messenger, the message becomes faint and ineffective. And research confirms that the stereotype-busting doctor-skeptic Dana Scully of The X-Files, a popular science-fiction series, was an inspiration for a generation of women who pursued science careers.
With these directions, we can start making a new kind of television. But is television itself still the right delivery medium? Americans do spend six hours per day—a quarter of their lives—watching video. And even with the rise of social media and apps, science-themed television shows remain popular, with four out of five adults reporting that they watch shows about science at least sometimes. CBS's The Big Bang Theory was the most-watched show on television in the 2017–2018 season, and Cartoon Network's Rick & Morty is the most popular comedy series among millennials. And medical and forensic dramas continue to be broadcast staples. So yes, it's as true today as it was in the 1980s when George Gerbner, the "cultivation theory" researcher who studied the long-term impacts of television images, wrote, "a single episode on primetime television can reach more people than all science and technology promotional efforts put together."
We know from cultivation theory that media images can shape our views of scientists. Quick, picture a scientist! Was it an old, white man with wild hair in a lab coat? If most Americans don't encounter research science firsthand, it's media that dictates how we perceive science and scientists. Characters like Sheldon Cooper and Rick Sanchez become the model. But we can correct that by representing professionals more accurately on-screen and writing characters more like Dana Scully.
Could new television series establish the baseline narratives for novel science like gene editing, quantum computing, or artificial intelligence? Or could new series counter the misinfodemics surrounding COVID-19 and vaccines through more compelling, corrective narratives? Social science has given us a blueprint suggesting they could. Binge-watching a show like the surreal NBC sitcom The Good Place doesn't replace a Ph.D. in philosophy, but its use of humor plants the seed of continued interest in a new subject. The goal of persuasive entertainment isn't to replace formal education, but it can inspire, shift attitudes, increase confidence in the knowledge of complex issues, and otherwise prime viewers for continued learning.
By now you have probably heard something about CRISPR, the simple and relatively inexpensive method of precisely editing the genomes of plants, animals, and humans.
Through CRISPR and other methods of gene editing, scientists have produced crops to be more nutritious, better able to resist pests, and tolerate droughts; engineered animals ranging from fruit flies to monkeys to make them better suited for scientific study; and experimentally treated the HIV virus, Hepatitis B, and leukemia in human patients.
There are also currently FDA-approved trials to treat blindness, cancer, and sickle cell disease in humans using gene editing, and there is consensus that CRISPR's therapeutic applications will grow significantly in the coming years.
While the treatment of human disease through use of gene editing is not without its medical and ethical concerns, the avoidance of disease in embryos is far more fraught. Nonetheless, Nature reported in November that He Jiankui, a scientist in China, had edited twin embryos to disable a gene called CCR5 in hopes of avoiding transmission of HIV from their HIV-positive father.
Though there are questions about the effectiveness and necessity of this therapy, He reported that sequencing showed his embryonic gene edits were successful and that the twins were "born normal and healthy," although his claims have not been independently verified.
More recently, Denis Rebrikov, a Russian scientist, announced his plans to disable the same gene in embryos to be implanted in HIV-positive women later this year. Futuristic as it may seem, prenatal gene editing is already here.
The treatment of disease in fetuses, the liminal category of life between embryos and humans, poses the next frontier. Numerous conditions—some minor, some resulting in a lifetime of medical treatment, some incompatible with life outside of the womb—can be diagnosed through use of prenatal diagnostic testing. There is promising research suggesting doctors will soon be able to treat or mitigate at least some of them through use of fetal gene editing.
This research could soon present women carrying genetically anomalous fetuses with a third option, aside from termination or birthing a child who will likely face a challenging and uncertain medical future: whether to undergo a fetal genetic intervention.
However, genetic intervention will open the door to a host of ethical considerations, particularly with respect to the relationship between pregnant women and prenatal genetic counselors. In theory, counselors provide objective information and answer questions; they do not advise their pregnant clients whether to continue with a pregnancy, despite the risks, or to have an abortion.
In practice, though, prenatal genetic counseling is most often directive, and the nature of the counseling pregnant women receive can depend on numerous factors, including their religious and cultural beliefs, their perceived ability to handle a complicated pregnancy and subsequent birth, and their financial status. Introducing the possibility of a fetal genetic intervention will exacerbate counselor reliance upon these considerations and in some cases lead to counseling that is even more directive.
Future counselors will have to figure out under what circumstances it is even appropriate to broach the subject. Should they only discuss therapies that are FDA-approved, or should they mention experimental treatments? What about interventions that are available in Europe or Asia but banned in the United States? Or even in the best-case scenario of an FDA-approved treatment, should a counselor mention it if she knows for a fact that her client cannot possibly afford it?
Beyond the basic question of what information to share, counselors will have to confront the fact that the very notion of fixing or "editing" offspring will be repugnant to many women, and inherent in the suggestion is the stigmatization of individuals with disabilities. Prenatal genetic counselors will be on the forefront of debates surrounding which fetuses should remain as they are and which ones should be altered.
Despite these concerns, some women in the near future will face the choice of whether to abort, keep, or treat a genetically anomalous fetus in utero. Take, for example, a woman who learns during prenatal testing that her fetus has Angelman syndrome, a genetic disorder characterized by intellectual disability, speech impairment, loss of muscle control, epilepsy, and a small head. There is currently no human treatment for Angelman syndrome, which is caused by a loss of function in a single gene, UBE3A.
But scientists at the University of North Carolina have been able to treat Angelman syndrome in fetal mice by reactivating UBE3A through use of a single injection. The therapy has also proven effective in cultured human brain cells. This suggests that a woman might soon have to consider injecting her fetus's brain with a CRISPR concoction custom-designed to target UBE3A, rather than terminate her pregnancy or bring her fetus to term unaltered.
Assuming she receives adequate information to make an informed choice, she too will face an ethical conundrum. There will be the inherent risks of injecting anything into a developing fetus's brain, including the possibility of infection, brain damage, and miscarriage. But there are also risks specific to gene editing, such as so-called off-target effects: the possibility of altering genes other than the intended one. Such effects are highly unpredictable and can be difficult to detect. It is likewise impossible to predict how altering UBE3A might lead to other genetic and epigenetic changes once the baby is born.
A woman deciding how to act in this scenario must balance these risks against the potential benefits of the therapy, layered on top of her belief system, resources, and personal ethics. The calculus will be different for every woman, and even the same woman might change her mind from one pregnancy to the next based on the severity of the condition diagnosed and other available medical options.
Her genetic counselor, meanwhile, must be sensitive to all of these concerns in helping her make her decision, keeping up to date on the possible new treatments, and carefully choosing which information to disclose in striving to be neutral. There are no easy answers to the many questions that will arise in this space, but better to start thinking about them now, before it is too late.
Agriculture in the 21st century is not as simple as it once was. With a population seven billion strong, a climate in crisis, and sustainability in farming practices on everyone's radar, figuring out how to feed the masses without destroying the Earth is a pressing concern.
In addition to low-emission cows and drone pollinators, there's a promising new solution on the table. How does "lab-grown insect meat" grab you?
Writing in Frontiers in Sustainable Food Systems, researchers at Tufts University say insects that are fed plants and genetically modified for maximum growth, nutrition, and flavor could be the best, greenest alternative to our current livestock farming practices. This lab-grown protein source could produce high-volume, nutritious food without the massive resources required for traditional animal agriculture.
"Due to the environmental, public health, and animal welfare concerns associated with our current livestock system, it is vital to develop more sustainable food production methods," says lead author Natalie Rubio. Could insect meat be the key?
Next Up
New sustainable food production includes what's called "cellular agriculture," an emerging industry and field of study in which meat and dairy are produced via cells in a lab instead of whole animals. So far, scientists have primarily focused on bovine, porcine, and avian cells to create this "cultured meat."
But the Tufts scientists argue that insect cells may be better suited to lab-created meat protein than traditional farm animal cells.
"Compared to cultured mammalian, avian, and other vertebrate cells, insect cell cultures require fewer resources and less energy-intensive environmental control, as they have lower glucose requirements and can thrive in a wider range of temperature, pH, oxygen, and osmolarity conditions," reports Rubio.
"Alterations necessary for large-scale production are also simpler to achieve with insect cells, which are currently used for biomanufacturing of insecticides, drugs, and vaccines," she adds.
They still have some details to hash out, however, including how to make cultured insect meat more like the steak and chicken we're all familiar with.
"Despite this immense potential, cultured insect meat isn't ready for consumption," says Rubio. "Research is ongoing to master two key processes: controlling development of insect cells into muscle and fat, and combining these in 3D cultures with a meat-like texture." They are currently experimenting with mushroom-derived fiber to tackle the latter.
Open Questions
As the report points out, one thing that makes cellular agriculture an attractive alternative to high-density animal farming is that it doesn't require consumers to change their behaviors. People would still be able to eat meat—it would just come from a different source.
But the big question remains: How will lab-grown insect meat taste? Will the buggers really taste as good as burgers?
And, of course, there's the "ew" factor. Meat alternatives have proven to work for some people—Tofurky is still in business, after all—but it may be a hard sell to get the masses to jump on board with eating bugs. Consuming creepy crawlies sounds simply unpalatable to many, and the term "lab-grown, cellular insect meat" doesn't help much. Perhaps an entirely new nomenclature is in order.
Another question is whether folks will trust such scientifically created food. People already use the term "frankenfood" to refer to genetic modification, even though the vast majority of the corn and soybeans planted in the U.S. today are genetically engineered, and other major crops with GM varieties include potatoes, apples, squash, and papayas. Still, combining GM technology with eating insects may be a hard sell.
However, we're all going to have to get used to trying new things if we want to leave a habitable home for our children. If a lab-grown bug burger can save the planet, maybe it's worth a shot.