To Make Science Engaging, We Need a Sesame Street for Adults
This article is part of the magazine, "The Future of Science In America: The Election Issue," co-published by LeapsMag, the Aspen Institute Science & Society Program, and GOOD.
In the mid-1960s, a documentary producer in New York City wondered if the addictive jingles, clever visuals, slogans, and repetition of television ads—the ones that were captivating young children of the time—could be harnessed for good. Over the course of three months, she interviewed educators, psychologists, and artists, and the result was a bonanza of ideas.
Perhaps a new TV show could teach children letters and numbers in short animated sequences? Perhaps adults and children could read together with puppets providing comic relief and prompting interaction from the audience? And because it would be broadcast through a device already in almost every home, perhaps this show could reach across socioeconomic divides and close an early education gap?
Soon after Joan Ganz Cooney shared her landmark report, "The Potential Uses of Television in Preschool Education," in 1966, she was prototyping show ideas, attracting funding from The Carnegie Corporation, The Ford Foundation, and The Corporation for Public Broadcasting, and co-founding the Children's Television Workshop with psychologist Lloyd Morrisett. And then, on November 10, 1969, informal learning was transformed forever with the premiere of Sesame Street on public television.
For its first season, Sesame Street won three Emmy Awards and a Peabody Award. Its star, Big Bird, landed on the cover of Time magazine, which called the show "TV's gift to children." Fifty years later, it's hard to imagine informal preschool learning without the approach Sesame Street pioneered.
And that approach can be boiled down to one word: Entertainment.
Despite decades of evidence from Sesame Street—one of the most studied television shows of all time—and further research from social science, psychology, and media communications, we haven't yet taken Ganz Cooney's concepts to heart in educating adults. Adults have news programs, documentaries, and educational YouTube channels, but no Sesame Street. So why not build one? Here's how we can design a new kind of television to make science engaging and accessible for a public that is all too often intimidated by it.
We have to start from the realization that America is a nation of high-school graduates. By the end of high school, many students have decided to abandon science because they think it's too difficult, and as a nation, we've made it acceptable for any one of us to say "I'm not good at science" and offload that thinking to the people who might be. So is it surprising that so many Americans believe in conspiracy theories, like the 25% who believe the release of COVID-19 was planned, the one in ten who believe the Moon landing was a hoax, or the 30–40% who think the condensation trails of planes are actually nefarious chemtrails? If we're meeting people where they are, the aim can't be to get the audience from an A to an A+, but from an F to a D, and without judgment of where they are starting from.
There's also a natural compulsion for a well-meaning educator to fill a literacy gap with a barrage of information, but this is what I call "factsplaining," and we know it doesn't work. Worse, it can backfire. In one study from 2014, parents were provided with factual information about vaccine safety, and the group that was already most averse to vaccines was the only one that became even more averse.
Why? Our social identities and cognitive biases are stubborn gatekeepers when it comes to processing new information. We filter ideas through pre-existing beliefs—our values, our religions, our political ideologies. Incongruent ideas are rejected. Congruent ideas, no matter how absurd, are allowed through. We hear what we want to hear, and then our brains justify the input by creating narratives that preserve our identities. Even when we have all the facts, we can use them to support any worldview.
But social science has revealed many mechanisms for hijacking these processes through narrative storytelling, and this can form the foundation of a new kind of educational television.
Could new television series establish the baseline narratives for novel science like gene editing, quantum computing, or artificial intelligence?
As media creators, we can reject factsplaining and instead construct entertaining narratives that disrupt these cognitive processes. Two decades of research tell us that when people are immersed in entertaining fictional narratives, they loosen their defenses, opening a path for new information to come in, attitudes to shift, and new behavior to take hold. Where news about hot-button issues like climate change or vaccination might trigger resistance or a backfire effect, fiction can be crafted to be absorbing and, as a result, persuasive.
But the narratives can't be stuffed with information; they must be simplified. If this feels like the opposite of what an educator should be doing, consider "exemplification," a framing device that reduces the complexity of information, without oversimplifying it, by telling the stories of individuals in specific circumstances, stories that speak to the greater issue without needing to explain it all. It's a technique you've seen used in biopics. The Discovery Channel true-crime miniseries Manhunt: Unabomber does many things well from a science-storytelling perspective, including exemplifying the virtues of the scientific method through a character who argues for a new field of science, forensic linguistics, to catch one of the most notorious domestic terrorists in U.S. history.
We must also appeal to the audience's curiosity. We know curiosity is such a strong driver of human behavior that it can even counteract the biases put up by one's political ideology around subjects like climate change. If we treat science information like a product—and we should—advertising research tells us we can maximize curiosity through a Goldilocks effect. If the information is too complex, your show might as well be a PowerPoint presentation. If it's too simple, it's Sesame Street. There's a sweet spot for creating intrigue about new information when there's a moderate cognitive gap.
The science of "identification" tells us that the more endearing a main character is to viewers, the more likely viewers are to adopt the character's worldview and journey of change. This insight gives us a further incentive to craft characters who reflect our audiences. If we accept our biases for what they are, we can understand why the messenger becomes more important than the message: without an appropriate messenger, the message becomes faint and ineffective. And research confirms that the stereotype-busting doctor-skeptic Dana Scully of The X-Files, the popular science-fiction series, inspired a generation of women to pursue science careers.
With these directions, we can start making a new kind of television. But is television itself still the right delivery medium? Americans do spend six hours per day—a quarter of their lives—watching video. And even with the rise of social media and apps, science-themed television shows remain popular, with four out of five adults reporting that they watch shows about science at least sometimes. CBS's The Big Bang Theory was the most-watched show on television in the 2017–2018 season, and Cartoon Network's Rick & Morty is the most popular comedy series among millennials. And medical and forensic dramas continue to be broadcast staples. So yes, it's as true today as it was in the 1980s when George Gerbner, the "cultivation theory" researcher who studied the long-term impacts of television images, wrote, "a single episode on primetime television can reach more people than all science and technology promotional efforts put together."
We know from cultivation theory that media images can shape our views of scientists. Quick, picture a scientist! Was it an old, white man with wild hair in a lab coat? If most Americans don't encounter research science firsthand, it's media that dictates how we perceive science and scientists. Characters like Sheldon Cooper and Rick Sanchez become the model. But we can correct that by representing professionals more accurately on-screen and writing characters more like Dana Scully.
Could new television series establish the baseline narratives for novel science like gene editing, quantum computing, or artificial intelligence? Or could new series counter the misinfodemics surrounding COVID-19 and vaccines through more compelling, corrective narratives? Social science has given us a blueprint suggesting they could. Binge-watching a show like the surreal NBC sitcom The Good Place doesn't replace a Ph.D. in philosophy, but its use of humor plants the seed of continued interest in a new subject. The goal of persuasive entertainment isn't to replace formal education, but it can inspire, shift attitudes, increase confidence in the knowledge of complex issues, and otherwise prime viewers for continued learning.
Steven Pinker: Data Shows That Life Today Is Better Than Ever
The government shutdown. A volatile stock market. Climate change.
It's so easy to get discouraged by the latest headlines, argues Steven Pinker, that we lose sight of the bigger picture: life today is actually improving.
"To appreciate the world, we've got to look at numbers and trends."
Pinker, a cognitive psychologist from Harvard, says in his book "Enlightenment Now" that we're living at the greatest moment of progress in history, thanks to reason, science, and humanism. But today, he says, these ideals are under-appreciated, and we ignore them at our peril.
So he set out to provide a vigorous moral defense of the values of the Enlightenment by examining the evidence for their effectiveness. Across a range of categories from happiness and health to peace and safety, Pinker examines the data and reassures readers that this is a pretty great time to be alive. As we kick off the new year, he's hopeful that our embrace of science and reason will lead to an even more prosperous future. But political and cultural hurdles must still be overcome before the heroic story of human progress can continue to unfold.
Pinker spoke with our Editor-in-Chief Kira Peikoff in advance of the book's paperback release, which hits stores next Tuesday. This interview has been edited and condensed for clarity.
One anecdote you describe in the book was particularly striking: how the public reacted when the polio vaccine was announced. People took the day off work to celebrate, they smiled at each other in the streets, they offered to throw parades. Today, it's hard to imagine such prevalent enthusiasm for a new advance. How can we bring back a culture of respect and gratitude for science?
That's such a good question. And I wish I knew the answer. My contribution is just to remind people of how much progress we've made. It's easy to ignore if your view of the world comes from headlines, but there are some built-in biases in journalism that we have to counteract. Most things that happen all of a sudden are bad things: wars break out, terrorists attack, rampage shootings occur, whereas a lot of the things that make us better off creep up by stealth. But we have to become better aware of them.
It's unlikely that we're going to have replications of the great Salk event, which happened on a particular day, but I think we have to take lessons from cognitive science, from the work of people like Daniel Kahneman and Amos Tversky, showing how misled we can be by images and narratives and that to appreciate the world, we've got to look at numbers and trends.
The cover of "Enlightenment Now," which comes out in paperback next week.
You mention that the President's Bioethics Council under Bush was appointed to deal with "the looming threat of biomedical advances." Do you think that professional bioethicists are more of a hindrance than a help when it comes to creating truly enlightened science policy?
I do. I think that there are some problems in the culture of bioethics. And of course, I would not argue against the concept of bioethics itself. Obviously, we have to do biomedical research and applications conscientiously and ethically. But the field called bioethics tends to specialize in exotic thought experiments that imagine the worst possible things that can happen, and it often mires research in red tape that results in a net decrease in human welfare, whereas the goal of bioethics should be to enhance human welfare.
In an op-ed that I published in the Boston Globe a few years ago, I said, deliberately provocatively, that the main moral imperative of bioethics is to get out of the way since there's so much suffering that humans endure from degenerative diseases, from cancer, from heart disease and stroke. The potential for increasing happiness and well-being from biomedical research is just stupendous. So before we start to drag out Brave New World for the umpteenth time, or compare every advance in genetics to the Nazis, we should remember the costs of people dying prematurely from postponing advances in biomedical research.
Later in the book, you mention how much more efficient the production of food has become due to high-tech agriculture. But so many people today are leery of advances in the food industry, like GMOs. And we will have to feed 10 billion people in 2050. Are you concerned about how we will meet that challenge?
Yes, I think anyone has to be, and all the more reason we should be clear about what is simultaneously best for humans and for the planet, which is to grow as much food on as little land as possible. That ideal of density -- the less farmland the better -- runs up against the ideal of organic and natural farming, which uses lots of land. So genetically modified organisms and precision agriculture of the kind that is sometimes associated with Israel -- putting every last drop of water to use, delivering it when it's needed, using the minimum amount of fertilizer -- all of these technologically driven developments are going to be necessary to meet that need.
"The potential for increasing happiness and well-being from biomedical research is just stupendous."
You also mention "sustainability" as this big buzz word that you say is based on a flawed assumption that we will run out of resources rather than pivot to ingenious alternatives. What's the most important thing we can do as a culture to encourage innovation?
It has to be an ideal. We have to restore it as something we encourage and glorify in order to meet the needs of humanity. Governments have to play a role because lots of innovation is just too risky, with benefits too widely diffused, for private companies and individuals to pursue. International cooperation has to play a role. And we need to change our environmental philosophy from a reflexive rejection of technology to an acknowledgement that technology is our best hope for staving off environmental problems.
And yet innovation and technology today are so often viewed fearfully by the public -- just look at AI and gene editing. If we need science and technology to solve our biggest challenges, how do we overcome this disconnect?
Part of it is simply making the argument, challenging the ideology and untested assumptions behind traditional Greenism. Also, on the part of the promoters of technology themselves, it's crucial to make it not just clear but a reality that technology is going to be deployed to enhance human welfare.
That of course means an acknowledgement of the possible harms and limitations of technology. The fact that the first widely used genetically modified crop was soybeans resistant to an herbicide, Roundup -- that was at the very least a public relations disaster for genetically modified organisms. As opposed to, say, highlighting crops that require less insecticide, less chemical fertilizer, and less water. The poster children for technology should really be cases that quite obviously benefit humanity.
"One of the surprises from 'Enlightenment Now' was how much moral progress depends on economic progress."
Finally, what is one emerging innovation that you're excited about for 2019?
I would say 4th generation nuclear power. Small modular reactors. Because everything depends on energy. For poor countries to get rich, they are going to have to consume far more energy than they do now and if they do it via fossil fuels, especially coal, that could spell disaster. Zero-carbon energy will allow poor countries to get richer -- and rich countries to stay rich without catastrophic environmental damage.
One of the surprises from "Enlightenment Now" was how much moral progress depends on economic progress. Rich countries not only allow their citizens to have cool gadgets; all kinds of good things happen when a country gets rich, like Norway, the Netherlands, and Switzerland. Countries that are richer are, on average, more democratic, less likely to fight wars, more feminist, more environmentally conscientious, and smarter -- that is, they have a greater increase in IQ. So anything that makes a country get richer, and that's going to include a bunch of energy, is going to make humanity better off.
Kira Peikoff was the editor-in-chief of Leaps.org from 2017 to 2021. As a journalist, her work has appeared in The New York Times, Newsweek, Nautilus, Popular Mechanics, The New York Academy of Sciences, and other outlets. She is also the author of four suspense novels that explore controversial issues arising from scientific innovation: Living Proof, No Time to Die, Die Again Tomorrow, and Mother Knows Best. Peikoff holds a B.A. in Journalism from New York University and an M.S. in Bioethics from Columbia University. She lives in New Jersey with her husband and two young sons. Follow her on Twitter @KiraPeikoff.
Shoot for the Moon: Its Surface Contains a Pot of Gold
Here's a riddle: What do the Moon, nuclear weapons, clean energy of the future, terrorism, and lung disease all have in common?
One goal of India's upcoming space probe is to locate deposits of helium-3 that are worth trillions of dollars.
The answer is helium-3, a gas that's extremely rare on Earth but 100 million times more abundant on the Moon. This past October, the Lockheed Martin Corporation announced a concept for a lunar landing craft that may return humans to the Moon in the coming decade, and yesterday China successfully landed the Chang'e-4 probe on the far side of the Moon. By setting down inside the Moon's deepest crater, China achieved a first in the history of space exploration.
Meanwhile, later this month, India's Chandrayaan-2 space probe will also land on the lunar surface. One of its goals is to locate deposits of helium-3 that are worth trillions of dollars, because it could be a fuel for nuclear fusion energy to generate electricity or propel a rocket.
The standard way that nuclear engineers are trying to achieve sustainable fusion uses fuels that are more plentiful on Earth: deuterium and tritium. But MIT researchers have found that adding small amounts of helium-3 to the mix could make it much more efficient, and thus a viable energy source much sooner than once thought.
Even if fusion is proven practical tomorrow, any kind of nuclear energy involves long waits for power plant construction, measured in decades. Mining helium-3 could be useful now, however, because of its non-energy applications. A major one is its ability to detect neutrons coming from plutonium that could be used in terrorist attacks. Here's how it works: a small amount of helium-3 is contained within a forensic instrument. When a neutron strikes an atom of helium-3, the reaction produces tritium and a proton, and those charged particles generate an electrical pulse that alerts investigators to the possibility that plutonium is nearby.
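For readers who want the nuclear bookkeeping, the neutron-capture reaction these detectors rely on is conventionally written as below; the released energy (roughly 0.76 MeV) is carried off by the charged products, which is what the instrument actually registers:

$$ {}^{3}\mathrm{He} + n \;\longrightarrow\; {}^{3}\mathrm{H} + p + 0.764\ \mathrm{MeV} $$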
Ironically, as global concern about hidden nuclear material increased in the early 2000s, so did the supply of helium-3 on Earth. That's because helium-3 comes from the decay of tritium, which is used in thermonuclear warheads (H-bombs). Thousands of such weapons have been dismantled from U.S. and Russian arsenals, making helium-3 available for plutonium detection, research, and other applications, including in the world of healthcare.
Helium-3 can help doctors diagnose lung diseases, since it enables imaging of the lungs in real time.
Helium-3 dramatically improves the ability of doctors to image the lungs in a range of diseases including asthma, chronic obstructive pulmonary disease and emphysema, cystic fibrosis, and bronchopulmonary dysplasia, which happens particularly in premature infants. Specifically, helium-3 is useful in magnetic resonance imaging (MRI), a procedure that creates images from within the body for diagnostic purposes.
But while a standard MRI allows doctors to visualize parts of the body like the heart or brain, it's useless for seeing the lungs. Because lungs are filled with air, which is much less dense than water or fat, effectively no signals are produced that would enable imaging.
To compensate for this problem, a patient can inhale a gas that is hyperpolarized, meaning enhanced with special procedures so that the magnetic resonance signals from the lungs become readable. This gas is safe to breathe when mixed with enough oxygen to support life. Helium-3 is one such gas; because it produces such a strong signal, the MRI can literally see the air inside the lungs and in all of the airways, revealing intricate details of the bronchopulmonary tree. And it can do this in real time.
The capability to show anatomic details of the lungs and airways, and to display functional imaging as a patient breathes, makes helium-3 MRI far better than the standard method of testing lung function. Called spirometry, this method tells physicians how the lungs function overall, but it does not home in on particular areas that may be causing a problem. Plus, spirometry requires patients to follow instructions and hold their breath, so it is not well suited to testing young children with pulmonary disease.
In recent years, the cost of helium-3 on Earth has skyrocketed.
Over the past several years, researchers have been developing MRI lung testing using other hyperpolarized gases. The main alternative to helium-3 is xenon-129, and researchers have learned to overcome certain of its disadvantages, such as its potential to put patients to sleep. Since helium-3 provides the strongest signal, though, it is still the best gas for MRI studies of many lung conditions.
But the supply of helium-3 on Earth has been decreasing in recent years, due to the declining rate of dismantling of warheads, just as the Department of Homeland Security has required more and more of the gas for neutron detection. As a result, the cost of the gas has skyrocketed. Less is available now for medical uses – unless, of course, we begin mining it on the moon.
The question is: Are the benefits worth the 239,000-mile trip?