To Make Science Engaging, We Need a Sesame Street for Adults
This article is part of the magazine, "The Future of Science In America: The Election Issue," co-published by LeapsMag, the Aspen Institute Science & Society Program, and GOOD.
In the mid-1960s, a documentary producer in New York City wondered if the addictive jingles, clever visuals, slogans, and repetition of television ads—the ones that were captivating young children of the time—could be harnessed for good. Over the course of three months, she interviewed educators, psychologists, and artists, and the result was a bonanza of ideas.
Perhaps a new TV show could teach children letters and numbers in short animated sequences? Perhaps adults and children could read together with puppets providing comic relief and prompting interaction from the audience? And because it would be broadcast through a device already in almost every home, perhaps this show could reach across socioeconomic divides and close an early education gap?
Soon after Joan Ganz Cooney shared her landmark report, "The Potential Uses of Television in Preschool Education," in 1966, she was prototyping show ideas, attracting funding from The Carnegie Corporation, The Ford Foundation, and The Corporation for Public Broadcasting, and co-founding the Children's Television Workshop with psychologist Lloyd Morrisett. And then, on November 10, 1969, informal learning was transformed forever with the premiere of Sesame Street on public television.
For its first season, Sesame Street won three Emmy Awards and a Peabody Award. Its star, Big Bird, landed on the cover of Time Magazine, which called the show "TV's gift to children." Fifty years later, it's hard to imagine an approach to informal preschool learning that isn't Sesame Street.
And that approach can be boiled down to one word: Entertainment.
Despite decades of evidence from Sesame Street—one of the most studied television shows of all time—and more research from social science, psychology, and media communications, we haven't yet taken Ganz Cooney's concepts to heart in educating adults. Adults have news programs and documentaries and educational YouTube channels, but no Sesame Street. So why not create one? Here's how we can design a new kind of television to make science engaging and accessible for a public that is all too often intimidated by it.
We have to start from the realization that America is a nation of high-school graduates. By the end of high school, many students have abandoned science because they think it's too difficult, and as a nation, we've made it acceptable for any one of us to say "I'm not good at science" and offload thinking to the ones who might be. So, is it surprising that large numbers of Americans believe in conspiracy theories, like the 25% who believe the release of COVID-19 was planned, the one in ten who believe the Moon landing was a hoax, or the 30–40% who think the condensation trails of planes are actually nefarious chemtrails? If we're meeting people where they are, the aim can't be to get the audience from an A to an A+, but from an F to a D, and without judgment of where they are starting from.
There's also a natural compulsion for a well-meaning educator to fill a literacy gap with a barrage of information, but this is what I call "factsplaining," and we know it doesn't work. Worse, it can backfire. In one study from 2014, parents were provided with factual information about vaccine safety, and the group that was already most averse to vaccines became even more averse.
Why? Our social identities and cognitive biases are stubborn gatekeepers when it comes to processing new information. We filter ideas through pre-existing beliefs—our values, our religions, our political ideologies. Incongruent ideas are rejected. Congruent ideas, no matter how absurd, are allowed through. We hear what we want to hear, and then our brains justify the input by creating narratives that preserve our identities. Even when we have all the facts, we can use them to support any worldview.
But social science has revealed many mechanisms for hijacking these processes through narrative storytelling, and this can form the foundation of a new kind of educational television.
Could new television series establish the baseline narratives for novel science like gene editing, quantum computing, or artificial intelligence?
As media creators, we can reject factsplaining and instead construct entertaining narratives that disrupt these cognitive processes. Two decades of research tell us that when people are immersed in entertaining fictional narratives, they lower their defenses, opening a path for new information, shifting attitudes, and inspiring new behavior. Where news about hot-button issues like climate change or vaccination might trigger resistance or a backfire effect, fiction can be crafted to be absorbing and, as a result, persuasive.
But the narratives can't be stuffed with information. They must be simplified. If this feels like the opposite of what an educator should be doing, it is possible to reduce the complexity of information, without oversimplification, through "exemplification," a framing device to tell the stories of individuals in specific circumstances that can speak to the greater issue without needing to explain it all. It's a technique you've seen used in biopics. The Discovery Channel true-crime miniseries Manhunt: Unabomber does many things well from a science storytelling perspective, including exemplifying the virtues of the scientific method through a character who argues for a new field of science, forensic linguistics, to catch one of the most notorious domestic terrorists in U.S. history.
We must also appeal to the audience's curiosity. We know curiosity is such a strong driver of human behavior that it can even counteract the biases put up by one's political ideology around subjects like climate change. If we treat science information like a product—and we should—advertising research tells us we can maximize curiosity through a Goldilocks effect. If the information is too complex, your show might as well be a PowerPoint presentation. If it's too simple, it's Sesame Street. There's a sweet spot for creating intrigue about new information when there's a moderate cognitive gap.
The science of "identification" tells us that the more the main character is endearing to a viewer, the more likely the viewer will adopt the character's worldview and journey of change. This insight further provides incentives to craft characters reflective of our audiences. If we accept our biases for what they are, we can understand why the messenger becomes more important than the message, because, without an appropriate messenger, the message becomes faint and ineffective. And research confirms that the stereotype-busting doctor-skeptic Dana Scully of The X-Files, a popular science-fiction series, was an inspiration for a generation of women who pursued science careers.
With these directions, we can start making a new kind of television. But is television itself still the right delivery medium? Americans do spend six hours per day—a quarter of their lives—watching video. And even with the rise of social media and apps, science-themed television shows remain popular, with four out of five adults reporting that they watch shows about science at least sometimes. CBS's The Big Bang Theory was the most-watched show on television in the 2017–2018 season, and Cartoon Network's Rick & Morty is the most popular comedy series among millennials. And medical and forensic dramas continue to be broadcast staples. So yes, it's as true today as it was in the 1980s when George Gerbner, the "cultivation theory" researcher who studied the long-term impacts of television images, wrote, "a single episode on primetime television can reach more people than all science and technology promotional efforts put together."
We know from cultivation theory that media images can shape our views of scientists. Quick, picture a scientist! Was it an old, white man with wild hair in a lab coat? If most Americans don't encounter research science firsthand, it's media that dictates how we perceive science and scientists. Characters like Sheldon Cooper and Rick Sanchez become the model. But we can correct that by representing professionals more accurately on-screen and writing characters more like Dana Scully.
Could new television series establish the baseline narratives for novel science like gene editing, quantum computing, or artificial intelligence? Or could new series counter the misinfodemics surrounding COVID-19 and vaccines through more compelling, corrective narratives? Social science has given us a blueprint suggesting they could. Binge-watching a show like the surreal NBC sitcom The Good Place doesn't replace a Ph.D. in philosophy, but its use of humor plants the seed of continued interest in a new subject. The goal of persuasive entertainment isn't to replace formal education, but it can inspire, shift attitudes, increase confidence in the knowledge of complex issues, and otherwise prime viewers for continued learning.
New study: Hotter nights from climate change cause sleep loss, with some affected more than others
Data from the National Sleep Foundation finds that the optimal bedroom temperature for sleep is around 65 degrees Fahrenheit. But we may be getting fewer hours of "good sleepin’ weather" as the climate warms, according to a recent paper from researchers at the University of Copenhagen, Denmark.
Published in One Earth, the study finds that heat related to climate change could provide a “pathway” to sleep deprivation. The authors say the effect is “substantially larger” for those in lower-income countries. Hours of sleep begin to decline when nighttime temperatures exceed 50 degrees Fahrenheit, and temperatures above 77 degrees reduce the chances of getting seven hours of sleep by 3.5 percent. Even small losses associated with rising temperatures contribute significantly to people not getting enough sleep.
We’re affected by high temperatures at night because body temperature becomes more sensitive to the environment when slumbering. “Mechanisms that control for thermal regulation become more disordered during sleep,” explains Clete Kushida, a neurologist, professor of psychiatry at Stanford University and sleep medicine clinician.
The study finds that women and older adults are especially vulnerable. Worldwide, the elderly lost over twice as much sleep per degree of warming compared to younger people. This phenomenon was apparent between the ages of 60 and 70, and it increased beyond age 70. “The mechanism for balancing temperatures appears to be more affected with age,” Kushida adds.
Others disproportionately affected include people living in regions with more greenhouse gas (GHG) emissions, which accelerate climate change. According to the study, those in hotter locales will lose more sleep per degree of warming, with suboptimal temperatures potentially eroding 50 to 58 hours of sleep per person per year. One might think that those in warmer countries can adapt to the heat, but the researchers found no evidence of such adjustment. “We actually found those living in the warmest climate regions were impacted over twice as much as those in the coldest climate regions,” says the study's lead author, Kelton Minor, a Ph.D. candidate at the University of Copenhagen’s Center for Social Data Science.
Short sleep can reduce cognitive performance and productivity, increase absenteeism from work or school, and lead to a host of other physical and psychosocial problems. These issues include a compromised immune system, hypertension, depression, anger and suicide, say the study’s authors. According to a fact sheet by the U.S. Centers for Disease Control and Prevention, a third of U.S. adults already report sleeping fewer hours than the recommended amount, even though sufficient sleep “is not a luxury—it is something people need for good health.”
Equitable policy and planning are needed to ensure equal access to cooling technologies in a warming world.
Beyond global health, a sleep-deprived world will impact the economy as the climate warms. “Less productivity at work, associated with sleep loss or deprivation, would result in more sick days on a global scale, not just in individual countries,” Kushida says.
Unlike previous research that measured sleep patterns with self-reported surveys and controlled lab experiments, the study in One Earth offers a global analysis. It relies on sleep-tracking wristbands, linking more than seven million sleep records from 47,628 adults across 68 countries to local, daily meteorological data, offering new insight into the environmental impact on human sleep. Controlling for individual, seasonal and time-varying confounds, the researchers found that the main way higher temperatures shorten slumber is by delaying sleep onset.
Heat effects on sleep were seen in industrialized countries, including those with access to air conditioning, notes the study. Air conditioning may buffer high indoor temperatures, but it also increases GHG emissions and displaces heat into the surrounding environment, thereby exacerbating the unequal burdens of global and local warming. Continued urbanization is expected to contribute to these problems.
Previous sleep studies have found an inverse U-shaped response to temperature in highly controlled settings, with subjects sleeping worse when room temperatures were either too cold or too warm. However, “people appear far better at adapting to colder outside temperatures than hotter conditions,” says Minor.
Although there are ways of countering the heat effect, some populations have more access to them. “Air conditioning can help with the effect of higher temperature, but not all individuals can afford air conditioners,” says Kushida. He points out that this could drive even greater inequity between higher- and lower-income countries.
Equitable policy and planning are needed to ensure equal access to cooling technologies in a warming world. “Clean and renewable energy systems and interventions will be needed to mitigate and adapt to ongoing climate warming,” Minor says. Future research should investigate “policy, planning and design innovation,” which could reduce the impact of sweltering temperatures on a good night’s sleep for the good of individuals, society and our planet, asserts the study.
If warming continues unabated on its current trajectory, suboptimal temperatures could shave 50 to 58 hours of sleep per person per year by 2099, the study authors predict. “Down the road, as technology develops, there might be ways of enabling people to adapt on a large scale to these higher temperatures,” says Kushida. “Right now, it’s not there.”
Why we need to get serious about ending aging
It is widely acknowledged that even a small advance in anti-aging science could yield benefits, in terms of healthy years, that the traditional paradigm of targeting specific diseases is not likely to produce. A more youthful population would also be less vulnerable to epidemics: approximately 93 percent of all COVID-19 deaths reported in the U.S. occurred among those aged 50 or older. The potential economic benefits would be tremendous. A more youthful population would consume fewer medical resources and be able to work longer. A recent study published in Nature estimates that a slowdown in aging that increases life expectancy by one year would be worth $38 trillion to the U.S. alone.
A societal effort to understand, slow down, arrest or even reverse aging of at least the size of our response to COVID-19 would therefore be a rational commitment. In fact, given that America’s older population is projected to grow dramatically, and the cost of healthcare with it, it is not an overstatement to say that the future welfare of the country may depend on solving aging.
This year, the Kingdom of Saudi Arabia announced that it will spend up to 1 billion dollars per year on science with the potential to slow down the aging process. We have also seen important investments from billionaires like Google co-founder Larry Page, Amazon founder Jeff Bezos, business magnate Larry Ellison, and PayPal co-founder Peter Thiel.
The U.S. government, however, is lagging: The National Institutes of Health spent less than one percent of its $43 billion budget for the fiscal year of 2021 on the National Institute on Aging’s Division of Aging Biology. When you visit the division’s webpage, you find that its mission statement carefully omits any mention of the possibility of slowing down the aging process.
There is a lack of political will and leadership on the issue, and the idea that we should seek to do something about aging is generally met with a great deal of suspicion and trepidation. In a large representative study conducted by the Pew Research Center in 2013, only 38% of the respondents said that they would want a treatment that could slow the aging process and allow them to live to at least 120 years. Apparently, most people prefer, or at least do not mind, to age and die within a natural lifespan. This result has been confirmed by smaller studies, and it is, I think, surprising. Are we not supposed to live in a youth culture? Are people not supposed to want to stay young and alive forever? Is self-preservation not the strong drive we have always assumed it to be?
We are inundated and saturated with an ideology of death-acceptance.
In my book, The Case against Death, I suggest that we have been culturally conditioned to think that it is virtuous to accept aging and death. We are taught to believe that although aging and death seem gruesome, they are what is best for us, all things considered. This is what we are supposed to think, and the majority accept it. I call this the Wise View because death acceptance has been the dominant view of philosophers since the beginning. Socrates compared our earthly life to an illness and a prison and described death as a healer and a liberator. The Buddha taught that life is suffering and that the way to escape suffering is to end the cycle of birth, death and rebirth. Stoic philosophers from Zeno to Marcus Aurelius believed that everything that happens in accordance with nature is good, and that therefore we should not only accept death but welcome it as an aspect of a perfect totality.
Epicureans agreed with these rival schools and famously argued that death cannot harm us because where we are, death is not, and where death is we are not. We cannot be harmed if we are not, so death is harmless. The simple view that death actually can harm us greatly is one of the least philosophical views one can hold.
In The Case Against Death, philosopher Ingemar Patrick Linden argues that we frown on using science to prolong healthy life only because we're culturally conditioned to think that way.
Many of the stories we tell promote the Wise View. One of the earliest known pieces of literature, the Epic of Gilgamesh, follows Gilgamesh on a quest for eternal life, ending with the wisdom that death is the destiny of man. Today we learn about the tedium of immortality from the children’s book Tuck Everlasting by Natalie Babbitt, and we are warned about the vice of wanting to resist death in other books and films: J.K. Rowling’s Harry Potter, where Voldemort must kill Harry as a step towards his own immortality; C.S. Lewis’ The Chronicles of Narnia, where the White Witch has gained immortal youth and madness in equal measure; J.R.R. Tolkien’s Lord of the Rings trilogy, where the ring extends the wearer’s life but can also destroy them, as exemplified by the creep Gollum; and Doctor Strange, where life extension is the one magical power that is taboo. In Star Wars, Yoda, a stereotype of the sage, teaches us the wisdom handed down by philosophers and prophets: “Death is a natural part of life. Rejoice for those around you who transform into the Force. Mourn them do not. Miss them do not.”
We are inundated and saturated with an ideology of death-acceptance. Can the dear reader name one single story where the hero is pursuing anti-aging, longevity or immortality and the villain tries to stop her?
The Wise View resonates with us partly because we think that there is nothing we can do about aging and death, so we do not want to wish for what we cannot have. Youth and immortality are sour grapes to us. Believing that death is, all things considered, not such a bad thing, protects us from experiencing our aging and approaching death as a gruesome tragedy. This need to escape the thought that we are heading towards a personal catastrophe explains why many are so quick to accept arguments against radical life extension, despite their often glaring weaknesses.
One of the most common objections to radical life extension is that aging and death are natural. The problem with this argument is that many things that are natural are very bad, such as cancer, and other things that are not natural are very good, such as a cure for cancer. Why are we so sure that cancer is bad? Because we assume that it is bad to die. Indeed, nothing is more natural than wanting to live. We seem to need philosophers and storytellers to talk us out of it and, in the words of a distinguished bioethicist, “instruct and somewhat moderate our lust for life.”
Another standard objection is that we need a deadline, and that without death we could postpone every action forever. “Death brings urgency and seriousness to life,” say proponents of this view, but there are several problems with this argument. Even if our lives were endless, there would still be many things we would have to do at a certain time, and that could not be redone—for example, saving our planet from being destroyed, or becoming the first person on Venus. And if we prefer pleasant endless lives over unpleasant endless ones, we will still have to exercise, eat right, keep our word, develop our talents, show up on time at work, pay our taxes by the due date, remember birthdays, and so on.
The Wise View provides us with a feel-good bromide for the anxiety created by the foreknowledge of our decay and death by telling us that these are not evils, but blessings in disguise. Once perhaps an innocuous delusion, today the view stands in the way of a necessary societal commitment to research that can prolong our healthy life.
Besides, even if we succeeded in ending aging, we would still die from other causes. Given current rates of accidental death, we would be fortunate to live to age 2,000, all else being equal. So even if, contrary to what I have argued, we do need a deadline, we can still argue that the natural lifespan we now labor under is inhuman, and that it forces each human to limit her ambitions and to become only a fragment of all that she could have been. Our tight time constraint imposes tragic choices and inflated opportunity costs. Death does not make life matter; it makes time matter.
Perhaps the most awful argument against radical life extension is grounded in a pessimism that holds life in such little regard that it says that best of all is never to have been born. This view was expressed by Ecclesiastes in the Hebrew Bible, by Sophocles and several other ancient Greeks, by the German philosopher Arthur Schopenhauer, and recently by, among others, the South African philosopher David Benatar, who argues that it is wrong to bring children into the world and that we should euthanize all sentient life. Pessimism, one suspects, largely appeals to some for reasons having to do with personal temperament, but insofar as it is built on factual beliefs, these can be addressed by providing a less negatively biased understanding of the world, by pointing out that curing aging would decrease the badness that pessimists are so hypersensitive to, and by reminding them that if life really becomes unbearable, they are free to quit at any time. Other means of persuasion could include recommending sleep, exercise and long brisk walks in nature.
The Wise View provides us with a feel-good bromide for the anxiety created by the foreknowledge of our decay and death by telling us that these are not evils, but blessings in disguise. Once perhaps an innocuous delusion, today the view stands in the way of a necessary societal commitment to research that can prolong our healthy life. We need to abandon it and openly admit that aging is a scourge that deserves to be fought with combined energies equaling those expended on fighting COVID-19, Alzheimer’s disease, cancer, stroke and all the other illnesses for which aging is the greatest risk factor. The fight to end aging transcends ordinary political boundaries and is therefore the kind of grand unifying enterprise that could re-energize a society suffering from divisiveness and the sense of a lack of common purpose. It is hard to imagine a more worthwhile cause.