To Make Science Engaging, We Need a Sesame Street for Adults
This article is part of the magazine, "The Future of Science In America: The Election Issue," co-published by LeapsMag, the Aspen Institute Science & Society Program, and GOOD.
In the mid-1960s, a documentary producer in New York City wondered if the addictive jingles, clever visuals, slogans, and repetition of television ads—the ones that were captivating young children of the time—could be harnessed for good. Over the course of three months, she interviewed educators, psychologists, and artists, and the result was a bonanza of ideas.
Perhaps a new TV show could teach children letters and numbers in short animated sequences? Perhaps adults and children could read together with puppets providing comic relief and prompting interaction from the audience? And because it would be broadcast through a device already in almost every home, perhaps this show could reach across socioeconomic divides and close an early education gap?
Soon after Joan Ganz Cooney shared her landmark report, "The Potential Uses of Television in Preschool Education," in 1966, she was prototyping show ideas, attracting funding from the Carnegie Corporation, the Ford Foundation, and the Corporation for Public Broadcasting, and co-founding the Children's Television Workshop with psychologist Lloyd Morrisett. And then, on November 10, 1969, informal learning was transformed forever with the premiere of Sesame Street on public television.
For its first season, Sesame Street won three Emmy Awards and a Peabody Award. Its star, Big Bird, landed on the cover of Time Magazine, which called the show "TV's gift to children." Fifty years later, it's hard to imagine an approach to informal preschool learning that isn't Sesame Street.
And that approach can be boiled down to one word: Entertainment.
Despite decades of evidence from Sesame Street—one of the most studied television shows of all time—and more research from social science, psychology, and media communications, we haven't yet taken Ganz Cooney's concepts to heart in educating adults. Adults have news programs and documentaries and educational YouTube channels, but no Sesame Street. So why not? Here's how we can design a new kind of television to make science engaging and accessible for a public that is all too often intimidated by it.
We have to start from the realization that America is a nation of high-school graduates. By the end of high school, most students have decided to abandon science because they think it's too difficult, and as a nation, we've made it acceptable for any one of us to say "I'm not good at science" and offload thinking to the ones who might be. So is it surprising that so many Americans believe in conspiracy theories? Consider the 25% who believe the release of COVID-19 was planned, the one in ten who believe the Moon landing was a hoax, or the 30–40% who think the condensation trails of planes are actually nefarious chemtrails. If we're meeting people where they are, the aim can't be to get the audience from an A to an A+, but from an F to a D, and without judgment of where they start.
There's also a natural compulsion for a well-meaning educator to fill a literacy gap with a barrage of information, but this is what I call "factsplaining," and we know it doesn't work. Worse, it can backfire. In one study from 2014, parents were given factual information about vaccine safety, and only the group that was already most averse to vaccines became even more averse.
Why? Our social identities and cognitive biases are stubborn gatekeepers when it comes to processing new information. We filter ideas through pre-existing beliefs—our values, our religions, our political ideologies. Incongruent ideas are rejected. Congruent ideas, no matter how absurd, are allowed through. We hear what we want to hear, and then our brains justify the input by creating narratives that preserve our identities. Even when we have all the facts, we can use them to support any worldview.
But social science has revealed many mechanisms for hijacking these processes through narrative storytelling, and this can form the foundation of a new kind of educational television.
As media creators, we can reject factsplaining and instead construct entertaining narratives that disrupt these cognitive processes. Research going back two decades tells us that when people are immersed in entertaining fictional narratives, they lower their defenses, opening a path for new information to take hold, attitudes to shift, and new behaviors to emerge. Where news about hot-button issues like climate change or vaccination might trigger resistance or a backfire effect, fiction can be crafted to be absorbing and, as a result, persuasive.
But the narratives can't be stuffed with information. They must be simplified. If this feels like the opposite of what an educator should be doing, consider "exemplification": a framing device that reduces the complexity of information, without oversimplifying it, by telling the stories of individuals in specific circumstances, stories that speak to the greater issue without needing to explain it all. It's a technique you've seen used in biopics. The Discovery Channel true-crime miniseries Manhunt: Unabomber does many things well from a science-storytelling perspective, including exemplifying the virtues of the scientific method through a character who argues for a new field of science, forensic linguistics, to catch one of the most notorious domestic terrorists in U.S. history.
We must also appeal to the audience's curiosity. We know curiosity is such a strong driver of human behavior that it can even counteract the biases put up by one's political ideology around subjects like climate change. If we treat science information like a product—and we should—advertising research tells us we can maximize curiosity through a Goldilocks effect. If the information is too complex, your show might as well be a PowerPoint presentation. If it's too simple, it's Sesame Street. The sweet spot for creating intrigue about new information is a moderate cognitive gap.
The science of "identification" tells us that the more endearing a main character is to a viewer, the more likely the viewer is to adopt the character's worldview and journey of change. This insight gives us further incentive to craft characters who reflect our audiences. If we accept our biases for what they are, we can understand why the messenger becomes more important than the message: without an appropriate messenger, the message becomes faint and ineffective. And research confirms that the stereotype-busting doctor-skeptic Dana Scully of The X-Files, a popular science-fiction series, inspired a generation of women to pursue science careers.
With these directions, we can start making a new kind of television. But is television itself still the right delivery medium? Americans do spend six hours per day—a quarter of their lives—watching video. And even with the rise of social media and apps, science-themed television shows remain popular, with four out of five adults reporting that they watch shows about science at least sometimes. CBS's The Big Bang Theory was the most-watched show on television in the 2017–2018 season, and Cartoon Network's Rick and Morty is the most popular comedy series among millennials. And medical and forensic dramas continue to be broadcast staples. So yes, it's as true today as it was in the 1980s when George Gerbner, the "cultivation theory" researcher who studied the long-term impacts of television images, wrote, "a single episode on primetime television can reach more people than all science and technology promotional efforts put together."
We know from cultivation theory that media images can shape our views of scientists. Quick, picture a scientist! Was it an old, white man with wild hair in a lab coat? Since most Americans don't encounter research science firsthand, media dictates how we perceive science and scientists. Characters like Sheldon Cooper and Rick Sanchez become the model. But we can correct that by representing professionals more accurately on-screen and writing characters more like Dana Scully.
Could new television series establish the baseline narratives for novel science like gene editing, quantum computing, or artificial intelligence? Or could new series counter the misinfodemics surrounding COVID-19 and vaccines through more compelling, corrective narratives? Social science has given us a blueprint suggesting they could. Binge-watching a show like the surreal NBC sitcom The Good Place doesn't replace a Ph.D. in philosophy, but its use of humor plants the seed of continued interest in a new subject. The goal of persuasive entertainment isn't to replace formal education, but it can inspire, shift attitudes, increase confidence in the knowledge of complex issues, and otherwise prime viewers for continued learning.
[Editor's Note: To read other articles in this special magazine issue, visit the beautifully designed e-reader version.]
Move Over, Iron Man. A Real-Life Power Suit Helped This Paralyzed Grandmother Learn to Run.
Puschel Sorensen first noticed something was wrong when her fingertips began to tingle. Later that day, she grew weak and fell.
Her family rushed her to the doctor, where she received the devastating diagnosis of Guillain-Barré syndrome, a rare and rapidly progressing autoimmune disorder that attacks the myelin sheath covering nerves.
Sorensen, a once-spry grandmother in her late fifties, spent 54 days in intensive care in 2018. When she was finally transferred to a rehab facility near her home in Florida, she was still on a feeding tube and ventilator, and was paralyzed from the neck down. Progress with traditional physical therapy was slow.
Sorensen in the hospital after her diagnosis of Guillain-Barré syndrome.
And then everything changed. Sorensen began using a cutting-edge technology called an exoskeleton to relearn how to walk. In the vein of Iron Man's fictional power suit, it confers strength and mobility on the wearer that wouldn't otherwise be possible. In Sorensen's case, her device, called HAL (for hybrid assistive limb), picked up small electrical impulses on her skin's surface and turned them into full movement in her legs while she attempted to walk on a treadmill.
"It was very difficult, but super awesome," recalls Sorensen, of first using the device. "The robot was having to do all the work for me."
Amazingly, within a year, she was running. She's one of 38 patients who have used HAL to recover from accidents or medical catastrophes.
Cyberdyne's hybrid assistive limb technology.
"How do you thank someone for giving them back the ability to walk, the ability to live your life again?" Sorensen asks effusively.
It's still early days for such exoskeleton devices, which number perhaps a few thousand worldwide, according to the handful of manufacturers who produce them at any scale. But the devices' ability to dramatically rehabilitate patients like Sorensen highlights their potential to lift untold numbers of people out of wheelchairs, and even to usher in a new paradigm for caregiving, one of the fastest-growing segments of the U.S. economy.
"I've been a physical therapist for 16 years, and (these devices) help teach patients the right way to move in rehabilitation," says Robert McIver, director of clinical technology at the Brooks Cybernic Treatment Center, part of the Brooks Rehabilitation Hospital in Jacksonville, Fla, where Sorensen recovered.
Another patient there, a 17-year-old named George with a snowboarding injury that paralyzed his legs, was getting around with a walker within 20 sessions.
As patients progress in their recoveries, so does exoskeleton technology. Jack Peurach, CEO of Ekso, one of the leaders in the space, believes that within a decade exoskeletons could resemble an article of clothing (a "magic pair of pants" is his phrase). They may also become inexpensive and reliable enough to transition from a medical device to a consumer one. McIver sees them eventually being used in the home on an ongoing basis as a personal assistive device, much like a walker or cane, to prevent falls in elderly people.
Such a transition "certainly could eventually lessen the need for caregivers," says Sharona Hoffman, a professor of law at Case Western Reserve University in Cleveland who has written extensively on aging and bioethics. "We have a real shortage of caregivers, so that would be a good thing."
Of course, having an aging and disabled population using exoskeletons in much the same way as an Apple Watch raises issues of its own.
Dr. Elizabeth Landsverk, a California-based geriatrician and founder of a company that performs house calls for elderly patients, believes the tech holds some promise in easing the burden on caregivers, who sometimes have to lift or move patients without assistance. But she also believes exoskeletons could become overhyped.
"I don't see robotics as completely replacing the caregiver," she says. And even if exoskeletons became akin to articles of clothing, she is skeptical of how convenient they could become.
"It's hard enough to get into support hose. Would an older person be able to get in and out of it on their own?" she asks, noting that a patient's cognitive levels could pose a huge barrier to donning such a device without assistance.
If personal exoskeletons did wildly succeed, Hoffman wonders whether they would leave the elderly more physically mobile yet also more socially isolated, since caregivers, or even a move to an assisted living facility, may no longer be required. Or, if they were priced in the hundreds or thousands of dollars, she worries that the cost would exacerbate social inequalities among the elderly and disabled.
"It's almost like a bad dream that [my illness] happened."
With any technology that confers superhuman ability, there's also the question of appropriate usage. Even the fictional Power Loader in the movie Aliens required an operator's license. In the real world, such an approach would likely pay dividends.
"We would have to make sure physicians are well-trained in these devices, and patients have a way of getting training to operate them that is thorough and responsible," Hoffman says.
But despite some unresolved questions, giving people back their lives through new technology is a remarkable achievement.
"It's almost like a bad dream that [my illness] happened," says Sorensen, who managed to walk in her daughter's wedding after her recovery. "Because now everything is pretty much back to normal and it's awesome."
23andMe Is Using Customers’ Genetic Data to Develop Drugs. Is This Brilliant or Dubious?
Leading direct-to-consumer (DTC) genetic testing companies are continuously unveiling novel ways to leverage their vast stores of genetic data.
"23andMe will tell you what diseases you have and then sell you the drugs to treat them."
As reported last week, 23andMe's latest concept is to develop and license new drugs using the data of consumers who have opted in to let their information be used for research. To date, over 10 million people have used the service and around 80 percent have opted in, making its database one of the largest in the world.
Culture researcher Dr. Julia Creet is one of the foremost experts on the DTC genetic testing industry, and in her forthcoming book, The Genealogical Sublime, she bluntly examines whether such companies' motives and interests are in sync with those of consumers.
Leapsmag caught up with Creet about the latest news and the wider industry's implications for health and privacy.
23andMe has just announced that it plans to license a newly developed anti-inflammatory drug, the first one created using its customers' genetic data, to Almirall, a pharma company in Spain. What's your take?
I think this development is the next step in the evolution of the company and its "double-sided" marketing model. In the past, even as it enticed customers to give it their DNA, it sold the results, along with the medical information customers divulged, to other drug companies. Now it is positioning itself to reap the profits of a new model by developing treatments itself.
Given that there are many anti-inflammatory drugs on the market already, whatever Almirall produces might not have much of an impact. We might see this canny move as a "proof of concept": 23andMe has learned how to "leverage" its genetic data without having to sell it to a third party. In a way, the privacy provisions will be much less complicated, and the company stands to attract investment as it turns itself into a pseudo-pharmaceutical company.
Emily Drabant Conley, 23andMe's vice president of business development, has said that the company is pursuing other drug compounds and may conduct its own clinical trials rather than licensing compounds out to its existing research partners. The end goal, it seems, is a seamless and lucrative circle: direct-to-consumer DNA testing feeding drug production, and drug sales flowing back to that same consumer base. You have to admit it's a brilliant business model. 23andMe will tell you what diseases you have and then sell you the drugs to treat them.
In your new book, you describe how DTC genetic testing companies have capitalized on our innate human desire to connect with our ancestors and with each other. To quote you: "This industry has taken that potent, spiritual, all-too-human need to belong... and monetized it in a particularly exploitative way." But others argue that DTC genetic testing companies are merely providing a service in exchange for fair-market compensation. So where does exploitation come into the picture?
Yes, the industry provides a service for a fee, but that's only part of the story. The rest of the story reveals a pernicious industry that hides its business model behind the larger science project of health and heredity. All of the major testing companies play on the idea of "lack": that we can't know who we are unless we buy information about ourselves. When you really think about it, "Who do you think you are?" is a pernicious question, one that suggests we don't or can't know who we are, or to whom we are related, without advanced data searches and testing. This existential question used to be a philosophical one; now the answers are provided by databases that acquire more valuable information than they provide in the exchange.
"It's a brilliant business model that exploits consumer naiveté."
As you've said before, consumers are actually paying to be the product because the companies are likely to profit more from selling their genetic data. Could you elaborate?
The largest databases, AncestryDNA and 23andMe, have signed lucrative agreements with biotech companies that pay them for the de-identified data of their customers. What's so valuable is the DNA combined with the family relationships. Consumers provide the family relationships and the companies link and extrapolate the results to larger and larger family trees. Combined with the genetic markers for certain diseases, or increased susceptibility to certain diseases, these databases are very valuable for biotech research.
None of that value will ever be returned to consumers except in the form of for-profit drugs. Ancestry, in particular, has removed all information about its "research partners" from its website, making it very difficult to see how it is profiting from its third-party sales. 23andMe is more open about its "two-sided business model," but encourages consumers to donate their information to science. It's a brilliant business model that exploits consumer naiveté.
A WIRED journalist wrote that "23andMe has been sharing insights gleaned from consented customer data with GSK and at least six other pharmaceutical and biotechnology firms for the past three and a half years." Is this a consumer privacy risk?
I don't see that 23andMe did anything to which consumers didn't consent, albeit through arguably unreadable terms and conditions. What worries me more is the 300 phenotype data points that the company has collected on its consumers through longitudinal surveys designed, as Anne Wojcicki, CEO and co-founder of 23andMe, put it, "to circumvent medical records and just self-report."
Everyone is focused on the DNA, but it's the combination of genetic samples, genealogical information, and health records that is the most potent dataset, and 23andMe has figured out a way to extract all three from consumers.