To Make Science Engaging, We Need a Sesame Street for Adults
This article is part of the magazine, "The Future of Science In America: The Election Issue," co-published by LeapsMag, the Aspen Institute Science & Society Program, and GOOD.
In the mid-1960s, a documentary producer in New York City wondered if the addictive jingles, clever visuals, slogans, and repetition of television ads—the ones that were captivating young children of the time—could be harnessed for good. Over the course of three months, she interviewed educators, psychologists, and artists, and the result was a bonanza of ideas.
Perhaps a new TV show could teach children letters and numbers in short animated sequences? Perhaps adults and children could read together with puppets providing comic relief and prompting interaction from the audience? And because it would be broadcast through a device already in almost every home, perhaps this show could reach across socioeconomic divides and close an early education gap?
Soon after Joan Ganz Cooney shared her landmark report, "The Potential Uses of Television in Preschool Education," in 1966, she was prototyping show ideas, attracting funding from The Carnegie Corporation, The Ford Foundation, and The Corporation for Public Broadcasting, and co-founding the Children's Television Workshop with psychologist Lloyd Morrisett. And then, on November 10, 1969, informal learning was transformed forever with the premiere of Sesame Street on public television.
For its first season, Sesame Street won three Emmy Awards and a Peabody Award. Its star, Big Bird, landed on the cover of Time Magazine, which called the show "TV's gift to children." Fifty years later, it's hard to imagine an approach to informal preschool learning that isn't shaped by Sesame Street.
And that approach can be boiled down to one word: Entertainment.
Despite decades of evidence from Sesame Street—one of the most studied television shows of all time—and more research from social science, psychology, and media communications, we haven't yet taken Ganz Cooney's concepts to heart in educating adults. Adults have news programs and documentaries and educational YouTube channels, but no Sesame Street. So why don't we? Here's how we can design a new kind of television to make science engaging and accessible for a public that is all too often intimidated by it.
We have to start from the realization that America is a nation of high-school graduates. By the end of high school, many students have abandoned science because they think it's too difficult, and as a nation, we've made it acceptable for any one of us to say "I'm not good at science" and offload thinking to those who might be. So is it surprising that so many Americans believe in conspiracy theories, like the 25% who believe the release of COVID-19 was planned, the one in ten who believe the Moon landing was a hoax, or the 30–40% who think the condensation trails of planes are actually nefarious chemtrails? If we're meeting people where they are, the aim can't be to get the audience from an A to an A+, but from an F to a D, without judging where they start.
There's also a natural compulsion for a well-meaning educator to fill a literacy gap with a barrage of information, but this is what I call "factsplaining," and we know it doesn't work. And worse, it can backfire. In one study from 2014, parents were provided with factual information about vaccine safety, and it was the group that was already the most averse to vaccines that uniquely became even more averse.
Why? Our social identities and cognitive biases are stubborn gatekeepers when it comes to processing new information. We filter ideas through pre-existing beliefs—our values, our religions, our political ideologies. Incongruent ideas are rejected. Congruent ideas, no matter how absurd, are allowed through. We hear what we want to hear, and then our brains justify the input by creating narratives that preserve our identities. Even when we have all the facts, we can use them to support any worldview.
But social science has revealed many mechanisms for hijacking these processes through narrative storytelling, and this can form the foundation of a new kind of educational television.
Could new television series establish the baseline narratives for novel science like gene editing, quantum computing, or artificial intelligence?
As media creators, we can reject factsplaining and instead construct entertaining narratives that disrupt cognitive processes. Two decades of research tell us that when people are immersed in entertaining fiction narratives, they loosen their defenses, opening a path for new information, shifting attitudes, and inspiring new behavior. Where news about hot-button issues like climate change or vaccination might trigger resistance or a backfire effect, fiction can be crafted to be absorbing and, as a result, persuasive.
But the narratives can't be stuffed with information. They must be simplified. If this feels like the opposite of what an educator should do, consider "exemplification," a framing device that reduces the complexity of information without oversimplifying it: telling the stories of individuals in specific circumstances that speak to a greater issue without needing to explain it all. It's a technique you've seen used in biopics. The Discovery Channel true-crime miniseries Manhunt: Unabomber does many things well from a science storytelling perspective, including exemplifying the virtues of the scientific method through a character who argues for a new field of science, forensic linguistics, to catch one of the most notorious domestic terrorists in U.S. history.
We must also appeal to the audience's curiosity. We know curiosity is such a strong driver of human behavior that it can even counteract the biases put up by one's political ideology around subjects like climate change. If we treat science information like a product—and we should—advertising research tells us we can maximize curiosity through a Goldilocks effect. If the information is too complex, your show might as well be a PowerPoint presentation. If it's too simple, it's Sesame Street. There's a sweet spot for creating intrigue about new information when there's a moderate cognitive gap.
The science of "identification" tells us that the more endearing a main character is to a viewer, the more likely the viewer will adopt the character's worldview and journey of change. This insight gives us further incentive to craft characters who reflect our audiences. If we accept our biases for what they are, we can understand why the messenger becomes more important than the message: without an appropriate messenger, the message becomes faint and ineffective. And research confirms that the stereotype-busting doctor-skeptic Dana Scully of The X-Files, a popular science-fiction series, was an inspiration for a generation of women who pursued science careers.
With these directions, we can start making a new kind of television. But is television itself still the right delivery medium? Americans do spend six hours per day—a quarter of their lives—watching video. And even with the rise of social media and apps, science-themed television shows remain popular, with four out of five adults reporting that they watch shows about science at least sometimes. CBS's The Big Bang Theory was the most-watched show on television in the 2017–2018 season, and Cartoon Network's Rick & Morty is the most popular comedy series among millennials. And medical and forensic dramas continue to be broadcast staples. So yes, it's as true today as it was in the 1980s when George Gerbner, the "cultivation theory" researcher who studied the long-term impacts of television images, wrote, "a single episode on primetime television can reach more people than all science and technology promotional efforts put together."
We know from cultivation theory that media images can shape our views of scientists. Quick, picture a scientist! Was it an old, white man with wild hair in a lab coat? If most Americans don't encounter research science firsthand, it's media that dictates how we perceive science and scientists. Characters like Sheldon Cooper and Rick Sanchez become the model. But we can correct that by representing professionals more accurately on-screen and writing characters more like Dana Scully.
Could new television series establish the baseline narratives for novel science like gene editing, quantum computing, or artificial intelligence? Or could new series counter the misinfodemics surrounding COVID-19 and vaccines through more compelling, corrective narratives? Social science has given us a blueprint suggesting they could. Binge-watching a show like the surreal NBC sitcom The Good Place doesn't replace a Ph.D. in philosophy, but its use of humor plants the seed of continued interest in a new subject. The goal of persuasive entertainment isn't to replace formal education, but it can inspire, shift attitudes, increase confidence in the knowledge of complex issues, and otherwise prime viewers for continued learning.
Phil Gutis never had a stellar memory, but when he reached his early 50s, it became a problem he could no longer ignore. He had trouble calculating how much to tip after a meal, finding things he had just put on his desk, and understanding simple driving directions.
From 1998-2017, industry sources reported 146 failed attempts at developing Alzheimer's drugs.
So three years ago, at age 54, he answered an ad for a drug trial seeking people experiencing memory issues. He scored so low in the memory testing he was told something was wrong. M.R.I.s and PET scans confirmed that he had early-onset Alzheimer's disease.
Gutis, who is a former New York Times reporter and American Civil Liberties Union spokesman, felt fortunate to get into an advanced clinical trial of a new treatment for Alzheimer's disease. The drug, called aducanumab, had shown promising results in earlier studies.
Four years of data had found that the drug effectively reduced the burden of protein fragments called beta-amyloids, which destroy connections between nerve cells. Amyloid plaques are found in the brains of patients with Alzheimer's disease and are associated with impairments in thinking and memory.
Gutis eagerly participated in the clinical trial and received 35 monthly infusions. "For the first 20 infusions, I did not know whether I was receiving the drug or the placebo," he says. "During the last 15 months, I received aducanumab. But it really didn't matter if I was receiving the drug or the placebo because on March 21, the trial was stopped because [the drug company] Biogen found that the treatments were ineffective."
The news was devastating to the trial participants, but also to the Alzheimer's research community. Earlier this year, another pharmaceutical company, Roche, announced it was discontinuing two of its Alzheimer's clinical trials. From 1998 to 2017, industry sources reported 146 failed attempts at developing Alzheimer's drugs. There are five prescription drugs approved to treat its symptoms, but a cure remains elusive. The latest failures have left researchers scratching their heads about how to attack the disease.
The failure of aducanumab was also another setback for the estimated 5.8 million people who have Alzheimer's in the United States. Of these, around 5.6 million are older than 65 and 200,000 suffer from the younger-onset form, including Gutis.
Gutis is understandably distraught about the cancellation of the trial. "I really had hopes it would work. So did all the patients."
While drug companies have failed so far, another group is stepping up to expedite the development of a cure: venture philanthropists.
For now, he is exercising every day to keep his blood flowing, which is supposed to delay the progression of the disease, and trying to eat a low-fat diet. "But I know that none of it will make a difference. Alzheimer's is a progressive disease. There are no treatments to delay it, let alone cure it."
But while drug companies have failed so far, another group is stepping up to expedite the development of a cure: venture philanthropists. These are successful titans of industry and dedicated foundations who are donating large sums of money to fill a critical gap: funding research to look for new biomarkers.
Biomarkers are neurochemical indicators that can be used to detect the presence of a disease and objectively measure its progression. There are currently no validated biomarkers for Alzheimer's, but researchers are actively studying promising candidates. The hope is that they will find a reliable way to identify the disease even before the symptoms of mental decline show up, so that treatments can be directed at a very early stage.
Howard Fillit, Founding Executive Director and Chief Science Officer of the Alzheimer's Drug Discovery Foundation, says, "We need novel biomarkers to diagnose Alzheimer's disease and related dementias. But pharmaceutical companies don't put money into biomarkers research."
One of the venture philanthropists who has recently stepped up to the task is Bill Gates. In January 2018, he announced his father had Alzheimer's disease in an interview on the Today Show with Maria Shriver, whose father, Sargent Shriver, died of Alzheimer's disease in 2011. Gates told Ms. Shriver that he had invested $100 million into Alzheimer's research, with $50 million of his donation going to the Dementia Discovery Fund, which looks for new cures and treatments.
That August, Gates joined other investors in a new fund called Diagnostics Accelerator. The project aims to support researchers pursuing new ideas for earlier and better diagnosis of the disease.
Gates and other donors committed more than $35 million to help launch it, and this April, Jeff and MacKenzie Bezos joined the coalition, bringing the current program funding to nearly $50 million.
"It makes sense that a challenge this significant would draw the attention of some of the world's leading thinkers."
None of these funders stand to make a profit on their donation, unlike traditional research investments by drug companies. The standard alternatives to such funding have upsides -- and downsides.
As Bill Gates wrote on his blog, "Investments from governments or charitable organizations are fantastic at generating new ideas and cutting-edge research -- but they're not always great at creating usable products, since no one stands to make a profit at the end of the day.
"Venture capital, on the other end of the spectrum, is more likely to develop a test that will reach patients, but its financial model favors projects that will earn big returns for investors. Venture philanthropy splits the difference. It incentivizes a bold, risk-taking approach to research with an end goal of a real product for real patients. If any of the projects backed by Diagnostics Accelerator succeed, our share of the financial windfall goes right back into the fund."
Gutis said he is thankful for any attention given to finding a cure for Alzheimer's.
"Most doctors and scientists will tell you that we're still in the dark ages when it comes to fully understanding how the brain works, let alone figuring out the cause or treatment for Alzheimer's.
"It makes sense that a challenge this significant would draw the attention of some of the world's leading thinkers. I only hope they can be more successful with their entrepreneurial approach to finding a cure than the drug companies have been with their more traditional paths."
If any malady proves the fragile grace of the human genome, it is sickle cell disease.
If experimental treatments receive regulatory approval, it would be a watershed breakthrough for tens of thousands of Americans.
It occurs because of a single "misspelled" letter of DNA. When oxygen runs low, the altered hemoglobin in each red blood cell stiffens into rods, transforming normally round cells into rigid crescents that hamper the flow of blood throughout the body, like leaves clumping in a drain.
Strokes in toddlers are merely the beginning of the circulatory calamities this disease may inflict. Most sickled cells cannot carry oxygen through the body, causing anemia as well as excruciating chronic pain. Older patients are at risk of kidney failure, heart disease and all the other collateral damage caused by poor circulation. Few live beyond middle age.
The only way to cure it has been through a bone marrow transplant from a donor, which requires not only a closely matching volunteer, but bouts of chemotherapy to allow new stem cells to take root, as well as rounds of immunosuppressive drugs that may last for years.
Recent advances in genomic medicine may soon alter the disease's outlook, although many obstacles remain.
In one treatment under development, a patient's skin cells are converted into stem cells, which can then be inserted into the bone marrow without the need for a donor. Another treatment, known as gene therapy, involves replacing the aberrant gene in the patient's body with new genetic material.
Although both remain in clinical trials -- and also require at least chemotherapy -- they have shown promise. Matthew Hsieh, a hematologist and staff scientist with the National Heart, Lung, and Blood Institute in Maryland, has performed about 10 gene therapy procedures over the past three years as part of a clinical trial. Ongoing tweaks in the procedure have led to the blood in more recent patients showing sickle cell trait -- not a perfect outcome, but one that leaves patients with far fewer symptoms than if they have the full-blown disease.
If one or both treatments receive regulatory approval, it would be a watershed breakthrough for the tens of thousands of Americans who suffer from the disease.
Yet it is entirely possible many patients may decline the cure.
A Painful History
The vast majority of sickle cell sufferers in the U.S. -- well beyond 90 percent -- are African-American, a population with a historically uneasy relationship toward healthcare.
"There is a lot of data on distrust between African-Americans and American medical institutions," says J. Corey Williams, a psychiatrist with the Children's Hospital of Philadelphia who has written extensively on racial disparities in healthcare. "It comes from a long legacy of feeling victimized by medicine."
"What you hear from many patients is 'I am not going to be your guinea pig, and I am not going to be experimented on.'"
As a result, Williams is among several clinicians interviewed for this story who believe a cure for sickle cell disease would be embraced reluctantly.
"What you hear from many patients is 'I am not going to be your guinea pig, and I am not going to be experimented on.' And so the history of African-Americans and research will manifest as we develop gene therapies for [these] patients," says Christopher L. Edwards, a clinical psychologist and researcher with the Maya Angelou Center for Health Equity at the Wake Forest University School of Medicine.
Fear among African-Americans of becoming guinea pigs is well-founded. The first C-sections and fistula repairs in North America were performed on enslaved women -- all without consent and virtually none with anesthesia.
The 20th century brought the Tuskegee syphilis experiments conducted by the U.S. Public Health Service. Researchers withheld treatment from some 400 African-American men from the 1930s well into the 1970s to observe how the disease progressed -- even though curative antibiotics had been available for decades. The experiment ended only after news reports exposed it.
The long-standing distrust of American healthcare in the African-American community is also baked into the care provided to sickle cell patients. Despite affecting one in 365 African-Americans, there is no disease registry to assist clinical trials, according to Mary Hulihan, a blood disorders epidemiologist with the Centers for Disease Control and Prevention. Edwards says many sufferers are suspicious of being monitored.
Meanwhile, only two drugs are available to alleviate the worst symptoms. The first one, hydroxyurea, received FDA approval only in 1998 -- nearly 90 years after the disease was first diagnosed. Moreover, Edwards says that some sufferers shy away from using hydroxyurea because it is also used to treat cancer. It's part of what he calls the "myth and folklore" in the African-American community about sickle cell disease.
Economics plays a role as well in the often-fragmented care such patients receive. According to CDC data, many patients rely extensively on public insurance programs such as Medicaid, whose coverage varies from state to state.
A Tough Transition
Edwards notes that sickle cell sufferers usually receive good care when they're children because of support provided by family members. But that often breaks down in adulthood. According to CDC data, an adult sickle cell patient visits a hospital emergency room three times as often as a child patient.
The consensus is that the path to a medical cure for sickle cell will first need to be smoothed over with a talk cure.
Modupe Idowu, a hematologist with the University of Texas Health system, estimates that there are perhaps a dozen comprehensive care centers for the estimated 100,000 sickle cell patients in the U.S., including the one she operates in Houston. That means a significant proportion of those afflicted are on their own to procure care.
And since many patients are on Medicaid, "a lot of hematologists that train to take care of blood disorders, many are not interested in treating [sickle cell disease] because the reimbursement for providers is not great," Idowu says.
Hsieh acknowledges that many of his patients can be suspicious about the care they are receiving. Frustration with fragmented care is usually the biggest driver, he adds.
Meanwhile, the skepticism that patients have about the treatments they seek is often reciprocated by their caregivers.
"The patients have experiences with medication and know what works at a very young age (for their pain)," Edwards says. Such expertise, when demonstrated by an African-American patient, often leads to their being labeled as narcotics seekers.
The Correct Path
This all raises the question of how to deploy a cure. Idowu, who regularly holds town hall-style meetings with Houston-area patients, often must allay anxieties. For example, the gene therapy approach uses a harmless virus to transport new genetic material into cells. That virus happens to be a benign version of HIV, and convincing patients they won't be infected with HIV is a fraught issue.
The consensus is that the path to a medical cure for sickle cell will first need to be smoothed over with a talk cure.
Idowu tries to hammer home the fact that patients are afforded vastly more protections than in the past. "There are a lot of committees and investigational review boards that keep track of clinical trials; things just don't happen anymore as they did in the past," she says. She also believes it helps if more providers of color communicate to patients.
Hsieh is very straightforward with his patients. He informs them about the HIV vector but assures them no one has ever tested positive for the virus as a result of its use.
Edwards notes that since many patients suffer psychosocial trauma as a result of their chronic pain, there already is some counseling infrastructure in place to help them cope. He believes such resources will have to be stretched further as a cure looms closer.
In the absence of formal mental health services, straight talk may be the best way to overcome wariness.
"If patients have misgivings, we try our best to address them, and let them know at the end of the day it is their decision to make," Hsieh says. "And even the patients who have gone through the gene therapy and it didn't work well -- they're still glad they took the chance."