To Make Science Engaging, We Need a Sesame Street for Adults
This article is part of the magazine, "The Future of Science In America: The Election Issue," co-published by LeapsMag, the Aspen Institute Science & Society Program, and GOOD.
In the mid-1960s, a documentary producer in New York City wondered if the addictive jingles, clever visuals, slogans, and repetition of television ads—the ones that were captivating young children of the time—could be harnessed for good. Over the course of three months, she interviewed educators, psychologists, and artists, and the result was a bonanza of ideas.
Perhaps a new TV show could teach children letters and numbers in short animated sequences? Perhaps adults and children could read together with puppets providing comic relief and prompting interaction from the audience? And because it would be broadcast through a device already in almost every home, perhaps this show could reach across socioeconomic divides and close an early education gap?
Soon after Joan Ganz Cooney shared her landmark report, "The Potential Uses of Television in Preschool Education," in 1966, she was prototyping show ideas, attracting funding from The Carnegie Corporation, The Ford Foundation, and The Corporation for Public Broadcasting, and co-founding the Children's Television Workshop with psychologist Lloyd Morrisett. And then, on November 10, 1969, informal learning was transformed forever with the premiere of Sesame Street on public television.
For its first season, Sesame Street won three Emmy Awards and a Peabody Award. Its star, Big Bird, landed on the cover of Time Magazine, which called the show "TV's gift to children." Fifty years later, it's hard to imagine an approach to informal preschool learning that isn't shaped by Sesame Street.
And that approach can be boiled down to one word: Entertainment.
Despite decades of evidence from Sesame Street—one of the most studied television shows of all time—and further research from social science, psychology, and media communications, we haven't yet taken Ganz Cooney's concepts to heart in educating adults. Adults have news programs, documentaries, and educational YouTube channels, but no Sesame Street. So why not build one? Here's how we can design a new kind of television to make science engaging and accessible for a public that is all too often intimidated by it.
We have to start from the realization that America is a nation of high-school graduates. By the end of high school, many students have decided to abandon science because they think it's too difficult, and as a nation, we've made it acceptable for any one of us to say "I'm not good at science" and offload the thinking to those who might be. So is it surprising that so many Americans believe in conspiracy theories: the 25% who believe the release of COVID-19 was planned, the one in ten who believe the Moon landing was a hoax, or the 30–40% who think the condensation trails of planes are actually nefarious chemtrails? If we're meeting people where they are, the aim can't be to get the audience from an A to an A+, but from an F to a D, and without judgment of where they start.
There's also a natural compulsion for a well-meaning educator to fill a literacy gap with a barrage of information, but this is what I call "factsplaining," and we know it doesn't work. And worse, it can backfire. In one study from 2014, parents were provided with factual information about vaccine safety, and it was the group that was already the most averse to vaccines that uniquely became even more averse.
Why? Our social identities and cognitive biases are stubborn gatekeepers when it comes to processing new information. We filter ideas through pre-existing beliefs—our values, our religions, our political ideologies. Incongruent ideas are rejected. Congruent ideas, no matter how absurd, are allowed through. We hear what we want to hear, and then our brains justify the input by creating narratives that preserve our identities. Even when we have all the facts, we can use them to support any worldview.
But social science has revealed many mechanisms for hijacking these processes through narrative storytelling, and this can form the foundation of a new kind of educational television.
As media creators, we can reject factsplaining and instead construct entertaining narratives that disrupt these cognitive processes. Two-decade-old research tells us that when people are immersed in entertaining fictional narratives, they loosen their defenses, opening a path for new information to land, attitudes to shift, and new behaviors to take hold. Where news about hot-button issues like climate change or vaccination might trigger resistance or a backfire effect, fiction can be crafted to be absorbing and, as a result, persuasive.
But the narratives can't be stuffed with information. They must be simplified. If this feels like the opposite of what an educator should be doing, consider "exemplification": a framing device that reduces the complexity of information, without oversimplifying it, by telling the stories of individuals in specific circumstances that speak to the greater issue without needing to explain it all. It's a technique you've seen used in biopics. The Discovery Channel true-crime miniseries Manhunt: Unabomber does many things well from a science-storytelling perspective, including exemplifying the virtues of the scientific method through a character who argues for a new field of science, forensic linguistics, to catch one of the most notorious domestic terrorists in U.S. history.
We must also appeal to the audience's curiosity. We know curiosity is such a strong driver of human behavior that it can even counteract the biases put up by one's political ideology around subjects like climate change. If we treat science information like a product—and we should—advertising research tells us we can maximize curiosity through a Goldilocks effect. If the information is too complex, your show might as well be a PowerPoint presentation. If it's too simple, it's Sesame Street. There's a sweet spot for creating intrigue about new information when there's a moderate cognitive gap.
The science of "identification" tells us that the more endearing a main character is to a viewer, the more likely the viewer is to adopt the character's worldview and journey of change. This insight gives us further incentive to craft characters reflective of our audiences. If we accept our biases for what they are, we can understand why the messenger becomes more important than the message: without an appropriate messenger, the message becomes faint and ineffective. And research confirms that the stereotype-busting doctor-skeptic Dana Scully of The X-Files, a popular science-fiction series, was an inspiration for a generation of women who pursued science careers.
With these directions, we can start making a new kind of television. But is television itself still the right delivery medium? Americans do spend six hours per day—a quarter of their lives—watching video. And even with the rise of social media and apps, science-themed television shows remain popular, with four out of five adults reporting that they watch shows about science at least sometimes. CBS's The Big Bang Theory was the most-watched show on television in the 2017–2018 season, and Cartoon Network's Rick & Morty is the most popular comedy series among millennials. And medical and forensic dramas continue to be broadcast staples. So yes, it's as true today as it was in the 1980s when George Gerbner, the "cultivation theory" researcher who studied the long-term impacts of television images, wrote, "a single episode on primetime television can reach more people than all science and technology promotional efforts put together."
We know from cultivation theory that media images can shape our views of scientists. Quick, picture a scientist! Was it an old, white man with wild hair in a lab coat? If most Americans don't encounter research science firsthand, it's media that dictates how we perceive science and scientists. Characters like Sheldon Cooper and Rick Sanchez become the model. But we can correct that by representing professionals more accurately on-screen and writing characters more like Dana Scully.
Could new television series establish the baseline narratives for novel science like gene editing, quantum computing, or artificial intelligence? Or could new series counter the misinfodemics surrounding COVID-19 and vaccines through more compelling, corrective narratives? Social science has given us a blueprint suggesting they could. Binge-watching a show like the surreal NBC sitcom The Good Place doesn't replace a Ph.D. in philosophy, but its use of humor plants the seed of continued interest in a new subject. The goal of persuasive entertainment isn't to replace formal education, but it can inspire, shift attitudes, increase confidence in the knowledge of complex issues, and otherwise prime viewers for continued learning.
Trying to get a handle on CRISPR news in 2019 can be daunting if you haven't been avidly reading up on it for the last five years.
On top of trying to grasp how the science works, and keeping track of its ever-expanding applications, you may also have seen coverage of an ongoing legal battle over who owns the intellectual property behind the gene-editing technology CRISPR-Cas9. And then there's the infamous controversy surrounding a scientist who claimed to have used the tool to edit the genomes of two babies in China last year.
But gene editing is not the only application of CRISPR-based biotechnologies. In the future, it may also be used as a tool to diagnose infectious diseases, which could be a major game changer for medicine and agriculture.
How It Works
CRISPR is an acronym for a naturally occurring DNA sequence that normally protects microbes from viruses. It's been compared to a Swiss army knife that can recognize an invader's DNA and precisely destroy it. Repurposed for humans, CRISPR can be paired with a protein called Cas9 that can detect a person's own DNA sequence (usually a problematic one), cut it out, and replace it with a different sequence. Used this way, CRISPR-Cas9 has become a valuable gene-editing tool that is currently being tested to treat numerous genetic diseases, from cancer to blood disorders to blindness.
CRISPR can also be paired with other proteins, like Cas13, which target RNA, the single-stranded twin of DNA that viruses rely on to infect their hosts and cause disease. In a future clinical setting, CRISPR-Cas13 might be used to diagnose whether you have the flu by cutting a target RNA sequence from the virus. That cleaved sequence could stick to a paper test strip, causing a band to show up, like on a pregnancy test. If the influenza virus and its RNA are not present, no band would show up.
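To make that detection logic concrete by analogy only (the real assay relies on Cas13 enzymes, reporter molecules, and lateral-flow chemistry, not software), a diagnostic of this kind behaves like a pattern search that has been programmed with one known target sequence: a band appears only if that exact signature is present in the sample. The short Python sketch below uses invented sequences and a hypothetical detect function to illustrate the idea.

```python
# Toy analogy for a sequence-programmed diagnostic readout.
# The sequences and names below are invented for illustration only;
# an actual SHERLOCK test uses CRISPR-Cas13 biochemistry, not string matching.

GUIDE_TARGET = "AGGUCAAUCGG"  # known RNA signature the test is "programmed" to find

def detect(sample_rna: str, target: str = GUIDE_TARGET) -> str:
    """Return a pregnancy-test-style readout: a band only if the target is present."""
    return "band visible (positive)" if target in sample_rna else "no band (negative)"

if __name__ == "__main__":
    infected_sample = "CCUAGGUCAAUCGGAAU"  # contains the target signature
    clean_sample = "CCUAGCUAAUAGGAAU"      # does not
    print(detect(infected_sample))  # -> band visible (positive)
    print(detect(clean_sample))     # -> no band (negative)
```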
To understand how close to reality this diagnostic scenario is right now, leapsmag chatted with CRISPR pioneer Dr. Feng Zhang, a molecular biologist at the Broad Institute of MIT and Harvard.
What do you think might be the first point of contact that a regular person or patient would have with a CRISPR diagnostic tool?
FZ: I think in the long run it will be great to see this for, say, at-home disease testing, for influenza and other sorts of important public health [concerns]. To be able to get a readout at home, people can potentially quarantine themselves rather than traveling to a hospital and then carrying the risk of spreading that disease to other people as they get to the clinic.
"You could conceivably get a readout during the same office visit, and then the doctor will be able to prescribe the right treatment right away."
Is this just something that people will use at home, or do you also foresee clinical labs at hospitals applying CRISPR-Cas13 to samples that come through?
FZ: I think we'll see applications in both settings, and I think there are advantages to both. One of the nice things about SHERLOCK [a playful acronym for CRISPR-Cas13's longer name, Specific High-sensitivity Enzymatic Reporter unLOCKing] is that it's rapid; you can get a readout fairly quickly. So, right now, what people do in hospitals is they will collect your sample and then they'll send it out to a clinical testing lab, so you wouldn't get a result back until many hours if not several days later. With SHERLOCK, you could conceivably get a readout during the same office visit, and then the doctor will be able to prescribe the right treatment right away.
I just want to clarify that when you say a doctor would take a sample, that's referring to urine, blood, or saliva, correct?
FZ: Right. Yeah, exactly.
Thinking more long term, are there any Holy Grail applications that you hope CRISPR reaches as a diagnostic tool?
FZ: I think in the developed world we'll hopefully see this being used for influenza testing, and many other viral and pathogen-based diseases—both at home and also in the hospital—but I think the even more exciting direction is that this could be used and deployed in parts of the developing world where there isn't a fancy laboratory with elaborate instrumentation. SHERLOCK is relatively inexpensive to develop, and you can turn it into a paper strip test.
Can you quantify what you mean by relatively inexpensive? What range of prices are we talking about here?
FZ: So without accounting for economies of scale, we estimate that it can cost less than a dollar per test. With economy of scale that cost can go even lower.
Is there value in developing what is actually quite an innovative tool in a way that visually doesn't seem innovative because it's reminiscent of a pregnancy test? And I don't mean that as an insult.
FZ: [Laughs] Ultimately, we want the technology to be as accessible as possible, and pregnancy test strips have such a convenient and easy-to-use form. I think modeling after something that people are already familiar with and just changing what's under the hood makes a lot of sense.
[Photo: Feng Zhang. Credit: Justin Knight, McGovern Institute]
It's probably one of the most accessible at-home diagnostic tools at this point that people are familiar with.
FZ: Yeah, so if people know how to use that, then using something that's very similar to it should make the option very easy.
You've been quite vocal in calling for some pauses in CRISPR-Cas9 research to make sure it doesn't outpace the ethics of establishing pregnancies with that version of the tool. Do you have any concerns about using CRISPR-Cas13 as a diagnostic tool?
FZ: I think overall, the reception for CRISPR-based diagnostics has been overwhelmingly positive. People are very excited about the prospect of using this—for human health and also in agriculture [for] detection of plant infections and plant pathogens, so that farmers will be able to react quickly to infection in the field. If we're looking at contamination of foods by certain bacteria, [food safety] would also be a really exciting application.
Do you feel like the controversies surrounding using CRISPR as a gene-editing tool have overshadowed its potential as a diagnostics tool?
FZ: I don't think so. I think the potential for using CRISPR-Cas9 or CRISPR-Cas12 for gene therapy, and treating disease, has captured people's imaginations, but at the same time, every time I talk with someone about the ability to use CRISPR-Cas13 as a diagnostic tool, people are equally excited. Especially when people see the very simple paper strip that we developed for detecting diseases.
Are CRISPR as a gene-editing tool and CRISPR as a diagnostics tool on different timelines, as far as when the general public might encounter them in their real lives?
FZ: I think they are all moving forward quite quickly. CRISPR as a gene-editing tool is already being deployed in human health and agriculture. We've already seen approvals for growing genome-edited mushrooms, soybeans, and other crop species. So I think people will encounter those in their daily lives in that manner.
Then, of course, for disease treatment, that's progressing rapidly as well. For patients who are affected by sickle cell disease, and also by a degenerative eye disease, clinical trials are already starting in those two areas. Diagnostic tests are also developing quickly, and I think in the coming couple of years, we'll begin to see some of these reaching into the public realm.
"There are probably 7,000 genetic diseases identified today, and most of them don't have any way of being treated."
As far as its limits go, will it be hard to use CRISPR as a diagnostic tool in situations where we don't necessarily understand the biological underpinnings of a disease?
FZ: CRISPR-Cas13, as a diagnostic tool, at least in the current way that it's implemented, is a detection tool—it's not a discovery tool. So if we don't know what we're looking for, then it's going to be hard to develop Cas13 to detect it. But even in the case of a new infectious disease, if DNA sequencing or RNA sequencing information is available for that new virus, then we can very rapidly program a Cas13-based system to detect it, based on that sequence.
What's something you think the public misunderstands about CRISPR, either in general, or specifically as a diagnostic tool, that you wish were better understood?
FZ: That's a good question. CRISPR-Cas9 and CRISPR-Cas12 as gene editing tools, and also CRISPR-Cas13 as a diagnostic tool, are able to do some things, but there are still a lot of capabilities that need to be further developed. So I think the potential for the technology will unfold over the next decade or so, but it will take some time for the full impact of the technology to really get realized in real life.
What do you think that full impact is?
FZ: There are probably 7,000 genetic diseases identified today, and most of them don't have any way of being treated. It will take some time for CRISPR-Cas9 and Cas12 to be really developed for addressing a larger number of those diseases. And then for CRISPR-based diagnostics, I think you'll see the technology being applied in a couple of initial cases, and it will take some time to develop that more broadly for many other applications.
Researchers Are Experimenting With Magic Mushrooms' Fascinating Ability to Improve Mental Health Disorders
Mental illness is a dark undercurrent in the lives of tens of millions of Americans. According to the World Health Organization, about 450 million people worldwide have a mental health disorder; these disorders cut across all demographics, cultures, and socioeconomic classes.
The U.S. National Institute of Mental Health estimates that severely debilitating mental health disorders cost the U.S. more than $300 billion per year, and that's not even counting the human toll of broken lives, devastated families, and a health care system stretched to the limit.
However, one area of research seems to herald the first major breakthrough in decades — hallucinogen-assisted psychotherapy. Drugs like psilocybin (obtained from "magic mushrooms"), LSD, and MDMA (known as the club drug ecstasy) are being tested in combination with talk therapy for a variety of mental illnesses. Administered by a psychotherapist in a safe and controlled environment, these drugs are showing extraordinary results that conventional treatments might take years to accomplish.
But the therapy will likely continue to face an uphill legal battle before it achieves FDA approval. It is up against not only current drug laws (all psychedelics remain illegal on the federal level) and strict FDA regulations, but a powerful status quo that has institutionalized fear of any drug used for recreational purposes.
How We Got Here
According to researchers Sean Belouin and Jack Henningfield, the use of psychedelic drugs has a long and winding history. It's believed that hallucinogenic substances have been used in healing ceremonies and religious rituals for thousands of years. Indigenous people in the U.S., Mexico, and Central and South America still use distillations from the peyote cactus and other hallucinogens in their religious ceremonies. And psilocybin mushrooms, also capable of causing hallucinations, grow throughout the world and are thought to have been used for millennia.
But psychedelic drugs didn't receive much research attention until 1943, when LSD's psychoactive effects were discovered by chemist Albert Hofmann. Hofmann tested the compound, which he had first synthesized years earlier, on himself and found that it had profound mind-altering effects. He made the drug available to psychiatrists who were interested in testing it as an adjunct to talk therapy. There were no truly effective drugs for mental illness at the time, and psychiatrists saw early on the possibility that psychedelics could provide a kind of emotional catharsis that might represent a therapeutic breakthrough for many mental conditions.
During the 1950s and early 1960s, psychedelic drugs saw an increase in use within psychiatry, according to a 2018 article in Neuropharmacology. During this time, research on LSD and other hallucinogens was the subject of more than 1,000 scientific papers, six international conferences, and several dozen books. LSD was widely prescribed to psychiatric patients, and by 1958, Hofmann had identified psilocybin as the hallucinogenic compound in "magic mushrooms," which was then administered to patients as well. By 1965, some type of hallucinogen had been given to more than 40,000 patients.
Then came a sea change. Psychedelic drugs caught the public's attention and there was widespread recreational experimentation. The association with the hippie counterculture alarmed many and led to a legal and cultural backlash that stigmatized psychedelics for decades to come. In 1970, psychedelics were designated Schedule I drugs in the U.S., meaning they were deemed to have "no accepted medical use and a high potential for abuse." Schedule I status also implied that the drugs were more dangerous than cocaine, methamphetamine, Vicodin, and oxycodone, a perception that was far from proven but became an institutionalized part of drug enforcement. Medical use ceased, and research dwindled to nearly zero.
For years, research into hallucinogenic-assisted therapy was basically dormant, until the 1990s when interest started to revive. In the 2000s, the first modern clinical trials of psilocybin were done by Francisco Moreno at the University of Arizona and Matthew Johnson at Johns Hopkins. Scientists in the 2010s, including Robin Carhart-Harris, started studying the use of psychedelics in the treatment of major depressive disorder (MDD).
In small trials with these patients, results showed significant and long-lasting improvement (for at least six months) after only two sessions of psilocybin-assisted therapy. In several studies, the guided experience of administering one of the psychedelic drugs along with psychotherapy seemed to result in marked improvement in a variety of disorders, including depression, anxiety, PTSD, and addiction.
The drugs allowed patients to experience a radical reframing of reality, helping them to become "unstuck" from the anxious and negative tape loops playing in their heads. According to Michael Pollan, an American author and professor of journalism who wrote the book "How to Change Your Mind: What the New Science of Psychedelics Teaches Us About Consciousness, Dying, Addiction, Depression and Transcendence," psychedelics allow patients to see their lives through a kind of wide-angle lens, where boundaries vanish and they're able to experience "consciousness without self." This perspective is usually accompanied by profound feelings of oneness with the universe.
Pollan likens the effect to a fresh blanketing of snow over the deep ruts of unproductive thinking that characterize depression and other mental disorders. Once the new snow has fallen, the ruts disappear and a new path can be chosen. Relief from symptoms comes immediately and, in numerous studies, is sustained for months.
Some of the most influential studies have focused on using psilocybin to treat end-of-life anxiety in patients diagnosed with a terminal illness. In 2016, Stephen Ross and colleagues tested a single dose of psilocybin in 29 subjects with end-of-life anxiety due to a terminal cancer diagnosis, while a control group received a niacin pill. The researchers reported that all 29 patients who received psilocybin had "immediate, substantial, and sustained clinical benefits," even after six months.
In spite of growing evidence for the safety and efficacy of psychedelic-assisted psychotherapy, the practice has major hurdles to cross on its quest for FDA approval. The National Institutes of Health is not currently supporting any clinical trials and the research relies on private sources of funding, often with small research organizations that cannot afford the high cost of clinical trials.
Given the controversial nature of the drugs, researchers in psychedelic-assisted therapies may be cautious about publicity. Leapsmag reached out to several leaders in the field but none agreed to an interview.
Looking Ahead
Still, interest is building in the combination of psychedelic drugs and psychotherapy for treatment-resistant mental illnesses. Two months ago, Johns Hopkins University launched a new psychedelic research center with an infusion of $17 million from private donors. The center will focus on psychedelic-assisted therapies for opioid addiction, Alzheimer's disease, PTSD, and major depression, among other conditions. Currently, of 51 cancer patients enrolled in a Hopkins study, more than half reported a decrease in depression and anxiety after receiving therapy with psilocybin. Two-thirds even said the experience was one of the most meaningful of their lives.
It is not unheard of for Schedule I drugs to make their way into medical use if they're shown, through well-designed clinical trials, to provide a bona fide improvement in a medical condition. MDMA, for example, has been designated a Breakthrough Therapy by the FDA as part of an Investigational New Drug Application. The FDA has agreed to a special protocol assessment that could speed up Phase 3 clinical trials. The next step is for the data to be submitted to the FDA for an in-depth regulatory review. If the FDA agrees, MDMA-assisted therapy could be legalized.
Robin Carhart-Harris believes the first drug that will receive FDA clearance is psilocybin, which he speculates could become legal in the next five to ten years. However, the field of psychedelic-assisted therapy needs more and larger clinical trials, preferably with the support of the NIH.
As Rucker and colleagues noted, the scientific literature bends toward the theme that the drugs are not necessarily therapeutic in and of themselves. It's the use of hallucinogens within a "psychologically supportive context" with a trained expert that's helpful. It's currently unknown how many users of recreational drugs are self-medicating for depression, anxiety, or other mental illnesses. But without the guidance of a knowledgeable psychotherapist, those who are self-medicating may not be helping themselves at all.
Will the positive buzz around psychedelics persuade the NIH to provide the millions of dollars needed to push the field forward? Given the changing climate in public opinion around these drugs and the need for breakthroughs in mental health therapies, it's possible that in the foreseeable future, this bold new therapy will become part of the mental health arsenal.