What’s the Right Way to Regulate Gene-Edited Crops?
In the next few decades, humanity faces its biggest food crisis since the invention of the plow. The planet's population, currently 7.6 billion, is expected to reach 10 billion by 2050; to avoid mass famine, according to the World Resources Institute, we'll need to produce 70 percent more calories than we do today.
Meanwhile, climate change will bring intensifying assaults by heat, drought, storms, pests, and weeds, depressing farm yields around the globe. Epidemics of plant disease—already laying waste to wheat, citrus, bananas, coffee, and cacao in many regions—will spread ever further through the vectors of modern trade and transportation.
So here's a thought experiment: Imagine that a cheap, easy-to-use, and rapidly deployable technology could make crops more fertile and strengthen their resistance to these looming threats. Imagine that it could also render them more nutritious and tastier, with longer shelf lives and less vulnerability to damage in shipping—adding enhancements to human health and enjoyment, as well as reduced food waste, to the possible benefits.
Finally, imagine that crops bred with the aid of this tool might carry dangers. Some could contain unsuspected allergens or toxins. Others might disrupt ecosystems, affecting the behavior or very survival of other species, or infecting wild relatives with their altered DNA.
Now ask yourself: If such a technology existed, should policymakers encourage its adoption, or ban it due to the risks? And if you chose the former, how should crops developed by this method be regulated?
In fact, this technology does exist, though its use remains mostly experimental. It's called gene editing, and in the past five years it has emerged as a potentially revolutionary force in many areas—among them, treating cancer and genetic disorders; growing transplantable human organs in pigs; controlling malaria-spreading mosquitoes; and, yes, transforming agriculture. Several versions are currently available, the newest and nimblest of which goes by the acronym CRISPR.
Gene editing is far simpler and more efficient than older methods used to produce genetically modified organisms (GMOs). Unlike those methods, moreover, it can be used in ways that leave no foreign genes in the target organism—an advantage that proponents argue should comfort anyone leery of consuming so-called "Frankenfoods." But debate persists over what precautions must be taken before these crops come to market.
Recently, two of the world's most powerful regulatory bodies offered very different answers to that question. The United States Department of Agriculture (USDA) declared in March 2018 that it "does not currently regulate, or have any plans to regulate" plants that are developed through most existing methods of gene editing. The Court of Justice of the European Union (ECJ), by contrast, ruled in July that such crops should be governed by the same stringent regulations as conventional GMOs.
Each announcement drew protests, for opposite reasons. Anti-GMO activists assailed the USDA's statement, arguing that all gene-edited crops should be tested and approved before marketing. "You don't know what those mutations or rearrangements might do in a plant," warned Michael Hansen, a senior scientist with the advocacy group Consumers Union. Biotech boosters griped that the ECJ's decision would stifle innovation and investment. "By any sensible standard, this judgment is illogical and absurd," wrote the British newspaper The Observer.
Yet some experts suggest that the broadly permissive American approach and the broadly restrictive EU policy are equally flawed. "What's behind these regulatory decisions is not science," says Jennifer Kuzma, co-director of the Genetic Engineering and Society Center at North Carolina State University, a former advisor to the World Economic Forum, who has researched and written extensively on governance issues in biotechnology. "It's politics, economics, and culture."
The U.S. Welcomes Gene-Edited Food
Humans have been modifying the genomes of plants and animals for 10,000 years, using selective breeding—a hit-or-miss method that can take decades or more to deliver rewards. In the mid-20th century, we learned to speed up the process by exposing organisms to radiation or mutagenic chemicals. But it wasn't until the 1980s that scientists began modifying plants by altering specific stretches of their DNA.
Today, about 90 percent of the corn, cotton, and soybeans planted in the U.S. are GMOs; such crops cover nearly 4 million square miles (10 million square kilometers) of land in 29 countries. Most of these plants are transgenic, meaning they contain genes from an unrelated species—often as biologically alien as a virus or a fish. Their modifications are designed primarily to boost profit margins for mechanized agribusiness: allowing crops to withstand herbicides so that weeds can be controlled by mass spraying, for example, or to produce their own pesticides to lessen the need for chemical inputs.
In the early days, the majority of GM crops were created by extracting the gene for a desired trait from a donor organism, multiplying it, and attaching it to other snippets of DNA—usually from a microbe called an agrobacterium—that could help it infiltrate the cells of the target plant. Biotechnologists injected these gene constructs into the target, hoping at least one would land in a place where it would perform its intended function; if not, they kept trying. The process was quicker than conventional breeding, but still complex, scattershot, and costly.
Because agrobacteria can cause plant tumors, Kuzma explains, policymakers in the U.S. decided to regulate GMO crops under an existing law, the Plant Pest Act of 1957, which addressed dangers like imported trees infested with invasive bugs. Every GMO containing the DNA of agrobacterium or another plant pest had to be tested to see whether it behaved like a pest, and undergo a lengthy approval process. By 2010, however, new methods had been developed for creating GMOs without agrobacteria; such plants could typically be marketed without pre-approval.
Soon after that, the first gene-edited crops began appearing. If old-school genetic engineering was a shotgun, techniques like TALEN and CRISPR were a scalpel—or the search-and-replace function on a computer program. With CRISPR/Cas9, for example, an enzyme that bacteria use to recognize and chop up hostile viruses is reprogrammed to find and snip out a desired bit of a plant or other organism's DNA. The enzyme can also be used to insert a substitute gene. If a DNA sequence is simply removed, or the new gene comes from a similar species, the changes in the target plant's genotype and phenotype (its general characteristics) may be no different from those that could be produced through selective breeding. If a foreign gene is added, the plant becomes a transgenic GMO.
This development, along with the emergence of non-agrobacterium GMOs, eventually prompted the USDA to propose a tiered regulatory system for all genetically engineered crops, beginning with an initial screening for potentially hazardous metabolites or ecological impacts. (The screening was intended, in part, to guard against the "off-target effects"—stray mutations—that occasionally appear in gene-edited organisms.) If no red flags appeared, the crop would be approved; otherwise, it would be subject to further review, and possible regulation.
The plan was unveiled in January 2017, during the last week of the Obama presidency. Then, under the Trump administration, it was shelved. Although the USDA continues to promise a new set of regulations, the only hint of what they might contain has been Secretary of Agriculture Sonny Perdue's statement last March that gene-edited plants would remain unregulated if they "could otherwise have been developed through traditional breeding techniques, as long as they are not plant pests or developed using plant pests."
Because transgenic plants could not be "developed through traditional breeding techniques," this statement could be taken to mean that gene editing in which foreign DNA is introduced might actually be regulated. But because the USDA regulates conventional transgenic GMOs only if they trigger the plant-pest stipulation, experts assume gene-edited crops will face similarly limited oversight.
Meanwhile, companies are already teeing up gene-edited products for the U.S. market. An herbicide-resistant oilseed rape, developed using a proprietary technique, has been available since 2016. A cooking oil made from TALEN-tweaked soybeans, designed to have a healthier fatty-acid profile, is slated for release within the next few months. A CRISPR-edited "waxy" corn, designed with a starch profile ideal for processed foods, should be ready by 2021.
In all likelihood, none of these products will have to be tested for safety.
In the E.U., Stricter Rules Apply
Now let's look at the European Union. Since the late 1990s, explains Gregory Jaffe, director of the Project on Biotechnology at the Center for Science in the Public Interest, the EU has had a "process-based trigger" for genetically engineered products: "If you use recombinant DNA, you are going to be regulated." All foods and animal feeds must be approved and labeled if they consist of or contain more than 0.9 percent GM ingredients. (In the U.S., "disclosure" of GM ingredients is mandatory if someone asks, but labeling is not required.) The only GM crop that can be commercially grown in EU member nations is a type of insect-resistant corn, though some countries allow imports.
European scientists helped develop gene editing, and they—along with the continent's biotech entrepreneurs—have been busy developing applications for crops. But European farmers seem more divided over the technology than their American counterparts. The main French agricultural trades union, for example, supports research into non-transgenic gene editing and its exemption from GMO regulation. But it was the country's small-farmers' union, the Confédération Paysanne, along with several allied groups, that in 2015 submitted a complaint to the ECJ, asking that all plants produced via mutagenesis—including gene-editing—be regulated as GMOs.
At this point, it should be mentioned that in the past 30 years, large population studies have found no sign that consuming GM foods is harmful to human health. GMO critics can, however, point to evidence that herbicide-resistant crops have encouraged overuse of herbicides, giving rise to poison-proof "superweeds," polluting the environment with suspected carcinogens, and inadvertently killing beneficial plants. Those allegations were key to the French plaintiffs' argument that gene-edited crops might similarly do unexpected harm. (Disclosure: Leapsmag's parent company, Bayer, recently acquired Monsanto, a maker of herbicides and herbicide-resistant seeds. Also, Leaps by Bayer, an innovation initiative of Bayer and Leapsmag's direct founder, has funded a biotech startup called JoynBio that aims to reduce the amount of nitrogen fertilizer required to grow crops.)
In the end, the EU court found in the Confédération's favor on gene editing—though the court maintained the regulatory exemption for mutagenesis induced by chemicals or radiation, citing the "long safety record" of those methods.
The ruling was "scientifically nonsensical," fumes Rodolphe Barrangou, a French food scientist who pioneered CRISPR while working for DuPont in Wisconsin and is now a professor at NC State. "It's because of things like this that I'll never go back to Europe."
Nonetheless, the decision was consistent with longstanding EU policy on crops made with recombinant DNA. Given the difficulty and expense of getting such products through the continent's regulatory system, many other European researchers may wind up following Barrangou to America.
Getting to the Root of the Cultural Divide
What explains the divergence between the American and European approaches to GMOs—and, by extension, gene-edited crops? In part, Jennifer Kuzma speculates, it's that Europeans have a different attitude toward eating. "They're generally more tied to where their food comes from, where it's produced," she notes. They may also share a mistrust of government assurances on food safety, born of the region's Mad Cow scandals of the 1980s and '90s. In Catholic countries, consumers may have misgivings about tinkering with the machinery of life.
But the principal factor, Kuzma argues, is that European and American agriculture are structured differently. "GM's benefits have mostly been designed for large-scale industrial farming and commodity crops," she says. That kind of farming is dominant in the U.S., but not in Europe, leading to a different balance of political power. In the EU, there was less pressure on decisionmakers to approve GMOs or exempt gene-edited crops from regulation—and more pressure to adopt a GM-resistant stance.
Such dynamics may be operating in other regions as well. In China, for example, the government has long encouraged research in GMOs; a state-owned company recently acquired Syngenta, a Swiss-based multinational corporation that is a leading developer of GM and gene-edited crops. GM animal feed and cooking oil can be freely imported. Yet commercial cultivation of most GM plants remains forbidden, out of deference to popular suspicions of genetically altered food. "As a new item, society has debates and doubts on GMO techniques, which is normal," President Xi Jinping remarked in 2014. "We must be bold in studying it, [but] be cautious promoting it."
The proper balance between boldness and caution is still being worked out all over the world. Europe's process-based approach may prevent researchers from developing crops that, with a single DNA snip, could rescue millions from starvation. EU regulations will also make it harder for small entrepreneurs to challenge Big Ag with a technology that, as Barrangou puts it, "can be used affordably, quickly, scalably, by anyone, without even a graduate degree in genetics." America's product-based approach, conversely, may let crops with hidden genetic dangers escape detection. And by refusing to investigate such risks, regulators may wind up exacerbating consumers' doubts about GM and gene-edited products, rather than allaying them.
Perhaps the solution lies in combining both approaches, and adding some flexibility and nuance to the mix. "I don't believe in regulation by the product or the process," says CSPI's Jaffe. "I think you need both." Deleting a DNA base pair to silence a gene, for example, might be less risky than inserting a foreign gene into a plant—unless, say, the deletion enables the production of an allergen while the transgene comes from something as harmless as spinach.
Kuzma calls for the creation of "cooperative governance networks" to oversee crop genome editing, similar to bodies that already help develop and enforce industry standards in fisheries, electronics, industrial cleaning products, and (not incidentally) organic agriculture. Such a network could include farmers, scientists, advocacy groups, private companies, and governmental agencies. "Safety isn't an all-or-nothing concept," Kuzma says. "Science can tell you what some of the issues are in terms of risk and benefit, but it can't tell you what to regulate. That's a values-based decision."
By drawing together a wide range of stakeholders to make such decisions, she adds, "we're more likely to anticipate future consequences, and to develop a robust approach—one that not only seems more legitimate to people, but is actually just plain old better."
Thousands of Vaccine Volunteers Got a Dummy Shot. Should They Get the Real Deal Now?
The highly anticipated rollout of a COVID-19 vaccine poses ethical considerations: When will trial volunteers who got a placebo be vaccinated? And how will this affect the data in those trials?
It's an issue that vaccine manufacturers and study investigators are wrestling with as the Food and Drug Administration is expected to grant emergency use authorization this weekend to a vaccine developed by Pfizer and the German company BioNTech. Another vaccine, produced by Moderna, is nearing authorization in the United States.
The most vulnerable—health care workers and nursing home residents—are deemed eligible to receive the initial limited supply in accordance with priority recommendations from the Centers for Disease Control and Prevention (CDC).
With health care workers constituting an estimated 20 percent of trial participants, this question also comes to the fore: "Is it now ethically imperative that we offer them the vaccine, those who have had placebo?" says William Schaffner, an infectious diseases physician at Vanderbilt University and an adviser to the CDC's immunization practices committee.
When a "gold-standard" measure becomes available, participants in the placebo group "would ordinarily be notified" of the strong public health recommendation to opt for immunization, says Johan Bester, interim assistant dean for biomedical science education and director of bioethics at the University of Nevada, Las Vegas School of Medicine.
"If a treatment or prevention exists that we know works, it is unethical to withhold it from people who would benefit from it just to answer a research question." This moral principle poses a quandary for ethicists and physicians alike, as they ponder possible paths to proceed with vaccination amid ongoing trials. Rigorous trials are double-blinded—neither the participants nor the investigators know who received the actual vaccine and who got a dummy injection.
"The intent of these trials is to follow these folks for up to two years," says Marci Drees, infection prevention officer and hospital epidemiologist for ChristianaCare in Wilmington, Delaware. At a minimum, she adds, researchers would prefer to monitor participants for six months.
But in the midst of a pandemic, that may not be feasible. Prolonged exposure to the highly contagious and lethal virus could have dire consequences.
To avoid compromising the integrity of the blinded data, "there are some potentially creative solutions," Drees says. For instance, trial participants could receive the opposite of what they initially got, whether it was the vaccine or the placebo.
One factor in this decision-making process depends on when a particular trial is slated to conclude. If that time is approaching, the risk of waiting would be lower than if the trial is only halfway in progress, says Eric Lofgren, an epidemiologist at Washington State University who has studied the impact of COVID-19 in jails and at in-person sporting events.
Sometimes a study concludes earlier than the projected completion date. "All clinical trials have a data and safety monitoring board that reviews the interim results," Lofgren says. The board may halt a trial after finding evidence of harm, or when a treatment or vaccine has proven to be "sufficiently good," rendering it unethical to deprive the placebo group of its benefits.
The initial months of a trial are most crucial for assessing a vaccine's safety. Differences between the trial groups are illuminating when fewer individuals who got the active vaccine contract the virus and develop symptoms compared with the placebo recipients. After that point, in vaccine-administered participants, "you can still follow safety over a long-term period of time without actually continuing to have a placebo group for comparison," says Dial Hewlett Jr., medical director for disease control at the Westchester County Department of Health in New York.
Even outside of a trial, safety is paramount and any severe side effects that occur will be closely monitored and investigated through national reporting networks. For example, regulators in the U.K. are investigating several rare but serious allergic reactions to the Pfizer vaccine given on Tuesday. The FDA has asked Pfizer to track allergic reactions in its safety monitoring plan, and some experts are proposing that Pfizer conduct a separate study of the vaccine on people with a history of severe allergies.
As the FDA eventually grants authorization to multiple vaccines, more participants are likely to leave trials and opt to be vaccinated. It is important that enough participants choose to stay in ongoing trials, says Nicole Hassoun, professor of philosophy at the State University of New York at Binghamton, where she directs the Global Health Impact program to extend medical access to the poor.
She's hopeful that younger participants and individuals without underlying medical conditions will make that determination. But the departure of too many participants at high risk for the virus would make it more difficult to evaluate the vaccine's safety and efficacy in those populations, Hassoun says, while acknowledging, "We can't have the best of both worlds."
One solution would entail allowing health care workers to exit a trial after a vaccine is approved, even though this would result in "a conundrum when the next group of people are brought forward to get the vaccine—whether they're people age 65 and older or they're essential workers, or whoever they are," says Vanderbilt physician Schaffner, who is a former board member of the Infectious Diseases Society of America. "All of a sudden, you'll have an erosion of the volunteers who are in the trial."
For now, one way or another, experts agree that current and subsequent trials should proceed. There is a compelling reason to identify additional vaccines with potentially greater effectiveness, fewer side effects, or less complex delivery methods that don't require storage at extremely low temperatures.
"Continuing with existing vaccine trials and starting others remains important," says Nir Eyal, professor and director of Rutgers University's Center for Population-Level Bioethics in New Brunswick, New Jersey. "We still need to tell how much proven vaccines block infections and how long their protection lasts. And populations around the world need vaccines that are easier to store and deliver, or simply cheaper."
But once a safe and effective vaccine is approved in the United States, "it would not be ethically appropriate to do placebo trials to test new vaccines," says bioethicist Bester at the University of Nevada, Las Vegas School of Medicine. "One possibility if a new vaccine emerges, is to test it against existing vaccines."
In a letter sent to trial volunteers in November, Pfizer and BioNTech committed to establishing "a process that would allow interested participants in the placebo group who meet the eligibility criteria for early access in their country to 'cross-over' to the vaccine group." The trial plans to continue monitoring all subjects regardless of whether people in the placebo group cross over, Pfizer said in a presentation to the FDA today. After Pfizer has collected six months of safety data, in April 2021, it plans to ask the FDA for full approval of the vaccine.
In the meantime, the company pledged to update volunteers as they obtain more input from regulatory authorities. "Thank you again for making a difference by being a part of this study," they wrote. "It is only through the efforts of volunteers like you that reaching this important milestone and developing a potential vaccine against COVID-19 is possible."
CORRECTION: An earlier version of this article mistakenly stated that the FDA would be granting emergency "approval" to the Pfizer/BioNTech vaccine, rather than "emergency use authorization." We regret the error.
Since March, 35 patients in the care of Dr. Gregory Jicha, a neurologist at the University of Kentucky, have died of Alzheimer's disease or related dementia.
Meanwhile, with 233 active clinical trials underway to find treatments, Jicha wonders why mainstream media outlets don't do more to highlight potential solutions to the physical, emotional and economic devastation of these diseases. "Unfortunately, it's not until we're right at the cusp of a major discovery that anybody pays attention to these very promising agents," he says.
Heightened awareness would bring more resources for faster progress, according to Jicha. Otherwise, he's concerned that current research pipelines will take more than a decade to deliver results.
In recent years, newspapers with national readerships have devoted more technology reporting to key developments in social media, artificial intelligence, wired gadgets and telecom. Less prominent has been news about biotech—innovations based on biology research—and new medicines emerging from this technology. That's the impression of Jicha as well as Craig Lipset, former head of clinical innovation at Pfizer. "Scientists and clinicians are entirely invested [in biotech], yet no one talks about their discoveries," he says.
With the popular press rightly focusing on progress with a vaccine for COVID-19 this year, notable developments in biomarkers, Alzheimer's and cancer research, gene therapies for cystic fibrosis, and therapeutics related to biological age may be going unreported. Jennifer Goldsack, Executive Director of the nonprofit Digital Medicine Society, is confused over the media's soft touch with biotech. "I'm genuinely interested in understanding what editors of technology sections think the public wants to be reading."
The Numbers on Media Coverage
A newspaper's health section is a sensible fit for biotech reporting. In 2020, these departments have concentrated largely on COVID-19—as they should—while sections on technology and science don't necessarily pick up on other biotech news. Emily Mullin, staff writer for the tech magazine OneZero, has observed a gap in newspaper coverage. "You have a lot of [niche outlets] reporting biotech on the business side for industry experts, and you have a lot of reporting heavily from the science side focused on [readers who are] scientists. But there aren't a lot of outlets doing more humanizing coverage of biotech."
Indeed, the volume of coverage by top-tier media outlets in the U.S. for non-COVID biotech has dropped 32 percent since the pandemic spiked in March, according to an analysis run for this article by Commetric, a company that looks at media reputation for clients in many sectors including biotech and artificial intelligence. Meanwhile, the volume of coverage for AI has held steady, up one percent.
Commetric's CEO, Magnus Hakansson, thinks important biotech stories were omitted from mainstream coverage even before the world fell into the grips of the virus. "Apart from COVID, it's been extremely difficult for biotech companies to push out their discoveries," he says. "People in biotech have to be quite creative when they want to communicate [progress in] different therapeutic areas, and that is a problem."
In mid-February, just before the pandemic dominated the news cycle, researchers used machine learning to find a powerful new antibiotic capable of killing strains of disease-causing bacteria that had previously resisted all known antibiotics. Science-focused outlets hailed the work as a breakthrough, but some nationally read newspapers didn't mention it. "There is this very silent crisis around antibiotic resistance that no one is aware of," says Goldsack. "We could be 50 years away from not being able to give elective surgeries because we are at such a high risk of being unable to control infection."
What's to Gain from More Mainstream Biotech
A brighter public spotlight on biotech could result in greater support and faster progress with research, says Lipset. "One of the biggest delays in drug development is patient recruitment. Patients don't know about the opportunities," he said, because, "clinical research pipelines aren't talked about in the mainstream news." Only about eight percent of oncology patients participate.
The current focus on COVID-19, while warranted, could also be excluding lines of research that seem separate from the virus, but are actually relevant. In September, Nir Barzilai, director of the Institute of Aging Research at Albert Einstein College of Medicine, told me about eight different observational studies finding decreased COVID-19 severity among people taking a drug called metformin, which is believed to slow down the major hallmarks of biological aging, such as inflammation. Once a vaccine is approved and distributed, biologically older people could supplement it with metformin.
"Shining the spotlight on this research now could really be critical because COVID has shown what happens in older adults and how they're more at risk," says Jenna Bartley, a researcher of aging and immunology at the University of Connecticut, but she believes mainstream media sometimes miss stories on anti-aging therapies or portray them inaccurately.
The question remains why.
The Theranos Effect and Other Image Problems
Before the pandemic, Mullin, the biotech writer at OneZero, looked into a story for her editor about a company with a new test for infectious diseases. The company said its test, based on technology for editing genes, was fast, easy to use, and could be tailored to any pathogen. Mullin told her editor the evidence for the test's validity was impressive.
He wondered if readers would agree. "This is starting to sound like Theranos," he said.
The brainchild of entrepreneur Elizabeth Holmes, Theranos was valued at $9 billion in 2014. Time Magazine named Holmes one of its most influential people, and the blood-testing company was heavily covered by the media as a game changer for health outcomes—until Holmes was exposed by the Wall Street Journal as a fraud and criminally charged.
In the OneZero article, Mullin and her editor were careful to explain the gene-editing tech was legit, explicitly distinguishing it from Theranos. "I was like, yes—but this actually works! And they can show it works."
While the Holmes scandal explains some of the mistrust, it's part of a bigger pattern. The public's hopes for biotech have been frustrated repeatedly in recent decades, fostering a media mantra of fool me twice, shame on me. A recent report by Commetric noted that after the bursting of the biotech market bubble in the early 2000s, commentators grew deeply skeptical of the field. An additional source of caution may be the number of researchers in biotech with conflicts of interest such as patents or their own startups. "It's a landmine," Mullin said. "We're conditioned to think that scientists are out for the common good, but they have their own biases."
Yet another source of uncertainty: the long regulatory road and cost for new therapies to be approved by the FDA. The process can take 15 years and over a billion dollars; the percentage of drugs actually crossing the final strand of red tape is notoriously low.
"The only time stories have reached the news is when there's a sensational headline about the cure for cancer," said Lipset, "when, in fact it's about mice, and then things drop off." Meanwhile, consumer protection hurdles for some technologies, such as computer chips, are less onerous than the FDA gauntlet for new medicines. The media may view research breakthroughs in digital tech as more impactful because they're likelier to find their way into commercially available products.
And whereas a handful of digital innovations have been democratized for widespread consumption—96 percent of Americans now own a cell phone, and 72 percent use social media—journalists at nationally read newspapers may see biotech as less attainable for the average reader. Sure, we're all aging, but will the healthcare system grant everyone fair access to treatments for slowing the aging process? Current disparities in healthcare sow reason for doubt.
And yet. Recall Lipset's point that more press coverage would drive greater participation in clinical trials, which could accelerate them and diversify participants. Could mainstream media strike a better balance between cynicism toward biotech and hyping animal studies that probably won't ever benefit the humans reading about them?
Biotech in a Post-COVID World
Imagine it's early 2022. Hopefully, much of the population is protected from the virus through some combination of vaccines, therapeutics, and herd immunity. We're starting to bounce back from the social and economic shocks of 2020. COVID-19 headlines recede from the front pages, then disappear altogether. Gradually, certain aspects of life pick up where they left off in 2019, while a few changes forced by the pandemic prove to be more lasting, some for the better.
Among its possible legacies, the virus could usher in a new era of biotech development and press coverage, with these two trends reinforcing each other. While government has mismanaged its response to the virus, the level of innovation, collaboration and investment in pandemic-related biotech has been compared to the Manhattan Project. "There's no question that vaccine acceleration is a success story," said Kevin Schulman, a professor of medicine and economics at Stanford. "We could use this experience to build new economic models to correct market failures. It could carry over to oncology or Alzheimer's."
As Winston Churchill reportedly said, never let a good crisis go to waste.
Lipset thinks the virus has primed us to pay attention, bringing biotech into the public's consciousness like never before. He's amazed at how many neighbors and old friends from high school are coming out of the woodwork to ask him how clinical trials work. "What happens next is interesting. Does this open a window of opportunity to get more content out? People's appetites have been whetted."
High-profile wins could help to sustain interest, such as the deployment of rapid at-home COVID-19 tests, a version of which the FDA authorized on November 18th. The idea bears some resemblance to the Theranos concept, likewise designed around portable analysis, except this test met the FDA's requirements and has a legitimate chance of changing people's lives. Meanwhile, at least two vaccines are on track to gain government approval in record time. The unprecedented speed could be a catalyst for streamlining inefficiencies in the FDA's approval process in non-emergency situations.
Tests for COVID-19 represent what some view as the future of managing diseases: early detection. This paradigm may be more feasible—and deserving of journalistic ink—than research on diseases in advanced stages, says Azra Raza, professor of medicine at Columbia University. "Journalists have to challenge this conceit of thinking we can cure end-stage cancer," says Raza, author of The First Cell. Beyond animal studies and "exercise helps" articles, she thinks writers should focus on biotech for catching the earliest footprints of cancer when it's more treatable. "Not enough people appreciate the extent of this tragedy, but journalists can help us do it. COVID-19 is a great moment of truth telling."
Another pressing truth is the need for vaccination, as half of Americans have said they'll skip the shots due to concerns about safety and effectiveness. It's not the kind of stumbling block faced by iPhones or social media algorithms. AI stirs plenty of its own controversy, but the public's interest in learning about AI and engaging with it seems to grow regardless. "Who are the publicists doing such a good job for AI that biotechnology is lacking?" Lipset wonders.
The job description of those publicists, whoever they are, could be expanding. Scientists are increasingly using AI to measure the effects of new medicines that target diseases—including COVID-19—and the pathways of aging. Mullin noted the challenge of reporting breakthroughs in the life sciences in ways the public understands. With many newsrooms tightening budgets, fewer writers have science backgrounds, and "biotech is daunting for journalists," she says. "It's daunting for me and I work in this area." Now factor in the additional expertise required to understand biotech and AI. "I learned the ropes for how to read a biotech paper, but I have no idea how to read an AI paper."
Nevertheless, Mullin believes reporters have a duty to scrutinize whether this convergence of AI and biotech will foster better outcomes. "Is it just the shiny new tool we're employing because we can? Will algorithms help eliminate health disparities or contribute to them even more? We need to pay attention."