What’s the Right Way to Regulate Gene-Edited Crops?
In the next few decades, humanity faces its biggest food crisis since the invention of the plow. The planet's population, currently 7.6 billion, is expected to reach 10 billion by 2050; to avoid mass famine, according to the World Resources Institute, we'll need to produce 70 percent more calories than we do today.
Meanwhile, climate change will bring intensifying assaults by heat, drought, storms, pests, and weeds, depressing farm yields around the globe. Epidemics of plant disease—already laying waste to wheat, citrus, bananas, coffee, and cacao in many regions—will spread ever further through the vectors of modern trade and transportation.
So here's a thought experiment: Imagine that a cheap, easy-to-use, and rapidly deployable technology could make crops more fertile and strengthen their resistance to these looming threats. Imagine that it could also render them more nutritious and tastier, with longer shelf lives and less vulnerability to damage in shipping—adding enhancements to human health and enjoyment, as well as reduced food waste, to the possible benefits.
Finally, imagine that crops bred with the aid of this tool might carry dangers. Some could contain unsuspected allergens or toxins. Others might disrupt ecosystems, affecting the behavior or very survival of other species, or infecting wild relatives with their altered DNA.
Now ask yourself: If such a technology existed, should policymakers encourage its adoption, or ban it due to the risks? And if you chose the former alternative, how should crops developed by this method be regulated?
In fact, this technology does exist, though its use remains mostly experimental. It's called gene editing, and in the past five years it has emerged as a potentially revolutionary force in many areas—among them, treating cancer and genetic disorders; growing transplantable human organs in pigs; controlling malaria-spreading mosquitoes; and, yes, transforming agriculture. Several versions are currently available, the newest and nimblest of which goes by the acronym CRISPR.
Gene editing is far simpler and more efficient than older methods used to produce genetically modified organisms (GMOs). Unlike those methods, moreover, it can be used in ways that leave no foreign genes in the target organism—an advantage that proponents argue should comfort anyone leery of consuming so-called "Frankenfoods." But debate persists over what precautions must be taken before these crops come to market.
Recently, two of the world's most powerful regulatory bodies offered very different answers to that question. The United States Department of Agriculture (USDA) declared in March 2018 that it "does not currently regulate, or have any plans to regulate" plants that are developed through most existing methods of gene editing. The Court of Justice of the European Union (ECJ), by contrast, ruled in July that such crops should be governed by the same stringent regulations as conventional GMOs.
Each announcement drew protests, for opposite reasons. Anti-GMO activists assailed the USDA's statement, arguing that all gene-edited crops should be tested and approved before marketing. "You don't know what those mutations or rearrangements might do in a plant," warned Michael Hansen, a senior scientist with the advocacy group Consumers Union. Biotech boosters griped that the ECJ's decision would stifle innovation and investment. "By any sensible standard, this judgment is illogical and absurd," wrote the British newspaper The Observer.
Yet some experts suggest that the broadly permissive American approach and the broadly restrictive EU policy are equally flawed. "What's behind these regulatory decisions is not science," says Jennifer Kuzma, co-director of the Genetic Engineering and Society Center at North Carolina State University, a former advisor to the World Economic Forum, who has researched and written extensively on governance issues in biotechnology. "It's politics, economics, and culture."
The U.S. Welcomes Gene-Edited Food
Humans have been modifying the genomes of plants and animals for 10,000 years, using selective breeding—a hit-or-miss method that can take decades or more to deliver rewards. In the mid-20th century, we learned to speed up the process by exposing organisms to radiation or mutagenic chemicals. But it wasn't until the 1980s that scientists began modifying plants by altering specific stretches of their DNA.
Today, about 90 percent of the corn, cotton and soybeans planted in the U.S. are GMOs; such crops cover nearly 4 million square miles (10 million square kilometers) of land in 29 countries. Most of these plants are transgenic, meaning they contain genes from an unrelated species—often as biologically alien as a virus or a fish. Their modifications are designed primarily to boost profit margins for mechanized agribusiness: allowing crops to withstand herbicides so that weeds can be controlled by mass spraying, for example, or to produce their own pesticides to lessen the need for chemical inputs.
In the early days, the majority of GM crops were created by extracting the gene for a desired trait from a donor organism, multiplying it, and attaching it to other snippets of DNA—usually from a microbe called an agrobacterium—that could help it infiltrate the cells of the target plant. Biotechnologists injected these particles into the target, hoping at least one would land in a place where it would perform its intended function; if not, they kept trying. The process was quicker than conventional breeding, but still complex, scattershot, and costly.
Because agrobacteria can cause plant tumors, Kuzma explains, policymakers in the U.S. decided to regulate GMO crops under an existing law, the Plant Pest Act of 1957, which addressed dangers like imported trees infested with invasive bugs. Every GMO containing the DNA of agrobacterium or another plant pest had to be tested to see whether it behaved like a pest, and undergo a lengthy approval process. By 2010, however, new methods had been developed for creating GMOs without agrobacteria; such plants could typically be marketed without pre-approval.
Soon after that, the first gene-edited crops began appearing. If old-school genetic engineering was a shotgun, techniques like TALEN and CRISPR were a scalpel—or the search-and-replace function on a computer program. With CRISPR/Cas9, for example, an enzyme that bacteria use to recognize and chop up hostile viruses is reprogrammed to find and snip out a desired bit of a plant or other organism's DNA. The enzyme can also be used to insert a substitute gene. If a DNA sequence is simply removed, or the new gene comes from a similar species, the changes in the target plant's genotype and phenotype (its general characteristics) may be no different from those that could be produced through selective breeding. If a foreign gene is added, the plant becomes a transgenic GMO.
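To make the search-and-replace comparison concrete, here is a toy sketch in Python. The DNA sequences and edits below are invented for illustration, and real genome editing involves far more biology than string substitution, but the logic is the same: find a target sequence, then delete it or swap in a substitute.

```python
# Toy analogy: treat a genome as a string and a gene edit as search-and-replace.
# The sequences here are invented for illustration only.

genome = "ATGGCCTTAGGCTTACCGATTACA"
guide = "GGCTTA"  # the stretch of DNA the editing enzyme is programmed to find

# A pure deletion: snip the target sequence out entirely.
deletion = genome.replace(guide, "", 1)

# A substitution: swap in a replacement sequence instead.
substitution = genome.replace(guide, "GGATTA", 1)

print(deletion)      # ATGGCCTTACCGATTACA
print(substitution)  # ATGGCCTTAGGATTACCGATTACA
```

A deletion like the first edit leaves no foreign sequence behind, which is why such crops can resemble the products of conventional breeding; the second edit, if the substitute came from another species, would make the plant transgenic.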
This development, along with the emergence of non-agrobacterium GMOs, eventually prompted the USDA to propose a tiered regulatory system for all genetically engineered crops, beginning with an initial screening for potentially hazardous metabolites or ecological impacts. (The screening was intended, in part, to guard against the "off-target effects"—stray mutations—that occasionally appear in gene-edited organisms.) If no red flags appeared, the crop would be approved; otherwise, it would be subject to further review, and possible regulation.
The plan was unveiled in January 2017, during the last week of the Obama presidency. Then, under the Trump administration, it was shelved. Although the USDA continues to promise a new set of regulations, the only hint of what they might contain has been Secretary of Agriculture Sonny Perdue's statement last March that gene-edited plants would remain unregulated if they "could otherwise have been developed through traditional breeding techniques, as long as they are not plant pests or developed using plant pests."
Because transgenic plants could not be "developed through traditional breeding techniques," this statement could be taken to mean that gene editing in which foreign DNA is introduced might actually be regulated. But because the USDA regulates conventional transgenic GMOs only if they trigger the plant-pest stipulation, experts assume gene-edited crops will face similarly limited oversight.
Meanwhile, companies are already teeing up gene-edited products for the U.S. market. An herbicide-resistant oilseed rape, developed using a proprietary technique, has been available since 2016. A cooking oil made from TALEN-tweaked soybeans, designed to have a healthier fatty-acid profile, is slated for release within the next few months. A CRISPR-edited "waxy" corn, designed with a starch profile ideal for processed foods, should be ready by 2021.
In all likelihood, none of these products will have to be tested for safety.
In the E.U., Stricter Rules Apply
Now let's look at the European Union. Since the late 1990s, explains Gregory Jaffe, director of the Project on Biotechnology at the Center for Science in the Public Interest, the EU has had a "process-based trigger" for genetically engineered products: "If you use recombinant DNA, you are going to be regulated." All foods and animal feeds must be approved and labeled if they consist of or contain more than 0.9 percent GM ingredients. (In the U.S., "disclosure" of GM ingredients is mandatory, if someone asks, but labeling is not required.) The only GM crop that can be commercially grown in EU member nations is a type of insect-resistant corn, though some countries allow imports.
European scientists helped develop gene editing, and they—along with the continent's biotech entrepreneurs—have been busy developing applications for crops. But European farmers seem more divided over the technology than their American counterparts. The main French agricultural trades union, for example, supports research into non-transgenic gene editing and its exemption from GMO regulation. But it was the country's small-farmers' union, the Confédération Paysanne, along with several allied groups, that in 2015 submitted a complaint to the ECJ, asking that all plants produced via mutagenesis—including gene-editing—be regulated as GMOs.
At this point, it should be mentioned that in the past 30 years, large population studies have found no sign that consuming GM foods is harmful to human health. GMO critics can, however, point to evidence that herbicide-resistant crops have encouraged overuse of herbicides, giving rise to poison-proof "superweeds," polluting the environment with suspected carcinogens, and inadvertently killing beneficial plants. Those allegations were key to the French plaintiffs' argument that gene-edited crops might similarly do unexpected harm. (Disclosure: Leapsmag's parent company, Bayer, recently acquired Monsanto, a maker of herbicides and herbicide-resistant seeds. Also, Leaps by Bayer, an innovation initiative of Bayer and Leapsmag's direct founder, has funded a biotech startup called JoynBio that aims to reduce the amount of nitrogen fertilizer required to grow crops.)
In the end, the EU court found in the Confédération's favor on gene editing—though the court maintained the regulatory exemption for mutagenesis induced by chemicals or radiation, citing the "long safety record" of those methods.
The ruling was "scientifically nonsensical," fumes Rodolphe Barrangou, a French food scientist who pioneered CRISPR while working for DuPont in Wisconsin and is now a professor at NC State. "It's because of things like this that I'll never go back to Europe."
Nonetheless, the decision was consistent with longstanding EU policy on crops made with recombinant DNA. Given the difficulty and expense of getting such products through the continent's regulatory system, many other European researchers may wind up following Barrangou to America.
Getting to the Root of the Cultural Divide
What explains the divergence between the American and European approaches to GMOs—and, by extension, gene-edited crops? In part, Jennifer Kuzma speculates, it's that Europeans have a different attitude toward eating. "They're generally more tied to where their food comes from, where it's produced," she notes. They may also share a mistrust of government assurances on food safety, born of the region's Mad Cow scandals of the 1980s and '90s. In Catholic countries, consumers may have misgivings about tinkering with the machinery of life.
But the principal factor, Kuzma argues, is that European and American agriculture are structured differently. "GM's benefits have mostly been designed for large-scale industrial farming and commodity crops," she says. That kind of farming is dominant in the U.S., but not in Europe, leading to a different balance of political power. In the EU, there was less pressure on decisionmakers to approve GMOs or exempt gene-edited crops from regulation—and more pressure to adopt a GM-resistant stance.
Such dynamics may be operating in other regions as well. In China, for example, the government has long encouraged research in GMOs; a state-owned company recently acquired Syngenta, a Swiss-based multinational corporation that is a leading developer of GM and gene-edited crops. GM animal feed and cooking oil can be freely imported. Yet commercial cultivation of most GM plants remains forbidden, out of deference to popular suspicions of genetically altered food. "As a new item, society has debates and doubts on GMO techniques, which is normal," President Xi Jinping remarked in 2014. "We must be bold in studying it, [but] be cautious promoting it."
The proper balance between boldness and caution is still being worked out all over the world. Europe's process-based approach may prevent researchers from developing crops that, with a single DNA snip, could rescue millions from starvation. EU regulations will also make it harder for small entrepreneurs to challenge Big Ag with a technology that, as Barrangou puts it, "can be used affordably, quickly, scalably, by anyone, without even a graduate degree in genetics." America's product-based approach, conversely, may let crops with hidden genetic dangers escape detection. And by refusing to investigate such risks, regulators may wind up exacerbating consumers' doubts about GM and gene-edited products, rather than allaying them.
Perhaps the solution lies in combining both approaches, and adding some flexibility and nuance to the mix. "I don't believe in regulation by the product or the process," says CSPI's Jaffe. "I think you need both." Deleting a DNA base pair to silence a gene, for example, might be less risky than inserting a foreign gene into a plant—unless the deletion enables the production of an allergen, and the transgene comes from spinach.
Kuzma calls for the creation of "cooperative governance networks" to oversee crop genome editing, similar to bodies that already help develop and enforce industry standards in fisheries, electronics, industrial cleaning products, and (not incidentally) organic agriculture. Such a network could include farmers, scientists, advocacy groups, private companies, and governmental agencies. "Safety isn't an all-or-nothing concept," Kuzma says. "Science can tell you what some of the issues are in terms of risk and benefit, but it can't tell you what to regulate. That's a values-based decision."
By drawing together a wide range of stakeholders to make such decisions, she adds, "we're more likely to anticipate future consequences, and to develop a robust approach—one that not only seems more legitimate to people, but is actually just plain old better."
Two years and more than six million deaths into the COVID-19 pandemic, scientists are searching for ways to prevent another such tragedy from ever occurring again. And it’s a gargantuan task.
Our disturbed ecosystems are creating more favorable conditions for the spread of infectious disease. Global warming, deforestation, rising sea levels and flooding have contributed to a rise in mosquito-borne infections and longer tick seasons. Disease-carrying animals come into closer contact with other species and humans as they migrate to escape the heat. Bats are thought to have carried the SARS-CoV-2 virus to Wuhan, either directly or through another host animal, but thousands of novel viruses are lurking within other wild creatures.
Understanding how climate change contributes to the spread of disease is critical in predicting and thwarting future calamities. But the problem is that predictive models aren’t yet where they need to be for forecasting with confidence beyond the next year, as we can for weather, for instance.
The association between climate and infectious disease is poorly understood, says Irina Tezaur, a computational scientist at Sandia National Laboratories. “Correlations have been observed but it’s not known if these correlations translate to causal relationships.”
To make accurate longer-term predictions, scientists need more empirical data, multiple datasets specific to locations and diseases, and the ability to calculate risks that depend on unpredictable nature and human behavior. Another obstacle is that climate scientists and epidemiologists are not collaborating effectively, so some researchers are calling for a multidisciplinary approach, a new field called Outbreak Science.
Climate scientists are far ahead of epidemiologists in gathering essential data.
Earth System Models—combining the interactions of atmosphere, ocean, land, ice and biosphere—have been in place for two decades to monitor the effects of global climate change. These models must be combined with epidemiological and human model research, areas that are easily skewed by unpredictable elements, from extreme weather events to public environmental policy shifts.
“There is never just one driver in tracking the impact of climate on infectious disease,” says Joacim Rocklöv, a professor at the Heidelberg Institute of Global Health & Heidelberg Interdisciplinary Centre for Scientific Computing in Germany. Rocklöv has studied how climate affects vector-borne diseases—those transmitted to humans by mosquitoes, ticks or fleas. “You need to disentangle the variables to find out how much difference climate makes to the outcome and how much is other factors.” Determinants from deforestation to population density to lack of healthcare access influence the spread of disease.
Even though climate change is not the primary driver of infectious disease today, it poses a major threat to public health in the future, says Rocklöv.
The promise of predictive modeling
“Models are simplifications of a system we’re trying to understand,” says Jeremy Hess, who directs the Center for Health and the Global Environment at University of Washington in Seattle. “They’re tools for learning that improve over time with new observations.”
Accurate predictions depend on high-quality, long-term observational data but models must start with assumptions. “It’s not possible to apply an evidence-based approach for the next 40 years,” says Rocklöv. “Using models to experiment and learn is the only way to figure out what climate means for infectious disease. We collect data and analyze what already happened. What we do today will not make a difference for several decades.”
To improve accuracy, scientists develop and draw on thousands of models to cover as many scenarios as possible. One model may capture the dynamics of disease transmission while another focuses on immunity data or ocean influences or seasonal components of a virus. Further, each model needs to be disease-specific and often location-specific to be useful.
“All models have biases so it’s important to use a suite of models,” Tezaur stresses.
The modeling scientist chooses the drivers of change and parameters based on the question explored. The drivers could be increased precipitation, poverty or mosquito prevalence, for instance. Later, the scientist may need to isolate the effect of a single driver, which will require yet another model.
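As a deliberately simplified sketch of what such a model can look like, consider a toy SIR-style (susceptible-infected-recovered) compartmental model in Python in which temperature drives the transmission rate. Every function and parameter value here is invented for illustration; real epidemic models are far richer.

```python
# Minimal sketch: an SIR-style compartmental model whose transmission rate
# is driven by one climate variable (temperature). All parameter values
# below are invented for illustration only.

def transmission_rate(temp_c, base=0.3):
    # Hypothetical driver: transmission peaks near 26 C and falls off
    # at cooler or hotter temperatures.
    return base * max(0.0, 1.0 - abs(temp_c - 26.0) / 15.0)

def run_sir(temps, s=0.99, i=0.01, r=0.0, gamma=0.1):
    """Step the model once per day; temps is a list of daily temperatures."""
    history = []
    for t in temps:
        beta = transmission_rate(t)
        new_inf = beta * s * i   # new infections this day
        recov = gamma * i        # recoveries this day
        s, i, r = s - new_inf, i + new_inf - recov, r + recov
        history.append(i)        # track the infected fraction over time
    return history

warm = run_sir([27.0] * 120)  # a warm season
cool = run_sir([15.0] * 120)  # a cool season

# Under these invented parameters, the warm-season epidemic peaks higher.
print(max(warm) > max(cool))  # True
```

Isolating a driver, as described above, amounts to re-running the model with one input changed while everything else is held fixed—exactly the warm-versus-cool comparison at the end of the sketch.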
There have been some related successes, such as the latest models for mosquito-borne diseases like dengue, Zika and malaria as well as those for flu and tick-borne diseases, says Hess.
Rocklöv was part of a research team that used test data from 2018 and 2019 to identify regions at risk for West Nile virus outbreaks. Using AI, scientists were able to forecast outbreaks of the virus for the entire transmission season in Europe. “In the end, we want data-driven models; that’s what AI can accomplish,” says Rocklöv. Other researchers are making important headway in creating a framework to predict novel host–parasite interactions.
Modeling studies can run months, years or decades. “The scientist is working with layers of data. The challenge is how to transform and couple different models together on a planetary scale,” says Jeanne Fair, a scientist at Los Alamos National Laboratory, Biosecurity and Public Health, in New Mexico.
And it’s a constantly changing picture. A modeling study in an April 2022 issue of Nature predicted that thousands of animals will migrate to cooler locales as temperatures rise. This means that various species will come into closer contact with people and other mammals for the first time. This is likely to increase the risk of emerging infectious disease transmitted from animals to humans, especially in Africa and Asia.
Other things can happen too. Global warming could precipitate viral mutations or new infectious diseases that don’t respond to antimicrobial treatments. Insecticide-resistant mosquitoes could evolve. Weather-related food insecurity could increase malnutrition and weaken people’s immune systems. And the impact of an epidemic will be worse if it coincides with a heatwave, flood, or drought, says Hess.
The devil is in the climate variables
Solid predictions about the future of climate and disease are not possible with so many uncertainties. Difficult-to-measure drivers must be added to the empirical model mix, such as land and water use, ecosystem changes or the public’s willingness to accept a vaccine or practice social distancing. Nor is there any precedent for calculating the effect of climate changes that are accelerating faster than ever before.
The most critical climate variables thought to influence disease spread are temperature, precipitation, humidity, sunshine and wind, according to Tezaur’s research. And then there are variables within variables. Influenza scientists, for example, found that warm winters were predictors of the most severe flu seasons in the following year.
The human factor may be the most challenging determinant. To what degree will people curtail greenhouse gas emissions, if at all? The swift development of effective COVID-19 vaccines was a game-changer, but will scientists be able to repeat it during the next pandemic? Plus, no model could predict the amount of internet-fueled COVID-19 misinformation, Fair noted. To tackle this issue, infectious disease teams are looking to include more sociologists and political scientists in their modeling.
Addressing the gaps
Currently, researchers are focusing on the near future, predicting for next year, says Fair. “When it comes to long-term, that’s where we have the most work to do.” While scientists cannot foresee how political influences and misinformation spread will affect models, they are positioned to make headway in collecting and assessing new data streams that have never been merged.
Disease forecasting will require a significant investment in the infrastructure needed to collect data about the environment, vectors, and hosts at all spatial and temporal resolutions, Fair and her co-authors stated in their recent study. For example, real-time data on mosquito prevalence and diversity in various settings and times is limited or non-existent. Fair also would like to see standards set for mosquito data collection in every country. “Standardizing across the US would be a huge accomplishment,” she says.
Hess points to a dearth of data in local and regional datasets about how extreme weather events play out in different geographic locations. His research indicates that Africa and the Middle East experienced substantial climate shifts, for example, but are underrepresented in the evidentiary database, which limits conclusions. “A model for dengue may be good in Singapore but not necessarily in Port-au-Prince,” Hess explains. And, he adds, scientists need a way of evaluating models for how effective they are.
The hope, Rocklöv says, is that in the future we will have data-driven models rather than theoretical ones. In turn, sharper statistical analyses can inform resource allocation and intervention strategies to prevent outbreaks.
Most of all, experts emphasize that epidemiologists and climate scientists must stop working in silos. If scientists can successfully merge epidemiological data with climatic, biological, environmental, ecological and demographic data, they will make better predictions about complex disease patterns. A lack of modeling “cross talk” among disciplines and, in some cases, countries’ refusal to release data to one another are hindering discovery and advances.
It’s time for bold transdisciplinary action, says Hess. He points to initiatives that need funding: disease surveillance and control; developing and testing interventions; community education and social mobilization; decision-support analytics to predict when and where infections will emerge; advanced methodologies to improve modeling; and training scientists in data management and integrated surveillance.
Establishing a new field of Outbreak Science to coordinate collaboration would accelerate progress. Investment in decision-support modeling tools for public health teams, policy makers, and other long-term planning stakeholders is imperative, too. We need to invest in programs that encourage people from climate modeling and epidemiology to work together in a cohesive fashion, says Tezaur. Joining forces is the only way to solve the formidable challenges ahead.
This article originally appeared in One Health/One Planet, a single-issue magazine that explores how climate change and other environmental shifts are increasing vulnerabilities to infectious diseases by land and by sea. The magazine probes how scientists are making progress with leaders in other fields toward solutions that embrace diverse perspectives and the interconnectedness of all lifeforms and the planet.
Scientists use AI to predict how hospital stays will go
The Friday Five covers five stories in research that you may have missed this week. There are plenty of controversies and troubling ethical issues in science – and we get into many of them in our online magazine – but this news roundup focuses on scientific creativity and progress to give you a therapeutic dose of inspiration headed into the weekend.
Here are the promising studies covered in this week's Friday Five:
- The problem with bedtime munching
- Scientists use AI to predict how stays in hospitals will go
- How to armor the shields of our livers against cancer
- One big step to save the world: turn one kind of plastic into another
- The perfect recipe for tiny brains
And an honorable mention this week: Bigger is better when it comes to super neurons in super agers