What’s the Right Way to Regulate Gene-Edited Crops?
In the next few decades, humanity faces its biggest food crisis since the invention of the plow. The planet's population, currently 7.6 billion, is expected to reach 10 billion by 2050; to avoid mass famine, according to the World Resources Institute, we'll need to produce 70 percent more calories than we do today.
Meanwhile, climate change will bring intensifying assaults by heat, drought, storms, pests, and weeds, depressing farm yields around the globe. Epidemics of plant disease—already laying waste to wheat, citrus, bananas, coffee, and cacao in many regions—will spread ever further through the vectors of modern trade and transportation.
So here's a thought experiment: Imagine that a cheap, easy-to-use, and rapidly deployable technology could make crops more fertile and strengthen their resistance to these looming threats. Imagine that it could also render them more nutritious and tastier, with longer shelf lives and less vulnerability to damage in shipping—adding enhancements to human health and enjoyment, as well as reduced food waste, to the possible benefits.
Finally, imagine that crops bred with the aid of this tool might carry dangers. Some could contain unsuspected allergens or toxins. Others might disrupt ecosystems, affecting the behavior or very survival of other species, or infecting wild relatives with their altered DNA.
Now ask yourself: If such a technology existed, should policymakers encourage its adoption, or ban it due to the risks? And if you chose the former alternative, how should crops developed by this method be regulated?
In fact, this technology does exist, though its use remains mostly experimental. It's called gene editing, and in the past five years it has emerged as a potentially revolutionary force in many areas—among them, treating cancer and genetic disorders; growing transplantable human organs in pigs; controlling malaria-spreading mosquitoes; and, yes, transforming agriculture. Several versions are currently available, the newest and nimblest of which goes by the acronym CRISPR.
Gene editing is far simpler and more efficient than older methods used to produce genetically modified organisms (GMOs). Unlike those methods, moreover, it can be used in ways that leave no foreign genes in the target organism—an advantage that proponents argue should comfort anyone leery of consuming so-called "Frankenfoods." But debate persists over what precautions must be taken before these crops come to market.
Recently, two of the world's most powerful regulatory bodies offered very different answers to that question. The United States Department of Agriculture (USDA) declared in March 2018 that it "does not currently regulate, or have any plans to regulate" plants that are developed through most existing methods of gene editing. The Court of Justice of the European Union (ECJ), by contrast, ruled in July that such crops should be governed by the same stringent regulations as conventional GMOs.
Each announcement drew protests, for opposite reasons. Anti-GMO activists assailed the USDA's statement, arguing that all gene-edited crops should be tested and approved before marketing. "You don't know what those mutations or rearrangements might do in a plant," warned Michael Hansen, a senior scientist with the advocacy group Consumers Union. Biotech boosters griped that the ECJ's decision would stifle innovation and investment. "By any sensible standard, this judgment is illogical and absurd," wrote the British newspaper The Observer.
Yet some experts suggest that the broadly permissive American approach and the broadly restrictive EU policy are equally flawed. "What's behind these regulatory decisions is not science," says Jennifer Kuzma, co-director of the Genetic Engineering and Society Center at North Carolina State University and a former advisor to the World Economic Forum, who has researched and written extensively on governance issues in biotechnology. "It's politics, economics, and culture."
The U.S. Welcomes Gene-Edited Food
Humans have been modifying the genomes of plants and animals for 10,000 years, using selective breeding—a hit-or-miss method that can take decades or more to deliver rewards. In the mid-20th century, we learned to speed up the process by exposing organisms to radiation or mutagenic chemicals. But it wasn't until the 1980s that scientists began modifying plants by altering specific stretches of their DNA.
Today, about 90 percent of the corn, cotton and soybeans planted in the U.S. are GMOs; such crops cover nearly 4 million square miles (10 million square kilometers) of land in 29 countries. Most of these plants are transgenic, meaning they contain genes from an unrelated species—often as biologically alien as a virus or a fish. Their modifications are designed primarily to boost profit margins for mechanized agribusiness: allowing crops to withstand herbicides so that weeds can be controlled by mass spraying, for example, or to produce their own pesticides to lessen the need for chemical inputs.
In the early days, the majority of GM crops were created by extracting the gene for a desired trait from a donor organism, multiplying it, and attaching it to other snippets of DNA—usually from a microbe called an agrobacterium—that could help it infiltrate the cells of the target plant. Biotechnologists injected these particles into the target, hoping at least one would land in a place where it would perform its intended function; if not, they kept trying. The process was quicker than conventional breeding, but still complex, scattershot, and costly.
Because agrobacteria can cause plant tumors, Kuzma explains, policymakers in the U.S. decided to regulate GMO crops under an existing law, the Plant Pest Act of 1957, which addressed dangers like imported trees infested with invasive bugs. Every GMO containing the DNA of agrobacterium or another plant pest had to be tested to see whether it behaved like a pest, and undergo a lengthy approval process. By 2010, however, new methods had been developed for creating GMOs without agrobacteria; such plants could typically be marketed without pre-approval.
Soon after that, the first gene-edited crops began appearing. If old-school genetic engineering was a shotgun, techniques like TALEN and CRISPR were a scalpel—or the search-and-replace function on a computer program. With CRISPR/Cas9, for example, an enzyme that bacteria use to recognize and chop up hostile viruses is reprogrammed to find and snip out a desired bit of a plant or other organism's DNA. The enzyme can also be used to insert a substitute gene. If a DNA sequence is simply removed, or the new gene comes from a similar species, the changes in the target plant's genotype and phenotype (its general characteristics) may be no different from those that could be produced through selective breeding. If a foreign gene is added, the plant becomes a transgenic GMO.
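The search-and-replace analogy can be made concrete with a toy sketch. The sequences and the editing function below are invented for illustration, and the code glosses over the real biology (guide RNAs, PAM sites, cellular DNA repair); it only shows the logic of "find a target site, then delete it or swap in a substitute":

```python
# Illustrative analogy only: gene editing as "search and replace" on a DNA string.
# The genome and target sequences here are invented for demonstration.

def edit_genome(genome: str, target: str, replacement: str = "") -> str:
    """Find the target site and splice in a replacement.
    An empty replacement models a simple deletion."""
    site = genome.find(target)
    if site == -1:
        return genome  # no match: the "enzyme" leaves the DNA untouched
    return genome[:site] + replacement + genome[site + len(target):]

genome = "ATGGCCTTAGGCTTACGT"
print(edit_genome(genome, "TTAGGC"))            # deletion  -> ATGGCCTTACGT
print(edit_genome(genome, "TTAGGC", "GGATCC"))  # substitution
```

The two calls mirror the two cases the article describes: a pure deletion, whose result could in principle also arise from conventional breeding, and an insertion, which makes the organism transgenic if the new sequence comes from a foreign species.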
This development, along with the emergence of non-agrobacterium GMOs, eventually prompted the USDA to propose a tiered regulatory system for all genetically engineered crops, beginning with an initial screening for potentially hazardous metabolites or ecological impacts. (The screening was intended, in part, to guard against the "off-target effects"—stray mutations—that occasionally appear in gene-edited organisms.) If no red flags appeared, the crop would be approved; otherwise, it would be subject to further review, and possible regulation.
The plan was unveiled in January 2017, during the last week of the Obama presidency. Then, under the Trump administration, it was shelved. Although the USDA continues to promise a new set of regulations, the only hint of what they might contain has been Secretary of Agriculture Sonny Perdue's statement last March that gene-edited plants would remain unregulated if they "could otherwise have been developed through traditional breeding techniques, as long as they are not plant pests or developed using plant pests."
Because transgenic plants could not be "developed through traditional breeding techniques," this statement could be taken to mean that gene editing in which foreign DNA is introduced might actually be regulated. But because the USDA regulates conventional transgenic GMOs only if they trigger the plant-pest stipulation, experts assume gene-edited crops will face similarly limited oversight.
Meanwhile, companies are already teeing up gene-edited products for the U.S. market. An herbicide-resistant oilseed rape, developed using a proprietary technique, has been available since 2016. A cooking oil made from TALEN-tweaked soybeans, designed to have a healthier fatty-acid profile, is slated for release within the next few months. A CRISPR-edited "waxy" corn, designed with a starch profile ideal for processed foods, should be ready by 2021.
In all likelihood, none of these products will have to be tested for safety.
In the E.U., Stricter Rules Apply
Now let's look at the European Union. Since the late 1990s, explains Gregory Jaffe, director of the Project on Biotechnology at the Center for Science in the Public Interest, the EU has had a "process-based trigger" for genetically engineered products: "If you use recombinant DNA, you are going to be regulated." All foods and animal feeds must be approved and labeled if they consist of or contain more than 0.9 percent GM ingredients. (In the U.S., "disclosure" of GM ingredients is mandatory if someone asks, but labeling is not required.) The only GM crop that can be commercially grown in EU member nations is a type of insect-resistant corn, though some countries allow imports.
European scientists helped develop gene editing, and they—along with the continent's biotech entrepreneurs—have been busy developing applications for crops. But European farmers seem more divided over the technology than their American counterparts. The main French agricultural trades union, for example, supports research into non-transgenic gene editing and its exemption from GMO regulation. But it was the country's small-farmers' union, the Confédération Paysanne, along with several allied groups, that in 2015 submitted a complaint to the ECJ, asking that all plants produced via mutagenesis—including gene-editing—be regulated as GMOs.
At this point, it should be mentioned that in the past 30 years, large population studies have found no sign that consuming GM foods is harmful to human health. GMO critics can, however, point to evidence that herbicide-resistant crops have encouraged overuse of herbicides, giving rise to poison-proof "superweeds," polluting the environment with suspected carcinogens, and inadvertently killing beneficial plants. Those allegations were key to the French plaintiffs' argument that gene-edited crops might similarly do unexpected harm. (Disclosure: Leapsmag's parent company, Bayer, recently acquired Monsanto, a maker of herbicides and herbicide-resistant seeds. Also, Leaps by Bayer, an innovation initiative of Bayer and Leapsmag's direct founder, has funded a biotech startup called JoynBio that aims to reduce the amount of nitrogen fertilizer required to grow crops.)
In the end, the EU court found in the Confédération's favor on gene editing—though the court maintained the regulatory exemption for mutagenesis induced by chemicals or radiation, citing the "long safety record" of those methods.
The ruling was "scientifically nonsensical," fumes Rodolphe Barrangou, a French food scientist who pioneered CRISPR while working for DuPont in Wisconsin and is now a professor at NC State. "It's because of things like this that I'll never go back to Europe."
Nonetheless, the decision was consistent with longstanding EU policy on crops made with recombinant DNA. Given the difficulty and expense of getting such products through the continent's regulatory system, many other European researchers may wind up following Barrangou to America.
Getting to the Root of the Cultural Divide
What explains the divergence between the American and European approaches to GMOs—and, by extension, gene-edited crops? In part, Jennifer Kuzma speculates, it's that Europeans have a different attitude toward eating. "They're generally more tied to where their food comes from, where it's produced," she notes. They may also share a mistrust of government assurances on food safety, born of the region's Mad Cow scandals of the 1980s and '90s. In Catholic countries, consumers may have misgivings about tinkering with the machinery of life.
But the principal factor, Kuzma argues, is that European and American agriculture are structured differently. "GM's benefits have mostly been designed for large-scale industrial farming and commodity crops," she says. That kind of farming is dominant in the U.S., but not in Europe, leading to a different balance of political power. In the EU, there was less pressure on decisionmakers to approve GMOs or exempt gene-edited crops from regulation—and more pressure to adopt a GM-resistant stance.
Such dynamics may be operating in other regions as well. In China, for example, the government has long encouraged research in GMOs; a state-owned company recently acquired Syngenta, a Swiss-based multinational corporation that is a leading developer of GM and gene-edited crops. GM animal feed and cooking oil can be freely imported. Yet commercial cultivation of most GM plants remains forbidden, out of deference to popular suspicions of genetically altered food. "As a new item, society has debates and doubts on GMO techniques, which is normal," President Xi Jinping remarked in 2014. "We must be bold in studying it, [but] be cautious promoting it."
The proper balance between boldness and caution is still being worked out all over the world. Europe's process-based approach may prevent researchers from developing crops that, with a single DNA snip, could rescue millions from starvation. EU regulations will also make it harder for small entrepreneurs to challenge Big Ag with a technology that, as Barrangou puts it, "can be used affordably, quickly, scalably, by anyone, without even a graduate degree in genetics." America's product-based approach, conversely, may let crops with hidden genetic dangers escape detection. And by refusing to investigate such risks, regulators may wind up exacerbating consumers' doubts about GM and gene-edited products, rather than allaying them.
Perhaps the solution lies in combining both approaches, and adding some flexibility and nuance to the mix. "I don't believe in regulation by the product or the process," says CSPI's Jaffe. "I think you need both." Deleting a DNA base pair to silence a gene, for example, might be less risky than inserting a foreign gene into a plant—unless the deletion enables the production of an allergen, and the transgene comes from spinach.
Kuzma calls for the creation of "cooperative governance networks" to oversee crop genome editing, similar to bodies that already help develop and enforce industry standards in fisheries, electronics, industrial cleaning products, and (not incidentally) organic agriculture. Such a network could include farmers, scientists, advocacy groups, private companies, and governmental agencies. "Safety isn't an all-or-nothing concept," Kuzma says. "Science can tell you what some of the issues are in terms of risk and benefit, but it can't tell you what to regulate. That's a values-based decision."
By drawing together a wide range of stakeholders to make such decisions, she adds, "we're more likely to anticipate future consequences, and to develop a robust approach—one that not only seems more legitimate to people, but is actually just plain old better."
Scientists find enzymes in nature that could replace toxic chemicals
Some 900 miles off the coast of Portugal, nine major islands rise from the mid-Atlantic. Verdant and volcanic, the Azores archipelago hosts a wealth of biodiversity that keeps field research scientist Marlon Clark returning for more. “You’ve got this really interesting biogeography out there,” says Clark. “There’s real separation between the continents, but there’s this inter-island dispersal of plants and seeds and animals.”
It’s a visual paradise by any standard, but on a microscopic level, there’s even more to see. The Azores’ nutrient-rich volcanic rock — and its network of lagoons, cave systems, and thermal springs — is home to a vast array of microorganisms found in a variety of microclimates with different elevations and temperatures.
Clark works for Basecamp Research, a biotech company headquartered in London, and his job is to collect samples from ecosystems around the world. By extracting DNA from soil, water, plants, microbes and other organisms, Basecamp is building an extensive database of the Earth’s proteins. While DNA itself isn’t a protein, the information stored in DNA is used to create proteins, so extracting, sequencing, and annotating DNA allows for the discovery of unique protein sequences.
Using what they’re finding in the middle of the Atlantic and beyond, Basecamp’s detailed database is constantly growing. The outputs could be essential for cleaning up the damage done by toxic chemicals and finding alternatives to these chemicals.
Catalysts for change
Proteins provide structure and function in all living organisms. Some of these functional proteins are enzymes—catalysts that, quite literally, make things happen by speeding up chemical reactions.
“Enzymes are perfectly evolved catalysts,” says Ahir Pushpanath, a partnerships lead at Basecamp. ”Enzymes are essentially just a polymer, and polymers are made up of amino acids, which are nature’s building blocks.” He suggests thinking about it like Legos — if you have a bunch of Lego pieces and use them to build a structure that performs a function, “that’s basically how an enzyme works. In nature, these monuments have evolved to do life’s chemistry. If we didn’t have enzymes, we wouldn’t be alive.”
In our own bodies, enzymes catalyze everything from vision to digesting food to regrowing muscles, and these same types of enzymes are necessary in the pharmaceutical, agrochemical and fine chemical industries. But industrial conditions differ from those inside our bodies. So, when scientists need certain chemical reactions to create a particular product or substance, they make their own catalysts in their labs — generally through the use of petroleum and heavy metals.
These petrochemicals are effective and cost-efficient, but they’re wasteful and often hazardous. With growing concerns around sustainability and long-term public health, it's essential to find alternative solutions to toxic chemicals. “Industrial chemistry is heavily polluting, especially the chemistry done in pharmaceutical drug development,” Pushpanath says.
Basecamp is trying to replace lab-created catalysts with enzymes found in the wild. This concept is called biocatalysis, and in theory, all scientists have to do is find the right enzymes for their specific need. Yet, historically, researchers have struggled to find enzymes to replace petrochemicals. When they can’t identify a suitable match, they turn to what Pushpanath describes as “long, iterative, resource-intensive, directed evolution” in the laboratory to coax a protein into industrial adaptation. But the latest scientific advances have enabled these discoveries in nature instead.
Marlon Clark, a research scientist at Basecamp Research, looks for novel biochemistries in the Azores. (Photo: Glen Gowers)
Enzyme hunters
Whether it’s Clark and a colleague setting off on an expedition, or a local, on-the-ground partner gathering and processing samples, there’s a lot to be learned from each collection. “Microbial genomes contain complete sets of information that define an organism — much like how letters are a code allowing us to form words, sentences, pages, and books that contain complex but digestible knowledge,” Clark says. He thinks of the environmental samples as biological libraries, filled with thousands of species, strains, and sequence variants. “It’s our job to glean genetic information from these samples.”
Basecamp researchers manage this feat by sequencing the DNA and then assembling the information into a comprehensible structure. “We’re building the ‘stories’ of the biota,” Clark says. The more varied the samples, the more valuable insights his team gains into the characteristics of different organisms and their interactions with the environment. Sequencing allows scientists to examine the order of nucleotides — the organic molecules that form DNA — to identify genetic makeups and find changes within genomes. The process used to be too expensive, but the cost of sequencing a genome has dropped from $10,000 a decade ago to as low as $100. Notably, biocatalysis isn’t a new concept — there have been waves of interest in using natural enzymes in catalysis for over a century, Pushpanath says. “But the technology just wasn’t there to make it cost effective,” he explains. “Sequencing has been the biggest boon.”
AI is probably the second biggest boon.
“We can actually dream up new proteins using generative AI,” Pushpanath says, which means that biocatalysis now has real potential to scale.
Glen Gowers, the co-founder of Basecamp, compares the company’s AI approach to that of social networks and streaming services. Consider how these platforms suggest connecting with the friends of your friends, or how watching one comedy film from the 1990s leads to a suggestion of three more.
“They’re thinking about data as networks of relationships as opposed to lists of items,” says Gowers. “By doing the same, we’re able to link the metadata of the proteins — by their relationships to each other, the environments in which they’re found, the way those proteins might look similar in sequence and structure, their surrounding genome context — really, this just comes down to creating a searchable network of proteins.”
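As a rough illustration of the relationships-not-lists idea, here is a minimal sketch in Python. The protein names, metadata, and graph structure are all invented, and Basecamp's actual system is far richer; the point is only that storing relationships lets a query walk outward from one protein to its neighbors, the way a streaming service walks from one film to related ones:

```python
# Conceptual sketch (not Basecamp's actual system): proteins stored as a
# network of relationships rather than a flat list. All entries are invented.

proteins = {
    "enzA": {"environment": "thermal spring", "neighbors": ["enzB"]},
    "enzB": {"environment": "thermal spring", "neighbors": ["enzA", "enzC"]},
    "enzC": {"environment": "lagoon",         "neighbors": ["enzB"]},
}

def related(start: str, max_hops: int = 2) -> set:
    """Breadth-first walk over the relationship graph, collecting
    every protein reachable within max_hops links of the start."""
    seen, frontier = {start}, [start]
    for _ in range(max_hops):
        frontier = [n for p in frontier
                    for n in proteins[p]["neighbors"] if n not in seen]
        seen.update(frontier)
    return seen - {start}

print(related("enzA"))  # -> {'enzB', 'enzC'}
```

A flat list would answer only "does enzA exist?"; the graph also answers "what is near enzA?"—by sequence, structure, or shared environment—which is the kind of query that powers recommendation-style search.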
On an Azores island, this volcanic opening may harbor organisms that can help scientists identify enzymes for biocatalysis to replace toxic chemicals. (Photo: Emma Bolton)
Uwe Bornscheuer, professor at the Institute of Biochemistry at the University of Greifswald, and co-founder of Enzymicals, another biocatalysis company, says that the development of machine learning is a critical component of this work. “It’s a very hot topic, because the challenge in protein engineering is to predict which mutation at which position in the protein will make an enzyme suitable for certain applications,” Bornscheuer explains. These predictions are difficult for humans to make at all, let alone quickly. “It is clear that machine learning is a key technology.”
Benefiting from nature’s bounty
Biodiversity commonly refers to plants and animals, but the term extends to all life, including microbial life, and some regions of the world are more biodiverse than others. Building relationships with global partners is another key element to Basecamp’s success. Doing so in accordance with the access and benefit sharing principles set forth by the Nagoya Protocol — an international agreement that seeks to ensure the benefits of using genetic resources are distributed in a fair and equitable way — is part of the company's ethos. “There's a lot of potential for us, and there’s a lot of potential for our partners to have exactly the same impact in building and discovering commercially relevant proteins and biochemistries from nature,” Clark says.
Bornscheuer points out that Basecamp is not the first company of its kind. A former San Diego company called Diversa went public in 2000 with similar work. “At that time, the Nagoya Protocol was not around, but Diversa also wanted to ensure that if a certain enzyme or microorganism from Costa Rica, for example, were used in an industrial process, then people in Costa Rica would somehow profit from this.”
An eventual merger turned Diversa into Verenium Corporation, which is now a part of the chemical producer BASF, but it laid important groundwork for modern companies like Basecamp to continue to scale with today’s technologies.
“To collect natural diversity is the key to identifying new catalysts for use in new applications,” Bornscheuer says. “Natural diversity is immense, and over the past 20 years we have gained the advantages that sequencing is no longer a cost or time factor.”
This has allowed Basecamp to rapidly grow its database, which now totals about 900 million sequences—roughly three times the size of the Universal Protein Resource, or UniProt, the public repository of protein sequences most commonly used by researchers. (UniProt isn’t compliant with the Nagoya Protocol because, as a public database, it doesn’t provide traceability of protein sequences. Some scientists, however, argue that Nagoya compliance hinders progress.)
With so much information available, Basecamp’s AI has been trained on “the true dictionary of protein sequence life,” Pushpanath says, which makes it possible to design sequences for particular applications. “Through deep learning approaches, we’re able to find protein sequences directly from our database, without the need for further laboratory-directed evolution.”
Recently, a major chemical company was searching for a specific transaminase — an enzyme that catalyzes a transfer of amino groups. “They had already spent a year-and-a-half and nearly two million dollars to evolve a public-database enzyme, and still had not reached their goal,” Pushpanath says. “We used our AI approaches on our novel database to yield 10 candidates within a week, which, when validated by the client, achieved the desired target even better than their best-evolved candidate.”
Basecamp’s other huge potential is in bioremediation, where natural enzymes can help to undo the damage caused by toxic chemicals. “Biocatalysis impacts both sides,” says Gowers. “It reduces the usage of chemicals to make products, and at the same time, where contamination sites do exist from chemical spills, enzymes are also there to kind of mop those up.”
So far, Basecamp's round-the-world sampling has covered 50 percent of the 14 major biomes, or regions of the planet that can be distinguished by their flora, fauna, and climate, as defined by the World Wildlife Fund. The other half remains to be catalogued — a key milestone for understanding our planet’s protein diversity, Pushpanath notes.
There’s still a long road ahead to fully replace petrochemicals with natural enzymes, but biocatalysis is on an upward trajectory. "Eventually, this work will reduce chemical processes,” Bornscheuer says. “We’ll have cleaner processes, more sustainable processes.”
Small changes in how a person talks could reveal Alzheimer’s earlier
Dave Arnold retired in his 60s and began spending time volunteering in local schools. But then he started misplacing items, forgetting appointments and losing his sense of direction. Eventually he was diagnosed with early stage Alzheimer’s.
“Hearing the diagnosis made me very emotional and tearful,” he said. “I immediately thought of all my mom had experienced.” His mother suffered from the condition for years before passing away. Over the last year, Arnold has worked for the Alzheimer’s Association as one of its early stage advisors, sharing his insights to help others in the initial stages of the disease.
Arnold was diagnosed sooner than many others. It's important to find out early, when interventions can make the most difference. One promising avenue is looking at how people talk. Research has shown that Alzheimer’s affects a part of the brain that controls speech, resulting in small changes before people show other signs of the disease.
Now, Canary Speech, a company based in Utah, is using AI to examine elements like the pitch of a person’s voice and their pauses. In an initial study, Canary analyzed speech recordings with AI and identified early stage Alzheimer’s with 96 percent accuracy.
Developing the AI model
Canary Speech’s CEO, Henry O’Connell, met cofounder Jeff Adams about 40 years before they started the company. Back when they first crossed paths, they were both living in Bethesda, Maryland; O’Connell was a research fellow at the National Institutes of Health studying rare neurological diseases, while Adams was working to decode spy messages. Later on, Adams would specialize in building mathematical models to analyze speech and sound as a team leader in developing Amazon's Alexa.
It wasn't until 2015 that they decided to make use of the fit between their backgrounds. “We established Canary Speech in 2017 to build a product that could be used in multiple languages in clinical environments,” O'Connell says.
The need is growing. About 55 million people worldwide currently live with Alzheimer’s, a number that is expected to double by 2050. Some scientists think the disease results from a buildup of plaque in the brain. It causes mild memory loss at first and, over time, this issue gets worse while other symptoms, such as disorientation and hallucinations, can develop. Treatment to manage the disease is more effective in the earlier stages, but detection is difficult since mild symptoms are often attributed to the normal aging process.
O’Connell and Adams specialize in the complex ways that Alzheimer’s affects how people speak. Using AI, their mathematical model analyzes 15 million data points every minute, focusing on certain features of speech such as pitch, pauses and elongation of words. It also pays attention to how the vibrations of vocal cords change in different stages of the disease.
To create their model, the team used a type of machine learning called deep neural nets, which look at multiple layers of data: in this case, the multiple features of a person’s speech patterns.
“Deep neural nets allow us to look at much, much larger data sets built out of millions of elements,” O’Connell explained. “Through machine learning and AI, we’ve identified features that are very sensitive to an Alzheimer’s patient versus [people without the disease] and also very sensitive to mild cognitive impairment, early stage and moderate Alzheimer's.” Based on their learnings, Canary is able to classify the disease stage very quickly, O’Connell said.
“When we’re listening to sublanguage elements, we’re really analyzing the direct result of changes in the brain in the physical body,” O’Connell said. “The brain controls your vocal cords: how fast they vibrate, the expansion of them, the contraction.” These factors, along with where people put their tongues when talking, function subconsciously and result in subtle changes in the sounds of speech.
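The general approach of classifying recordings by acoustic features can be sketched in a toy example. Everything below—the feature names, the numbers, and the nearest-centroid rule—is invented for illustration and is not Canary Speech's model, which uses deep neural nets over millions of data points:

```python
# Toy sketch of feature-based speech classification (not Canary Speech's model).
# Each recording is reduced to a vector of acoustic features; a sample is
# assigned to whichever class's average profile (centroid) it sits nearest.
import math

FEATURES = ("mean_pitch_hz", "pause_rate", "word_elongation")

CENTROIDS = {                     # hypothetical per-class feature averages
    "control":     (190.0, 0.08, 1.00),
    "early_stage": (175.0, 0.15, 1.12),
}

def classify(sample: tuple) -> str:
    """Return the class whose centroid is closest in feature space."""
    return min(CENTROIDS, key=lambda c: math.dist(sample, CENTROIDS[c]))

print(classify((188.0, 0.09, 1.01)))  # -> control
print(classify((176.0, 0.14, 1.10)))  # -> early_stage
```

A real system would extract thousands of such "sublanguage" features automatically and learn the decision boundary from labeled recordings rather than from hand-set averages, but the shape of the problem—map speech to numbers, then compare against learned patterns—is the same.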
Further testing is needed
In an initial trial, Canary analyzed speech recordings from phone calls to a large U.S. health insurer. They looked at the audio recordings of 651 policyholders who had early stage Alzheimer’s and 1018 who did not have the condition, aiming for a representative sample of age, gender and race. They used this data to create their first diagnostic model and found that it was 96 percent accurate in identifying Alzheimer’s.
Christian Herff, an assistant professor of neuroscience at Maastricht University in the Netherlands, praised this approach while adding that further testing is needed to assess its effectiveness.
“I think the general idea of identifying increased risk for cognitive impairment based on speech characteristics is very feasible, particularly when change in a user’s voice is monitored, for example, by recording speech every year,” Herff said. He noted that this can only be a first indication, not a full diagnosis. The accuracy still needs to be validated in studies that follow individuals over a period of time, he said.
Toby Walsh, a professor of artificial intelligence at the University of New South Wales, also thinks Canary’s tool has potential but highlights that Canary could diagnose some people who don’t really have the disease. “This is an interesting and promising application of AI,” he said, “but these tools need to be used carefully. Imagine the anxiety of being misdiagnosed with Alzheimer’s.”
As with many other AI tools, privacy and bias are additional issues to monitor closely, Walsh said.
Other languages
A related issue is that not everyone is fluent in English. Mahnaz Arvaneh, a senior lecturer in automatic control and systems engineering at the University of Sheffield, said this could be a blind spot.
“The system may not be very accurate for those who have English as their second language as their speaking patterns would be different, and any issue might be because of language deficiency rather than cognitive issues,” Arvaneh said.
The team is expanding to multiple languages starting with Japanese and Spanish. The elements of the model that make up the algorithm are very similar, but they need to be validated and retrained in a different language, which will require access to more data.
Recently, Canary analyzed the phone calls of 233 Japanese patients who had mild cognitive impairment and 704 healthy people. Using an English-language model, they identified the Japanese patients with mild cognitive impairment with 78 percent accuracy. They also developed a model in Japanese that was 45 percent accurate, and they’re continuing to train it with more data.
The future
Canary is using their model to look at other diseases like Huntington’s and Parkinson’s. They’re also collaborating with pharmaceutical companies to validate potential therapies for Alzheimer’s. By looking at speech patterns over time, Canary can get an indication of how well these drugs are working.
Dave Arnold and his wife dance at his nephew’s wedding in Rochester, New York, ten years ago, before his Alzheimer's diagnosis. (Photo courtesy of Dave Arnold)
Ultimately, they want to integrate their tool into everyday life. “We want it to be used in a smartphone, or a teleconference call so that individuals could be examined in their home,” O’Connell said. “We could follow them over time and work with clinical teams and hospitals to improve the evaluation of patients and contribute towards an accurate diagnosis.”
Arnold, the patient with early stage Alzheimer’s, sees great promise. “The process of getting a diagnosis is already filled with so much anxiety,” he said. “Anything that can be done to make it easier and less stressful would be a good thing, as long as it’s proven accurate.”