With Lab-Grown Chicken Nuggets, Dumplings, and Burgers, Futuristic Foods Aim to Seem Familiar
Sandhya Sriram is at the forefront of the expanding lab-grown meat industry in more ways than one.
"[Lab-grown meat] is kind of a brave new world for a lot of people, and food isn't something people like being brave about."
She's the CEO and co-founder of one of fewer than 30 companies that are even in this game in the first place. Her Singapore-based company, Shiok Meats, is the only one to pop up in Southeast Asia. And it's the only company in the world attempting to grow crustaceans in a lab, starting with shrimp. This spring, the company debuted a prototype of its shrimp and completed a seed funding round of $4.6 million.
Yet despite all of these wins, Sriram's own mother won't try the company's shrimp. She's a staunch, lifelong vegetarian, adhering to a strict definition of what that means.
"[Lab-grown meat] is kind of a brave new world for a lot of people, and food isn't something people like being brave about. It's really a rather hard-wired thing," says Kate Krueger, the research director at New Harvest, a non-profit accelerator for cellular agriculture (the umbrella field that studies how to grow animal products in the lab, including meat, dairy, and eggs).
It's so hard-wired, in fact, that trends in food inform our species' origin story. In 2017, a group of paleoanthropologists caused an upset when they unearthed fossils in present-day Morocco showing that our earliest human ancestors lived much farther north and 100,000 years earlier than expected -- the remains date back 300,000 years. But the excavation included not only bones and tools; it also painted a clear picture of the prevailing menu at the time: The oldest humans were apparently chomping on tons of gazelle, as well as wildebeest and zebra when they could find them, plus the occasional seasonal ostrich egg.
These were people with a diet shaped by available resources, but also by the ability to cook in the first place. In his book Catching Fire: How Cooking Made Us Human, Harvard primatologist Richard Wrangham writes that the very thing that allowed for the evolution of Homo sapiens was the ability to transform raw ingredients into edible nutrients through cooking.
Today, our behavior and feelings around food are the product of local climate, crops, animal populations, and tools, but also religion, tradition, and superstition. So what happens when you add science to the mix? Turns out, we still trend toward the familiar. The innovations in lab-grown meat that are picking up the most steam are foods like burgers, not meat chips, and salmon, not salmon-cod-tilapia hybrids. It's not for lack of imagination; it's because the industry's practitioners know that a lifetime of food memories is a hard thing to contend with. So far, the nascent lab-grown meat industry is not so much disrupting as being shaped by the oldest culture we have.
Not a single piece of lab-grown meat is commercially available to consumers yet, and already so much ink has been spilled debating if it's really meat, if it's kosher, if it's vegetarian, if it's ethical, if it's sustainable. But whether or not the industry succeeds and sticks around is almost moot -- watching these conversations and innovations unfold serves as a mirror reflecting back who we are, what concerns us, and what we aspire to.
The More Things Change, the More They Stay the Same
The building blocks for making lab-grown meat right now are remarkably similar, no matter what type of animal protein a company is aiming to produce.
First, a small biopsy, about the size of a sesame seed, is taken from a single animal. Then, the muscle cells are isolated and added to a nutrient-dense culture in a bioreactor -- the same tool used to make beer -- where the cells can multiply, grow, and form muscle tissue. This tissue can then be mixed with additives like nutrients, seasonings, binders, and sometimes colors to form a food product. Whether a company is attempting to make chicken, fish, beef, shrimp, or any other animal protein in a lab, the basic steps remain similar. Cells from various animals do behave differently, though, and each company has its own proprietary techniques and tools. Some, for example, use fetal calf serum as their cell culture, while others, aiming for a more vegan approach, eschew it.
"New gadgets feel safest when they remind us of other objects that we already know."
According to Mark Post, who made the first lab-grown hamburger at Maastricht University in the Netherlands in 2013, the cells of just one cow can give rise to 175 million four-ounce burgers. By today's burger-making methods, you'd need to slaughter 440,000 cows for the same result. The projected difference in the purely material efficiency between the two systems is staggering. The environmental impact is hard to predict, though. Some companies claim that their lab-grown meat requires 99 percent less land and 96 percent less water than traditional farming methods -- and that rearing fewer cows, specifically, would reduce methane emissions -- but the energy cost of running a lab-grown-meat production facility at an industrial scale, especially as compared to small-scale, pasture-raised farming, could be problematic. It's difficult to truly measure any of this in a burgeoning industry.
At this point, growing something like an intact shrimp tail or a marbled steak in a lab is still a Holy Grail. It would require reproducing the complex musculoskeletal and vascular structure of meat, not just the cellular basis, and no one's successfully done it yet. Until then, many companies working on lab-grown meat are perfecting mince. Each new company's demo of a prototype food feels distinctly regional, though: At the Disruption in Food and Sustainability Summit in March, Shiok (which is pronounced "shook," and is Singaporean slang for "very tasty and delicious") first shared a prototype of its shrimp as an ingredient in siu-mai, a dumpling of Chinese origin and a fixture at dim sum. JUST, a company based in the U.S., produced a demo chicken nugget.
As Jean Anthelme Brillat-Savarin, the early 19th century founder of the gastronomic essay, famously said, "Show me what you eat, and I'll tell you who you are."
For many of these companies, the baseline animal protein they are trying to innovate on also feels tied to place and culture. When meat comes from a bioreactor, not a farm, the world's largest exporter of seafood could be a landlocked region, and beef could be "reared" in a bayou. Yet the handful of lab-grown fish companies, like Finless Foods and BlueNalu, hug the American coasts; VOW, based in Australia, started making lab-grown kangaroo meat in August; and of course the world's first lab-grown shrimp is in Singapore.
"In the U.S., shrimps are either seen in shrimp cocktail, shrimp sushi, and so on, but [in Singapore] we have everything from shrimp paste to shrimp oil," Sriram says. "It's used in noodles and rice, as flavoring in cup noodles, and in biscuits and crackers as well. It's seen in every form, shape, and size. It just made sense for us to go after a protein that was widely used."
It's tempting to assume that innovating on pillars of cultural significance might be easier if the focus were on a whole new kind of food to begin with, not your popular dim sum items or fast food offerings. But it's proving to be quite the opposite.
"That could have been one direction where [researchers] just said, 'Look, it's really hard to reproduce raw ground beef. Why don't we just make something completely new, like meat chips?'" says Mike Lee, co-founder and co-CEO of Alpha Food Labs, which works on food innovation more broadly. "While that strategy's interesting, I think we've got so many new things to explain to people that I don't know if you want to also explain this new format of food that you've never, ever seen before."
We've seen this same cautious approach to change before in other ways that relate to cooking. Perhaps the most obvious example is the kitchen range. As Bee Wilson writes in her book Consider the Fork: A History of How We Cook and Eat, in the 1880s, convincing ardent coal-range users to switch to newfangled gas was a hard sell. To win them over, inventor William Sugg designed a range that used gas, but aesthetically looked like the coal ones already in fashion at the time -- and which in some visual ways harkened even further back to the days of open-hearth cooking. Over time, gas range designs moved further away from those of the past, but the initial jump was only made possible through familiarity. There's a cleverness to meeting people where they are.
"New gadgets feel safest when they remind us of other objects that we already know," writes Wilson. "It is far harder to accept a technology that is entirely new."
Maybe someday we won't want anything other than meat chips, but not today.
Measuring Success
A 2018 Gallup poll shows that in the U.S., rates of true vegetarianism and veganism have been stagnant for as long as they've been measured. When the poll began in 1999, six percent of Americans were vegetarian, a number that remained steady until 2012, when the number dropped one point. As of 2018, it remained at five percent.
In 2012, when Gallup first measured the percentage of vegans, the rate was two percent. By 2018 it had gone up just one point, to three percent. Increasing awareness of animal welfare, health, and environmental concerns doesn't seem to be incentive enough to convince Americans, en masse, to completely slam the door on a food culture characterized in many ways by its emphasis on traditional meat consumption.
"A lot of consumers get over the ick factor when you tell them that most of the food that you're eating right now has entered the lab at some point."
Wilson writes that "experimenting with new foods has always been a dangerous business. In the wild, trying out some tempting new berries might lead to death. A lingering sense of this danger may make us risk-averse in the kitchen."
That might be one psychologically deep-seated reason that Americans are so resistant to ditching meat altogether. But a middle ground is emerging with the rise of flexitarianism, which aims to reduce reliance on traditional animal products. "Americans are eager to include alternatives to animal products in their diets, but are not willing to give up animal products completely," the same 2018 Gallup poll reported. This may represent the best opportunity for lab-grown meat to wedge itself into the culture.
Quantitatively predicting a population's willingness to try a lab-grown version of its favorite protein is proving a hard thing to measure, however, because it's still science fiction to a regular consumer. Measuring popular opinion of something that doesn't really exist yet is a dubious pastime.
In 2015, University of Wisconsin School of Public Health researchers Linnea Laestadius and Mark Caldwell conducted a study using online comments on articles about lab-grown meat to suss out public response to the food. The results showed a mostly negative attitude, but that was only two years into a field that is six years old today. Already public opinion may have shifted.
Shiok Meats' Sriram and her co-founder Ka Yi Ling have used online surveys to get a sense of the landscape, but they also sometimes take a more direct approach. Every time they give a public talk about their company and their shrimp, they poll the audience before and after, asking, "How many of you are willing to try, and pay, to eat lab-grown meat?"
They consistently find that the percentage of people willing to try rises from 50 percent to 90 percent after hearing their talk, which includes information about the downsides of traditional shrimp farming (for one thing, many shrimp are raised in sewage, and peeled and deveined by slaves) and a bit about how lab-grown animal protein is currently made. I saw this play out myself when Ling spoke at a New Harvest conference in Cambridge, Massachusetts, in July.
"A lot of consumers get over the ick factor when you tell them that most of the food that you're eating right now has entered the lab at some point," Sriram says. "We're not going to grow our meat in the lab always. It's in the lab right now, because we're in R&D. Once we go into manufacturing ... it's going to be a food manufacturing facility, where a lot of food comes from."
The downside of the University of Wisconsin's and Shiok Meats' approaches to capturing public opinion is that each looks at a self-selecting group: Online commenters are often fueled by a need to complain, and anyone attending a talk by the co-founders of a lab-grown meat company likely already has some level of open-mindedness.
So Sriram says that she and Ling are also using another method to assess the landscape, and it's somewhere in the middle. They've been watching public responses to the closest available product to lab-grown meat that's on the market: Impossible Burger. As a 100 percent plant-based burger, it's not quite the same, but this bleedable, searable patty is still very much the product of science and laboratory work. Its remarkable similarity to beef is courtesy of yeast that have been genetically engineered to contain DNA from soy plant roots, which produce a protein called heme as they multiply. This heme is a plant-derived protein that can look and act like the heme found in animal muscle.
So far, the sciencey underpinnings of the burger don't seem to be turning people off. In just four years, it's already found its place within other American food icons. It's readily available everywhere from nationwide Burger Kings to Boston's Warren Tavern, which has been in operation since 1780, is one of the oldest pubs in America, and is even named after the man who sent Paul Revere on his midnight ride. Some people have already grown so attached to the Impossible Burger that they will actually walk out of a restaurant that's out of stock. Demand for the burger is outpacing production.
"Even though [Impossible] doesn't consider their product cellular agriculture, it's part of a spectrum of innovation," Krueger says. "There are novel proteins that you're not going to find in your average food, and there's some cool tech there. So to me, that does show a lot of willingness on people's part to think about trying something new."
The message for those working on animal-based lab-grown meat is clear: People will accept innovation on their favorite food if it tastes good enough and evokes the same emotional connection as the real deal.
"How people talk about lab-grown meat now, it's still a conversation about science, not about culture and emotion," Lee says. But he's confident that the conversation will start to shift in that direction if the companies doing this work can nail the flavor memory, above all.
And then, proving just how much power flavor holds over us, we quickly derail into a conversation about Doritos, which he calls "maniacally delicious." The chips carry no nutritional value whatsoever and are a native product of food engineering and manufacturing — just watch how hard it is for Bon Appétit associate food editor Claire Saffitz to recreate them in the magazine's test kitchen — yet devotees remain unfazed and crunch on.
"It's funny because it shows you that people don't ask questions about how [some foods] are made, so why are they asking so many questions about how lab-grown meat is made?" Lee asks.
For all the hype around Impossible Burger, there are still controversies and hand-wringing around lab-grown meat. Some people are grossed out by the idea, some people are confused, and if you're the U.S. Cattlemen's Association (USCA), you're territorial. Last year, the group sent a petition to the USDA to "exclude products not derived directly from animals raised and slaughtered from the definition of 'beef' and 'meat.'"
"I think we are probably three or four big food safety scares away from everyone, especially younger generations, embracing lab-grown meat as like, 'Science is good; nature is dirty, and can kill you.'"
"I have this working hypothesis that if you look at the nation in 50-year spurts, we revolve back and forth between artisanal, all-natural food that's unadulterated and pure, and food that's empowered by science," Lee says. "Maybe we've only had one lap around the track on that, but I think we are probably three or four big food safety scares away from everyone, especially younger generations, embracing lab-grown meat as like, 'Science is good; nature is dirty, and can kill you.'"
Food culture goes beyond just the ingredients we know and love — it's also about how we interact with them, produce them, and expect them to taste and feel when we bite down. We accept a margin of difference among a fast food burger, a backyard burger from the grill, and a gourmet burger. Maybe someday we'll accept the difference between a burger created by killing a cow and a burger created by biopsying one.
Looking to the Future
Every time we engage with food, "we are enacting a ritual that binds us to the place we live and to those in our family, both living and dead," Wilson writes in Consider the Fork. "Such things are not easily shrugged off. Every time a new cooking technology has been introduced, however useful … it has been greeted in some quarters with hostility and protestations that the old ways were better and safer."
This is why it might be hard for a vegetarian mother to try her daughter's lab-grown shrimp, no matter how ethically it was produced or how awe-inspiring the invention is. Yet food cultures can and do change. "They're not these static things," says Benjamin Wurgaft, a historian whose book Meat Planet: Artificial Flesh and the Future of Food comes out this month. "The real tension seems to be between slow change and fast change."
In fact, the very definition of the word "meat" has never exclusively meant what the USCA wants it to mean. Before the 12th century, when it first appeared in Old English as "mete," it wasn't very specific at all and could be used to describe anything from "nourishment," to "food item," to "fodder," to "sustenance." By the 13th century it had been narrowed down to mean "flesh of warm-blooded animals killed and used as food." And yet the British mincemeat pie lives on as a sweet Christmas treat full of -- to the surprise of many non-Brits -- spiced, dried fruit. Since 1901, we've also used this word with ease as a general term for anything that's substantive -- as in, "the meat of the matter." There is room for yet more definitions to pile on.
"The conversation [about lab-grown meat] has changed remarkably in the last six years," Wurgaft says. "It has become a conversation about whether or not specific companies will bring a product to market, and that's a really different conversation than asking, 'Should we produce meat in the lab?'"
As part of the field research for his book, Wurgaft visited the Rijksmuseum Boerhaave, a Dutch museum that specializes in the history of science and medicine. It was 2015, and he was there to see an exhibit on the future of food. Just two years earlier, Mark Post had made that first lab-grown hamburger about a two-and-a-half hour drive south of the museum. When Wurgaft arrived, he found the novel invention, which Post had donated to the museum, already preserved and served up on a dinner plate, the whole outfit protected by plexiglass.
"They put this in the exhibit as if it were already part of the historical records, which to a historian looked really weird," Wurgaft says. "It looked like somebody taking the most recent supercomputer and putting it in a museum exhibit saying, 'This is the supercomputer that changed everything,' as if you were already 100 years in the future, looking back."
It seemed to symbolize an effort to codify a lab-grown hamburger as a matter of Dutch pride, perhaps someday occupying a place in people's hearts right next to the stroopwafel.
"Who's to say that we couldn't get a whole school of how to cook with lab-grown meat?"
Lee likes to imagine that part of the legacy of lab-grown meat, if it succeeds, will be to inspire entirely new fads in cooking -- a step beyond ones like the crab-filled avocado of the 1960s or the pesto of the 1980s in the U.S.
"[Lab-grown meat] is inherently going to be a different quality than anything we've done with an animal," he says. "Look at every cut [of meat] on the sphere today -- each requires a slightly different cooking method to optimize the flavor of that cut. Who's to say that we couldn't get a whole school of how to cook with lab-grown meat?"
At this point, most of us have no way of trying lab-grown meat. It remains exclusively available through sometimes gimmicky demos reserved for investors and the media. But Wurgaft says the stories we tell about this innovation, the articles we write, the films we make, and yes, even the museum exhibits we curate, all hold as much cultural significance as the product itself might someday.
A vaccine for Lyme disease could be coming. But will patients accept it?
For more than two decades, Marci Flory, a 40-year-old emergency room nurse from Lawrence, Kan., has battled the recurring symptoms of chronic Lyme disease, an illness she believes began after she was bitten by a tick in her teenage years.
Over the years, Flory has been plagued by an array of mysterious ailments, ranging from fatigue to crippling pain in her eyes, joints, and neck, and even postural tachycardia syndrome, or PoTS, an abnormal increase in heart rate after sitting up or standing. Ten years ago, she began to experience neurological symptoms, ranging from brain fog to sudden headaches, and strange episodes of leg weakness that would leave her unable to walk.
“Initially doctors thought I had ALS, or less likely, multiple sclerosis,” she says. “But after repeated MRI scans for a year, they concluded I had a rare neurological condition called acute transverse myelitis.”
But Flory was not convinced. After ordering a variety of private blood tests, she discovered she was infected with a range of bacteria in the genus Borrelia that live in the guts of ticks, the infectious agents responsible for Lyme disease.
“It made sense,” she says. “Looking back, I was bitten in high school and misdiagnosed with mononucleosis. This was probably the start, and my immune system kept it under wraps for a while. The Lyme bacteria can burrow into every tissue in the body, go into cyst form and become dormant before reactivating.”
Cases of Lyme disease are increasing because of changing weather patterns triggered by climate change, meaning that ticks are now found across a much wider geographic range than ever before.
When these species of bacteria are transmitted to humans, they can attack the nervous system, joints, and even internal organs, which can lead to serious health complications such as arthritis, meningitis, and even heart failure. While Lyme disease can sometimes be successfully treated with antibiotics if spotted early on, not everyone responds to these drugs, and for patients who have developed chronic symptoms, there is no known cure. Flory says she knows of fellow Lyme disease patients who have spent hundreds of thousands of dollars seeking treatments.
Concerningly, statistics show that Lyme and other tick-borne diseases are on the rise. Recently released estimates based on health insurance records suggest that at least 476,000 Americans are diagnosed with Lyme disease every year, and many experts believe the true figure is far higher.
The numbers are growing because of changing weather patterns triggered by climate change, meaning that ticks are now found across a much wider geographic range than ever before. Health insurance data shows that cases of Lyme disease have increased fourfold in rural parts of the U.S. over the last 15 years, and by 65 percent in urban regions.
As a result, many scientists who have studied Lyme disease feel that it is paramount to bring some form of protective vaccine to market which can be offered to people living in the most at-risk areas.
“Even the increased awareness for Lyme disease has not stopped the cases,” says Eva Sapi, professor of cellular and molecular biology at the University of New Haven. “Some of these patients are looking for answers for years, running from one doctor to another, so that is obviously a very big cost for our society at so many levels.”
Emerging vaccines – and backlash
But with rising case numbers, interest has grown among the pharmaceutical industry and research communities. Vienna-based biotech Valneva has partnered with Pfizer to take its vaccine – a seasonal jab that offers protection against the six most common strains of Lyme disease in the northern hemisphere – into a Phase III clinical trial, which began in August. Involving 6,000 participants across a number of U.S. states and parts of northern Europe where Lyme disease is endemic, it could lead to a licensed vaccine by 2025 if it proves successful.
“For many years Lyme was considered a small-market vaccine,” explains Monica E. Embers, assistant professor of parasitology at Tulane University in New Orleans. “Now that we know this is a much bigger problem, Pfizer has stepped up to invest in preventing this disease, and other pharmaceutical companies may as well.”
Despite innovations, patient communities and their representatives remain ambivalent about the idea of a vaccine. Some of this skepticism dates back to the failed LYMErix vaccine which was developed in the late 1990s before being withdrawn from the market.
At the same time, scientists at Yale University are developing a messenger RNA vaccine which aims to train the immune system to respond to tick bites by exposing it to 19 proteins found in tick saliva. Whereas the Valneva vaccine targets the bacteria within ticks, the Yale vaccine attempts to provoke an instant and aggressive immune response at the site of the bite. This causes the tick to fall off and limits the potential for transmitting dangerous infections.
But despite these innovations, patient communities and their representatives remain ambivalent about the idea of a vaccine. Some of this skepticism dates back to the failed LYMErix vaccine which was developed in the late 1990s before being withdrawn from the market in 2002 after concerns were raised that it might induce autoimmune reactions in humans.
While this theory was ultimately disproved, the lingering stigma attached to LYMErix meant that most vaccine manufacturers chose to stay away from the disease for many years, something which Gregory Poland, head of the Mayo Clinic’s Vaccine Research Group in Minnesota, describes as a tragedy.
“Since 2002, we have not had a human Lyme vaccine in the U.S. despite the increasing number of cases,” says Poland. “Pretty much everyone in the field thinks they’re ten times higher than the official numbers, so you’re probably talking at least 400,000 each year. It’s an incredible burden but because of concerns about anti-vax protestors, until very recently, no manufacturer has wanted to touch this.”
Such was the backlash surrounding the failed LYMErix program that scientists have even explored the most creative of workarounds for protecting people in tick-populated regions, without needing to actually vaccinate them. One research program at the University of Tennessee came up with the idea of leaving food pellets containing a vaccine in woodland areas with the idea that rodents would eat the pellets, and the vaccine would then kill Borrelia bacteria within any ticks which subsequently fed on the animals.
Even the Pfizer-Valneva vaccine has been cautiously designed to allay any lingering concerns, two decades after LYMErix. “The concept is the same as the original LYMErix vaccine, but it has been made safer by removing regions that had the potential to induce autoimmunity,” says Embers. “There will always be individuals who oppose vaccines, Lyme or otherwise, but it will be a tremendous boost to public health to have the option.”
Vaccine alternatives
Researchers are also considering alternative immunization approaches in case sufficiently large numbers of people reject any Lyme vaccine that gets approved. Scientists at UMass Chan Medical School have developed an artificially generated antibody, administered via an annual injection, that is capable of killing Borrelia bacteria in the guts of ticks before they can enter the human host.
So far, animal studies have shown it to be 100 percent effective, and the scientists have completed a Phase I trial testing it for safety in 48 volunteers in Nebraska. Because this approach provides the antibody directly, rather than triggering the human immune system to produce it as a vaccine would, Embers predicts it could be a viable alternative for the vaccine hesitant, as well as an option for immunocompromised individuals who cannot produce enough of their own antibodies.
At the same time, many patient groups still raise concerns that numerous diagnostic tests for Lyme disease have been reported to have poor accuracy. Without reliable diagnostics, they argue, it is difficult to prove whether vaccines or any other form of immunization actually work. “If the disease is not understood enough to create a more accurate test and a universally accepted treatment protocol, particularly for those who weren’t treated promptly, how can we be sure about the efficacy of a vaccine?” says Natasha Metcalf, co-founder of the organization Lyme Disease UK.
Flory points out that there are so many different types of Borrelia bacteria capable of causing Lyme disease that the immunizations being developed may only stop a proportion of cases. In addition, she says that chronic Lyme patients often report a myriad of co-infections, which remain poorly understood and are likely also involved in the disease process.
Marci Flory undergoes an infusion in an attempt to treat her Lyme disease symptoms. (Photo: Marci Flory)
“I would love to see an effective Lyme vaccine but I have my reservations,” she says. “I am infected with four types of Borrelia bacteria, plus many co-infections – Babesia, Bartonella, Ehrlichia, Rickettsia, and Mycoplasma – all from a single Douglas County, Kansas, tick bite. Lyme never travels alone, and the vaccine won’t protect against all the many strains of Borrelia and co-infections.”
Valneva CEO Thomas Lingelbach admits that the Pfizer-Valneva vaccine is not perfect, but predicts that it will still have significant impact if approved.
“We expect the vaccine to have 75 percent plus efficacy,” he says. “There is this legacy around the old Lyme vaccines, but the world is very, very different today. The number of clinical manifestations known to be caused by infection with Lyme Borreliosis has significantly increased, and the understanding around severity has certainly increased.”
Embers agrees that while it will still be important for doctors to monitor for other tick-borne infections which are not necessarily covered by the vaccine, having any clinically approved jab would still represent a major step forward in the fight against the disease.
“I think that any vaccine must be properly vetted, and these companies are performing extensive clinical trials to do just that,” she says. “Lyme is the most common tick-borne disease in the U.S. so the public health impact could be significant. However, clinicians and the general public must remain aware of all of the other tick-borne diseases such as Babesia and Anaplasma, and continue to screen for those when a tick bite is suspected.”
Two years, six million deaths, and still counting: scientists are searching for answers to prevent another COVID-19-like tragedy from occurring again. And it’s a gargantuan task.
Our disturbed ecosystems are creating more favorable conditions for the spread of infectious disease. Global warming, deforestation, rising sea levels and flooding have contributed to a rise in mosquito-borne infections and longer tick seasons. Disease-carrying animals are in closer range to other species and humans as they migrate to escape the heat. Bats are thought to have carried the SARS-CoV-2 virus to Wuhan, either directly or through another host animal, but thousands of novel viruses are lurking within other wild creatures.
Understanding how climate change contributes to the spread of disease is critical in predicting and thwarting future calamities. But the problem is that predictive models aren’t yet where they need to be for forecasting with certainty beyond the next year, as we can for weather, for instance.
The association between climate and infectious disease is poorly understood, says Irina Tezaur, a computational scientist at Sandia National Laboratories. “Correlations have been observed but it’s not known if these correlations translate to causal relationships.”
To make accurate longer-term predictions, scientists need more empirical data, multiple datasets specific to locations and diseases, and the ability to calculate risks that depend on unpredictable nature and human behavior. Another obstacle is that climate scientists and epidemiologists are not collaborating effectively, so some researchers are calling for a multidisciplinary approach, a new field called Outbreak Science.
Climate scientists are far ahead of epidemiologists in gathering essential data.
Earth System Models—combining the interactions of atmosphere, ocean, land, ice and biosphere—have been in place for two decades to monitor the effects of global climate change. These models must be combined with epidemiological and human model research, areas that are easily skewed by unpredictable elements, from extreme weather events to public environmental policy shifts.
“There is never just one driver in tracking the impact of climate on infectious disease,” says Joacim Rocklöv, a professor at the Heidelberg Institute of Global Health & Heidelberg Interdisciplinary Centre for Scientific Computing in Germany. Rocklöv has studied how climate affects vector-borne diseases—those transmitted to humans by mosquitoes, ticks or fleas. “You need to disentangle the variables to find out how much difference climate makes to the outcome and how much is other factors.” Determinants from deforestation to population density to lack of healthcare access influence the spread of disease.
Even though climate change is not the primary driver of infectious disease today, it poses a major threat to public health in the future, says Rocklöv.
The promise of predictive modeling
“Models are simplifications of a system we’re trying to understand,” says Jeremy Hess, who directs the Center for Health and the Global Environment at University of Washington in Seattle. “They’re tools for learning that improve over time with new observations.”
Accurate predictions depend on high-quality, long-term observational data but models must start with assumptions. “It’s not possible to apply an evidence-based approach for the next 40 years,” says Rocklöv. “Using models to experiment and learn is the only way to figure out what climate means for infectious disease. We collect data and analyze what already happened. What we do today will not make a difference for several decades.”
To improve accuracy, scientists develop and draw on thousands of models to cover as many scenarios as possible. One model may capture the dynamics of disease transmission while another focuses on immunity data or ocean influences or seasonal components of a virus. Further, each model needs to be disease-specific and often location-specific to be useful.
“All models have biases so it’s important to use a suite of models,” Tezaur stresses.
The modeling scientist chooses the drivers of change and parameters based on the question explored. The drivers could be increased precipitation, poverty or mosquito prevalence, for instance. Later, the scientist may need to isolate the effect of one driver, which will require yet another model.
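To make this concrete, here is a minimal, purely illustrative sketch of that "one driver at a time" idea: a hypothetical log-linear risk model with three invented drivers (temperature, rainfall, and a poverty index), re-run with two drivers held fixed to isolate the effect of the third. None of the coefficients come from a real study.

```python
import math

# Toy illustration (not a real epidemiological model): predicted disease
# risk as a function of three hypothetical drivers. All coefficients
# below are made up for demonstration.

def predicted_risk(temp_c, rain_mm, poverty_index):
    """Hypothetical log-linear risk model with three drivers."""
    log_risk = 0.08 * temp_c + 0.02 * rain_mm + 1.5 * poverty_index - 2.0
    return math.exp(log_risk)

baseline = predicted_risk(temp_c=24, rain_mm=60, poverty_index=0.3)

# To isolate the effect of temperature, hold the other drivers fixed
# and vary only temperature (a crude single-driver experiment).
warmer = predicted_risk(temp_c=26, rain_mm=60, poverty_index=0.3)
temperature_effect = warmer / baseline  # relative change from +2 °C alone

print(f"baseline risk: {baseline:.2f}")
print(f"relative effect of +2 °C: {temperature_effect:.2f}x")
```

Real models are far richer, but the pattern is the same: vary one driver while holding the rest fixed, and compare the outputs.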
There have been some related successes, such as the latest models for mosquito-borne diseases like dengue, Zika and malaria, as well as those for flu and tick-borne diseases, says Hess.
Rocklöv was part of a research team that used test data from 2018 and 2019 to identify regions at risk for West Nile virus outbreaks. Using AI, scientists were able to forecast outbreaks of the virus for the entire transmission season in Europe. “In the end, we want data-driven models; that’s what AI can accomplish,” says Rocklöv. Other researchers are making important headway in creating a framework to predict novel host–parasite interactions.
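To give a flavor of what "data-driven" means here (the actual West Nile work used far more sophisticated AI), a toy forecast can be built from invented seasonal climate records: learn the average climate profile of past outbreak and non-outbreak seasons, then classify the coming season by which profile it sits closer to. Every number and feature below is fabricated for illustration.

```python
import math

# Entirely synthetic data: one row per past season at a hypothetical site,
# as (mean spring temperature °C, spring rainfall mm), labelled 1 if an
# outbreak followed that season, else 0. Values are invented.
seasons = [((16.0, 40.0), 0), ((18.5, 55.0), 0), ((21.0, 70.0), 1),
           ((23.5, 65.0), 1), ((15.0, 30.0), 0), ((22.0, 80.0), 1),
           ((17.0, 45.0), 0), ((24.0, 75.0), 1)]

def centroid(rows):
    """Mean feature vector of a list of (temp, rain) tuples."""
    n = len(rows)
    return (sum(r[0] for r in rows) / n, sum(r[1] for r in rows) / n)

# "Train": compute the average climate profile of outbreak and
# non-outbreak seasons from the historical records.
outbreak_centroid = centroid([f for f, label in seasons if label == 1])
quiet_centroid = centroid([f for f, label in seasons if label == 0])

def forecast(temp, rain):
    """Predict 1 (outbreak) if the coming season's climate is closer to
    the historical outbreak profile, else 0."""
    d_out = math.dist((temp, rain), outbreak_centroid)
    d_quiet = math.dist((temp, rain), quiet_centroid)
    return 1 if d_out < d_quiet else 0

# Forecast for a hypothetical warm, wet upcoming season.
print(forecast(23.0, 72.0))  # → 1 (outbreak-like conditions)
```

The real systems learn from many more features and locations, but the core loop is the same: past seasons' data in, next season's risk out.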
Modeling studies can run months, years or decades. “The scientist is working with layers of data. The challenge is how to transform and couple different models together on a planetary scale,” says Jeanne Fair, a scientist at Los Alamos National Laboratory, Biosecurity and Public Health, in New Mexico.
Disease forecasting will require a significant investment into the infrastructure needed to collect data about the environment, vectors, and hosts at all spatial and temporal resolutions.
And it’s a constantly changing picture. A modeling study in an April 2022 issue of Nature predicted that thousands of animal species will migrate to cooler locales as temperatures rise. This means that various species will come into closer contact with people and other mammals for the first time, which is likely to increase the risk of emerging infectious diseases transmitted from animals to humans, especially in Africa and Asia.
Other things can happen too. Global warming could precipitate viral mutations or new infectious diseases that don’t respond to antimicrobial treatments. Insecticide-resistant mosquitoes could evolve. Weather-related food insecurity could increase malnutrition and weaken people’s immune systems. And the impact of an epidemic will be worse if it co-occurs during a heatwave, flood, or drought, says Hess.
The devil is in the climate variables
Solid predictions about the future of climate and disease are not possible with so many uncertainties. Difficult-to-measure drivers must be added to the empirical model mix, such as land and water use, ecosystem changes or the public’s willingness to accept a vaccine or practice social distancing. Nor is there any precedent for calculating the effect of climate changes that are accelerating faster than ever before.
The most critical climate variables thought to influence disease spread are temperature, precipitation, humidity, sunshine and wind, according to Tezaur’s research. And then there are variables within variables. Influenza scientists, for example, found that warm winters were predictors of the most severe flu seasons in the following year.
The human factor may be the most challenging determinant. To what degree will people curtail greenhouse gas emissions, if at all? The swift development of effective COVID-19 vaccines was a game-changer, but will scientists be able to repeat it during the next pandemic? Plus, no model could predict the amount of internet-fueled COVID-19 misinformation, Fair noted. To tackle this issue, infectious disease teams are looking to include more sociologists and political scientists in their modeling.
Addressing the gaps
Currently, researchers are focusing on the near future, predicting for next year, says Fair. “When it comes to long-term, that’s where we have the most work to do.” While scientists cannot foresee how political influences and misinformation spread will affect models, they are positioned to make headway in collecting and assessing new data streams that have never been merged.
Disease forecasting will require a significant investment into the infrastructure needed to collect data about the environment, vectors, and hosts at all spatial and temporal resolutions, Fair and her co-authors stated in their recent study. For example, real-time data on mosquito prevalence and diversity in various settings and times is limited or non-existent. Fair also would like to see standards set in mosquito data collection in every country. “Standardizing across the US would be a huge accomplishment,” she says.
Understanding how climate change contributes to the spread of disease is critical for thwarting future calamities.
Jeanne Fair
Hess points to a dearth of data in local and regional datasets about how extreme weather events play out in different geographic locations. His research indicates that Africa and the Middle East experienced substantial climate shifts, for example, but are underrepresented in the evidentiary database, which limits conclusions. “A model for dengue may be good in Singapore but not necessarily in Port-au-Prince,” Hess explains. And, he adds, scientists need a way of evaluating models for how effective they are.
The hope, Rocklöv says, is that in the future we will have data-driven models rather than theoretical ones. In turn, sharper statistical analyses can inform resource allocation and intervention strategies to prevent outbreaks.
Most of all, experts emphasize that epidemiologists and climate scientists must stop working in silos. If scientists can successfully merge epidemiological data with climatic, biological, environmental, ecological and demographic data, they will make better predictions about complex disease patterns. A lack of modeling “cross talk” among disciplines and, in some cases, refusal by countries to release data are hindering discovery and advances.
It’s time for bold transdisciplinary action, says Hess. He points to initiatives that need funding: disease surveillance and control; developing and testing interventions; community education and social mobilization; decision-support analytics to predict when and where infections will emerge; advanced methodologies to improve modeling; and training scientists in data management and integrated surveillance.
Establishing a new field of Outbreak Science to coordinate collaboration would accelerate progress. Investment in decision-support modeling tools for public health teams, policy makers, and other long-term planning stakeholders is imperative, too. We need to invest in programs that encourage people from climate modeling and epidemiology to work together in a cohesive fashion, says Tezaur. Joining forces is the only way to solve the formidable challenges ahead.
This article originally appeared in One Health/One Planet, a single-issue magazine that explores how climate change and other environmental shifts are increasing vulnerabilities to infectious diseases by land and by sea. The magazine probes how scientists are making progress with leaders in other fields toward solutions that embrace diverse perspectives and the interconnectedness of all lifeforms and the planet.