With Lab-Grown Chicken Nuggets, Dumplings, and Burgers, Futuristic Foods Aim to Seem Familiar
Sandhya Sriram is at the forefront of the expanding lab-grown meat industry in more ways than one.
She's the CEO and co-founder of one of the fewer than 30 companies that are even in this game in the first place. Her Singapore-based company, Shiok Meats, is the only one to pop up in Southeast Asia, and the only one in the world attempting to grow crustaceans in a lab, starting with shrimp. This spring, the company debuted a prototype of its shrimp and completed a seed funding round of $4.6 million.
Yet despite all of these wins, Sriram's own mother won't try the company's shrimp. She's a staunch, lifelong vegetarian, adhering to a strict definition of what that means.
"[Lab-grown meat] is kind of a brave new world for a lot of people, and food isn't something people like being brave about. It's really a rather hard-wired thing," says Kate Krueger, the research director at New Harvest, a non-profit accelerator for cellular agriculture (the umbrella field that studies how to grow animal products in the lab, including meat, dairy, and eggs).
It's so hard-wired, in fact, that trends in food inform our species' origin story. In 2017, a group of paleoanthropologists caused an upset when they unearthed fossils in present-day Morocco showing that our earliest human ancestors lived much farther north and 100,000 years earlier than expected -- the remains date back 300,000 years. But the excavation not only included bones and tools; it also painted a clear picture of the prevailing menu at the time: The oldest humans were apparently chomping on tons of gazelle, as well as wildebeest and zebra when they could find them, plus the occasional seasonal ostrich egg.
These were people with a diet shaped by available resources, but also by the ability to cook in the first place. In his book Catching Fire: How Cooking Made Us Human, Harvard primatologist Richard Wrangham writes that the very thing that allowed for the evolution of Homo sapiens was the ability to transform raw ingredients into edible nutrients through cooking.
Today, our behavior and feelings around food are the product of local climate, crops, animal populations, and tools, but also religion, tradition, and superstition. So what happens when you add science to the mix? It turns out, we still trend toward the familiar. The innovations in lab-grown meat that are picking up the most steam are foods like burgers, not meat chips, and salmon, not salmon-cod-tilapia hybrids. It's not for lack of imagination; it's because the industry's practitioners know that a lifetime of food memories is a hard thing to contend with. So far, the nascent lab-grown meat industry is not so much disrupting food culture -- the oldest culture we have -- as being shaped by it.
Not a single piece of lab-grown meat is commercially available to consumers yet, and already so much ink has been spilled debating if it's really meat, if it's kosher, if it's vegetarian, if it's ethical, if it's sustainable. But whether or not the industry succeeds and sticks around is almost moot -- watching these conversations and innovations unfold serves as a mirror reflecting back who we are, what concerns us, and what we aspire to.
The More Things Change, the More They Stay the Same
The building blocks for making lab-grown meat right now are remarkably similar, no matter what type of animal protein a company is aiming to produce.
First, a small biopsy, about the size of a sesame seed, is taken from a single animal. Then, the muscle cells are isolated and added to a nutrient-dense culture medium in a bioreactor -- the same kind of vessel used to make beer -- where the cells can multiply, grow, and form muscle tissue. This tissue can then be mixed with additives like nutrients, seasonings, binders, and sometimes colors to form a food product. Whether a company is attempting to make chicken, fish, beef, shrimp, or any other animal protein in a lab, the basic steps remain similar. Cells from various animals do behave differently, though, and each company has its own proprietary techniques and tools. Some, for example, use fetal calf serum in their culture medium, while others, aiming for a more vegan approach, eschew it.
According to Mark Post, who made the first lab-grown hamburger at Maastricht University in the Netherlands in 2013, the cells of just one cow can give rise to 175 million four-ounce burgers. By today's available burger-making methods, you'd need to slaughter 440,000 cows for the same result -- roughly 400 burgers per cow. The projected difference in purely material efficiency between the two systems is staggering. The environmental impact is harder to predict, though. Some companies claim that their lab-grown meat requires 99 percent less land and 96 percent less water than traditional farming methods -- and that rearing fewer cows, specifically, would reduce methane emissions -- but the energy cost of running a lab-grown-meat production facility at industrial scale, especially as compared to small-scale, pasture-raised farming, could be problematic. It's difficult to truly measure any of this in a burgeoning industry.
At this point, growing something like an intact shrimp tail or a marbled steak in a lab is still a Holy Grail. It would require reproducing the complex musculoskeletal and vascular structure of meat, not just its cellular basis, and no one has successfully done it yet. Until then, many companies working on lab-grown meat are perfecting mince. Each new company's demo of a prototype food feels distinctly regional, though: At the Disruption in Food and Sustainability Summit in March, Shiok (which is pronounced "shook," and is Singaporean slang for "very tasty and delicious") first shared a prototype of its shrimp as an ingredient in siu-mai, a dumpling of Chinese origin and a fixture at dim sum. JUST, a company based in the U.S., produced a demo chicken nugget.
As Jean Anthelme Brillat-Savarin, the French gastronome credited with founding the gastronomic essay, famously said, "Show me what you eat, and I'll tell you who you are."
For many of these companies, the baseline animal protein they are trying to innovate also feels tied to place and culture: When meat comes from a bioreactor, not a farm, the world's largest exporter of seafood could be a landlocked region, and beef could be "reared" in a bayou, yet the handful of lab-grown fish companies, like Finless Foods and BlueNalu, hug the American coasts; VOW, based in Australia, started making lab-grown kangaroo meat in August; and of course the world's first lab-grown shrimp is in Singapore.
"In the U.S., shrimps are either seen in shrimp cocktail, shrimp sushi, and so on, but [in Singapore] we have everything from shrimp paste to shrimp oil," Sriram says. "It's used in noodles and rice, as flavoring in cup noodles, and in biscuits and crackers as well. It's seen in every form, shape, and size. It just made sense for us to go after a protein that was widely used."
It's tempting to assume that innovating on pillars of cultural significance might be easier if the focus were on a whole new kind of food to begin with, not your popular dim sum items or fast food offerings. But it's proving to be quite the opposite.
"That could have been one direction where [researchers] just said, 'Look, it's really hard to reproduce raw ground beef. Why don't we just make something completely new, like meat chips?'" says Mike Lee, co-founder and co-CEO of Alpha Food Labs, which works on food innovation more broadly. "While that strategy's interesting, I think we've got so many new things to explain to people that I don't know if you want to also explain this new format of food that you've never, ever seen before."
We've seen this same cautious approach to change before in other ways that relate to cooking. Perhaps the most obvious example is the kitchen range. As Bee Wilson writes in her book Consider the Fork: A History of How We Cook and Eat, in the 1880s, convincing ardent coal-range users to switch to newfangled gas was a hard sell. To win them over, inventor William Sugg designed a range that used gas, but aesthetically looked like the coal ones already in fashion at the time -- and which in some visual ways harkened even further back to the days of open-hearth cooking. Over time, gas range designs moved further away from those of the past, but the initial jump was only made possible through familiarity. There's a cleverness to meeting people where they are.
"New gadgets feel safest when they remind us of other objects that we already know," writes Wilson. "It is far harder to accept a technology that is entirely new."
Maybe someday we won't want anything other than meat chips, but not today.
Measuring Success
A 2018 Gallup poll shows that in the U.S., rates of true vegetarianism and veganism have been stagnant for as long as they've been measured. When the poll began in 1999, six percent of Americans were vegetarian, a number that remained steady until 2012, when the number dropped one point. As of 2018, it remained at five percent.
In 2012, when Gallup first measured the percentage of vegans, the rate was two percent. By 2018 it had gone up just one point, to three percent. Increasing awareness of animal welfare, health, and environmental concerns doesn't seem to be incentive enough to convince Americans, en masse, to completely slam the door on a food culture characterized in many ways by its emphasis on traditional meat consumption.
Wilson writes that "experimenting with new foods has always been a dangerous business. In the wild, trying out some tempting new berries might lead to death. A lingering sense of this danger may make us risk-averse in the kitchen."
That might be one psychologically deep-seated reason that Americans are so reluctant to ditch meat altogether. But a middle ground is emerging with the rise of flexitarianism, which aims to reduce reliance on traditional animal products. "Americans are eager to include alternatives to animal products in their diets, but are not willing to give up animal products completely," the same 2018 Gallup poll reported. This may represent the best opportunity for lab-grown meat to wedge itself into the culture.
Quantitatively predicting a population's willingness to try a lab-grown version of its favorite protein is proving difficult, however, because to the average consumer the product is still science fiction. Measuring popular opinion of something that doesn't really exist yet is a dubious pastime.
In 2015, University of Wisconsin School of Public Health researchers Linnea Laestadius and Mark Caldwell conducted a study using online comments on articles about lab-grown meat to suss out public response to the food. The results showed a mostly negative attitude, but that was only two years into a field that is six years old today; public opinion may already have shifted.
Shiok Meats' Sriram and her co-founder Ka Yi Ling have used online surveys to get a sense of the landscape, but sometimes they also take a more direct approach. Every time they give a public talk about their company and their shrimp, they poll their audience before and after the talk with the question, "How many of you are willing to try, and pay, to eat lab-grown meat?"
They consistently find that the percentage of people willing to try goes up from 50 to 90 percent after hearing their talk, which includes information about the downsides of traditional shrimp farming (for one thing, many shrimp are raised in sewage, and peeled and deveined by slaves) and a bit about how lab-grown animal protein is currently made. I saw this play out myself when Ling spoke at a New Harvest conference in Cambridge, Massachusetts in July.
"A lot of consumers get over the ick factor when you tell them that most of the food that you're eating right now has entered the lab at some point," Sriram says. "We're not going to grow our meat in the lab always. It's in the lab right now, because we're in R&D. Once we go into manufacturing ... it's going to be a food manufacturing facility, where a lot of food comes from."
The downside of the University of Wisconsin's and Shiok Meats' approaches to capturing public opinion is that they each look at self-selecting groups: Online commenters are often fueled by a need to complain, and anyone attending a talk by the co-founders of a lab-grown meat company likely already has some level of open-mindedness.
So Sriram says that she and Ling are also using a third method to assess the landscape, one somewhere in the middle: watching public responses to the closest product to lab-grown meat that's already on the market, the Impossible Burger. As a 100 percent plant-based burger, it's not quite the same, but this bleedable, searable patty is still very much the product of science and laboratory work. Its remarkable similarity to beef is courtesy of yeast that have been genetically engineered with DNA from soy plant roots; as they multiply, the yeast produce leghemoglobin, a protein that carries heme. This plant-derived heme can look and act like the heme found in animal muscle.
So far, the sciencey underpinnings of the burger don't seem to be turning people off. In just four years, it's already found its place within other American food icons. It's readily available everywhere from nationwide Burger Kings to Boston's Warren Tavern, which has been in operation since 1780, is one of the oldest pubs in America, and is even named after the man who sent Paul Revere on his midnight ride. Some people have already grown so attached to the Impossible Burger that they will actually walk out of a restaurant that's out of stock. Demand for the burger is outpacing production.
"Even though [Impossible] doesn't consider their product cellular agriculture, it's part of a spectrum of innovation," Krueger says. "There are novel proteins that you're not going to find in your average food, and there's some cool tech there. So to me, that does show a lot of willingness on people's part to think about trying something new."
The message for those working on animal-based lab-grown meat is clear: People will accept innovation on their favorite food if it tastes good enough and evokes the same emotional connection as the real deal.
"How people talk about lab-grown meat now, it's still a conversation about science, not about culture and emotion," Lee says. But he's confident that the conversation will start to shift in that direction if the companies doing this work can nail the flavor memory, above all.
And then, proving how much power flavor holds over us, we quickly derail into a conversation about Doritos, which he calls "maniacally delicious." The chips carry no nutritional value whatsoever and are a native product of food engineering and manufacturing -- just watch how hard it is for Bon Appétit associate food editor Claire Saffitz to recreate them in the magazine's test kitchen -- yet devotees remain unfazed and crunch on.
"It's funny because it shows you that people don't ask questions about how [some foods] are made, so why are they asking so many questions about how lab-grown meat is made?" Lee asks.
For all the hype around Impossible Burger, there are still controversies and hand-wringing around lab-grown meat. Some people are grossed out by the idea, some are confused, and if you're the U.S. Cattlemen's Association (USCA), you're territorial. Last year, the group sent a petition to the USDA to "exclude products not derived directly from animals raised and slaughtered from the definition of 'beef' and 'meat.'"
"I have this working hypothesis that if you look at the nation in 50-year spurts, we revolve back and forth between artisanal, all-natural food that's unadulterated and pure, and food that's empowered by science," Lee says. "Maybe we've only had one lap around the track on that, but I think we are probably three or four big food safety scares away from everyone, especially younger generations, embracing lab-grown meat as like, 'Science is good; nature is dirty, and can kill you.'"
Food culture goes beyond just the ingredients we know and love — it's also about how we interact with them, produce them, and expect them to taste and feel when we bite down. We accept a margin of difference among a fast food burger, a backyard burger from the grill, and a gourmet burger. Maybe someday we'll accept the difference between a burger created by killing a cow and a burger created by biopsying one.
Looking to the Future
Every time we engage with food, "we are enacting a ritual that binds us to the place we live and to those in our family, both living and dead," Wilson writes in Consider the Fork. "Such things are not easily shrugged off. Every time a new cooking technology has been introduced, however useful … it has been greeted in some quarters with hostility and protestations that the old ways were better and safer."
This is why it might be hard for a vegetarian mother to try her daughter's lab-grown shrimp, no matter how ethically it was produced or how awe-inspiring the invention is. Yet food cultures can and do change. "They're not these static things," says Benjamin Wurgaft, a historian whose book Meat Planet: Artificial Flesh and the Future of Food comes out this month. "The real tension seems to be between slow change and fast change."
In fact, the very definition of the word "meat" has never exclusively meant what the USCA wants it to mean. Before the 12th century, when it first appeared in Old English as "mete," it wasn't very specific at all and could be used to describe anything from "nourishment," to "food item," to "fodder," to "sustenance." By the 13th century it had been narrowed down to mean "flesh of warm-blooded animals killed and used as food." And yet the British mincemeat pie lives on as a sweet Christmas treat full of -- to the surprise of many non-Brits -- spiced, dried fruit. Since 1901, we've also used this word with ease as a general term for anything that's substantive -- as in, "the meat of the matter." There is room for yet more definitions to pile on.
"The conversation [about lab-grown meat] has changed remarkably in the last six years," Wurgaft says. "It has become a conversation about whether or not specific companies will bring a product to market, and that's a really different conversation than asking, 'Should we produce meat in the lab?'"
As part of the field research for his book, Wurgaft visited the Rijksmuseum Boerhaave, a Dutch museum that specializes in the history of science and medicine. It was 2015, and he was there to see an exhibit on the future of food. Just two years earlier, Mark Post had made that first lab-grown hamburger about a two-and-a-half hour drive south of the museum. When Wurgaft arrived, he found the novel invention, which Post had donated to the museum, already preserved and served up on a dinner plate, the whole outfit protected by plexiglass.
"They put this in the exhibit as if it were already part of the historical records, which to a historian looked really weird," Wurgaft says. "It looked like somebody taking the most recent supercomputer and putting it in a museum exhibit saying, 'This is the supercomputer that changed everything,' as if you were already 100 years in the future, looking back."
It seemed to symbolize an effort to codify a lab-grown hamburger as a matter of Dutch pride, perhaps someday occupying a place in people's hearts right next to the stroopwafel.
Lee likes to imagine that part of the legacy of lab-grown meat, if it succeeds, will be to inspire entirely new fads in cooking -- a step beyond ones like the crab-filled avocado of the 1960s or the pesto of the 1980s in the U.S.
"[Lab-grown meat] is inherently going to be a different quality than anything we've done with an animal," he says. "Look at every cut [of meat] on the sphere today -- each requires a slightly different cooking method to optimize the flavor of that cut. Who's to say that we couldn't get a whole school of how to cook with lab-grown meat?"
At this point, most of us have no way of trying lab-grown meat. It remains exclusively available through sometimes gimmicky demos reserved for investors and the media. But Wurgaft says the stories we tell about this innovation, the articles we write, the films we make, and yes, even the museum exhibits we curate, all hold as much cultural significance as the product itself might someday.
Can tech help prevent the insect apocalypse?
This article originally appeared in One Health/One Planet, a single-issue magazine that explores how climate change and other environmental shifts are making us more vulnerable to infectious diseases by land and by sea - and how scientists are working on solutions.
On a warm summer day, forests, meadows, and riverbanks should be abuzz with insects—from butterflies to beetles and bees. But bugs aren’t as abundant as they used to be, and that’s not a plus for people and the planet, scientists say. The declining numbers of insects, coupled with climate change, can have devastating effects for people in more ways than one. “Insects have been around for a very long time and can live well without humans, but humans cannot live without insects and the many services they provide to us,” says Philipp Lehmann, a researcher in the Department of Zoology at Stockholm University in Sweden. Their decline is not just bad, Lehmann adds. “It’s devastating news for humans.”
Insects and other invertebrates are the most diverse organisms on the planet. They fill most niches in terrestrial and aquatic environments and drive ecosystem functions. Many insects are also economically vital because they pollinate crops that humans depend on for food, including cereals, vegetables, fruits, and nuts. A paper published in PNAS notes that insects alone are worth more than $70 billion a year to the U.S. economy. In places where pollinators like honeybees are in decline, farmers now buy them from rearing facilities at steep prices rather than relying on “Mother Nature.”
And because many insects serve as food for other species—bats, birds and freshwater fish—they’re an integral part of the ecosystem’s food chain. “If you like to eat good food, you should thank an insect,” says Scott Hoffman Black, an ecologist and executive director of the Xerces Society for Invertebrate Conservation in Portland, Oregon. “And if you like birds in your trees and fish in your streams, you should be concerned with insect conservation.”
Deforestation, urbanization, and agricultural spread have eaten away at large swaths of insect habitat. Poorly controlled insecticide use, which harms unintended species, and the proliferation of invasive insect species that disrupt native ecosystems compound the problem.
“There is not a single reason why insects are in decline,” says Jessica L. Ware, associate curator in the Division of Invertebrate Zoology at the American Museum of Natural History in New York, and president of the Entomological Society of America. “There are over one million described insect species, occupying different niches and responding to environmental stressors in different ways.”
Jessica Ware, an entomologist at the American Museum of Natural History, is using DNA methods to monitor insects. (Credit: D. Finnin/AMNH)
In addition to habitat loss fueling the decline in insect populations, the other “major drivers” Ware identified are invasive species, climate change, pollution, and fluctuating levels of nitrogen, which play a major role in the lifecycle of plants, some of which serve as insect habitats and others as their food. “The causes of world insect population declines are, unfortunately, very easy to link to human activities,” Lehmann says.
Climate change will undoubtedly make the problem worse. “As temperatures start to rise, it can essentially make it too hot for some insects to survive,” says Emily McDermott, an assistant professor in the Department of Entomology and Plant Pathology at the University of Arkansas. “Conversely in other areas, it could potentially also allow other insects to expand their ranges.”
Without Pollinators Humans Will Starve
We may not think much of our planet’s getting warmer by only one degree Celsius, but it can spell catastrophe for many insects, plants, and animals, because it’s often accompanied by less rainfall. “Changes in precipitation patterns will have cascading consequences across the tree of life,” says David Wagner, a professor of ecology and evolutionary biology at the University of Connecticut. Insects, in particular, are “very vulnerable” because “they’re small and susceptible to drying.”
For instance, droughts have put the monarch butterfly at risk of being unable to find nectar to “recharge its engine” as it migrates from Canada and New England to Mexico for winter, where it enters a hibernation state until it journeys back in the spring. “The monarch is an iconic and a much-loved insect,” whose migration “is imperiled by climate change,” Wagner says.
Warming and drying trends in the Western United States are perhaps having an even more severe impact on insects than in the eastern region. As a result, “we are seeing fewer individual butterflies per year,” says Matt Forister, a professor of insect ecology at the University of Nevada, Reno.
There are hundreds of butterfly species in the United States and thousands in the world. They are pollinators and can serve as good indicators of other species’ health. “Although butterflies are only one group among many important pollinators, in general we assume that what’s bad for butterflies is probably bad for other insects,” says Forister, whose research focuses on butterflies. Climate change and habitat destruction are also wreaking havoc on plants, with further indirect effects on the caterpillars and butterflies that depend on them.
Different insect species have different levels of sensitivity to environmental changes. For example, one-half of the bumblebee species in the United States are showing declines, whereas the other half are not, says Christina Grozinger, a professor of entomology at the Pennsylvania State University. Some species of bumblebees are even increasing their range, seemingly resilient to environmental changes. But other pollinators are dwindling to the point that farmers have to buy them from rearing facilities, as is the case for the California almond industry. “This is a massive cost to the farmer, which could be provided for free, in case the local habitats supported these pollinators,” Lehmann says.
For bees and other insects, climate change can harm the plants they depend on for survival or have a negative impact on the insects directly. Overly rainy and hot conditions may limit flowering in plants or reduce the ability of a pollinator to forage and feed, which then decreases their reproductive success, resulting in dwindling populations, Grozinger explains.
“Nutritional deprivation can also make pollinators more sensitive to viruses and parasites and therefore cause disease spread,” she says. “There are many ways that climate change can reduce our pollinator populations and make it more difficult to grow the many fruit, vegetable and nut crops that depend on pollinators.”
Disease-Causing Insects Can Bring More Outbreaks
While some much-needed insects are declining, certain disease-causing species may be spreading and proliferating, which is another reason for human concern. Many mosquito types spread malaria, Zika virus, West Nile virus, and a brain infection called equine encephalitis, along with other diseases as well as heartworms in dogs, says Michael Sabourin, president of the Vermont Entomological Society. An animal health specialist for the state, Sabourin conducts vector surveys that identify ticks and mosquitoes.
Scientists refer to disease-carrying insects as vector species and, while there’s a limited number of them, many of these infections can be deadly. Fleas were a well-known vector for the bubonic plague, while kissing bugs are a vector for Chagas disease, a potentially life-threatening parasitic illness in humans, dogs, and other mammals, Sabourin says.
As the planet heats up, some of the creepy crawlers are able to survive milder winters or move up north. Warmer temperatures and a shorter snow season have spawned an increasing abundance of ticks in Maine, including the blacklegged tick (Ixodes scapularis), known to transmit Lyme disease, says Sean Birkel, an assistant professor in the Climate Change Institute and Cooperative Extension at the University of Maine.
Coupled with more frequent and heavier precipitation, rising temperatures bring a longer warm season that can also lead to a longer period of mosquito activity. “While other factors may be at play, climate change affects important underlying conditions that can, in turn, facilitate the spread of vector-borne disease,” Birkel says.
For example, if mosquitoes are finding fewer of their preferred food sources, they may bite humans more. Both male and female mosquitoes feed on sugar as part of their normal behavior, but if they aren’t eating their fill, they may become more bloodthirsty. One recent paper found that sugar-deprived Anopheles gambiae females go for larger blood meals to stay in good health and lay eggs. “More blood meals equals more chances to pick up and transmit a pathogen,” McDermott says, adding that climate change could reduce the number of available plants to feed on. And while most mosquitoes are “generalist sugar-feeders,” meaning that they will likely find alternatives, losing their favorite plants can make them hungrier for blood.
Similar to the effect of losing plants, mosquitoes may turn to people if they lose their favorite animal species. For example, some studies found that Culex pipiens mosquitoes, which transmit the West Nile virus, feed primarily on birds in summer. But that changes in the fall, at least in some places: because there are fewer birds around, C. pipiens switch to mammals, including humans. And if some disease-carrying insect species proliferate or increase their ranges, that raises the chances of human infection, McDermott says. “A larger concern is that climate change could increase vector population sizes, making it more likely that people or animals would be bitten by an infected insect.”
Science Can Help Bring Back the Buzz
To help friendly insects thrive and keep the foes in check, scientists need better ways of trapping, counting, and monitoring insects. It’s not an easy job, but artificial intelligence and molecular methods can help. Ware’s lab uses various environmental DNA methods to monitor freshwater habitats. Molecular technologies hold much promise. The so-called DNA barcodes, in which species are identified using a short string of their genes, can now be used to identify birds, bees, moths and other creatures, and should be used on a larger scale, says Wagner, the University of Connecticut professor. “One day, something akin to Star Trek’s tricorder will soon be on sale down at the local science store.”
Scientists are also deploying artificial intelligence, or AI, to identify insects in agricultural systems and northern latitudes where there are fewer bugs, Wagner says. For instance, some automated traps already use the wingbeat frequencies of mosquitoes to distinguish the harmless ones from the disease-carriers. But new technology and software are needed to further expand detection based on vision, sound, and odors.
“Because of their ubiquity, enormity of numbers, and seemingly boundless diversity, we desperately need to develop molecular and AI technologies that will allow us to automate sampling and identification,” says Wagner. “That would accelerate our ability to track insect populations, alert us to the presence of new disease vectors, exotic pest introductions, and unexpected declines.”
Your surgery could harm you and the planet. Here's what some doctors are doing about it.
This is part 1 of a three-part series on a new generation of doctors leading the charge to make the health care industry more sustainable, for the benefit of their patients and the planet. Read part 2 here and part 3 here.
Susanne Koch, an anesthesiologist and neurologist, reached a turning point when she was, almost literally, up to her neck in water. The basement of her house had flooded in the summer of 2018, when Berlin was pummeled by unusually strong rains. After she drained the house, “I wanted to dig into facts, to understand how exactly these extreme weather events are related to climate change,” she says.
Studying the scientific literature, she realized how urgent the climate crisis is, but the biggest shock was learning that her own profession contributed substantially to the problem: Inhalation gases used during medical procedures are among the most damaging greenhouse gases. Some inhalation gases are 3,000 times more damaging to the climate than CO2, Koch discovered. “Spending seven hours in the surgery room is the equivalent of driving a car for four days nonstop,” she says. Her job of helping people at Europe’s largest university hospital, the Charité in Berlin, was inadvertently damaging both people and the planet.
“Nobody had ever even mentioned a word about that during my training,” Koch says.
On the whole, the medical sector is responsible for a disproportionately large percentage of greenhouse gas emissions, with the U.S. as the biggest culprit. According to a key paper published in 2020 in Health Affairs, the health industry “is among the most carbon-intensive service sectors in the industrialized world,” accounting for between 4.4 percent and 4.6 percent of greenhouse gas emissions. “It’s not just anesthesia but health care that has a problem,” says Jodi Sherman, anesthesiology professor and Medical Director of the Program on Healthcare Environmental Sustainability at Yale University as well as co-director of the Lancet Planetary Health Commission on Sustainable Healthcare. In the U.S., health care accounts for about 8.5 percent of domestic greenhouse gas emissions. Those emissions rose 6 percent from 2010 to 2018, to nearly 1,700 kilograms per person, more than in any other nation.
Of course, patients worry primarily about safety, not sustainability. Yet, Koch emphasizes that “as doctors, we have the responsibility to do no harm, and this includes making sure that we use resources as sustainably as possible.” Studies show that 2018 greenhouse gas and toxic air pollutant emissions resulted in the loss of 388,000 disability-adjusted life years in the U.S. alone. “Disease burden from health care pollution is of the same order of magnitude as deaths from preventable medical errors, and should be taken just as seriously,” Sherman cautions.
When Koch, the anesthesiologist, started discussing sustainable options with colleagues, the topic was immediately met with plenty of interest. Her experience is consistent with the latest representative poll by the nonprofit Foundation Health in Germany: nine out of ten doctors wanted to urgently find sustainable solutions for medical services but lacked the know-how and resources. For teaching purposes, Sherman and her team have developed the Yale Gassing Greener app, which allows anesthesiologists to compare how much pollution they can avoid by choosing different anesthesia methods. Sherman has also published professional guidelines intended to help her colleagues better understand how various methods affect carbon emissions.
A solution espoused by both Sherman and Koch is comparatively simple: They stopped using desflurane, which is by far the most damaging of all inhalation gases to the climate. Its greenhouse effect is 2,590 times stronger than carbon dioxide. The Yale New Haven Hospital already stopped using desflurane in 2013, becoming the first known healthcare organization to eliminate a drug based on environmental grounds. Sherman points out that this resulted in saving more than $1.2 million in costs and 1,600 tons of CO2 equivalents, about the same as the exhaust from 360 passenger vehicles per year.
At the Charité, Koch says that switching to other anesthesiology choices, such as propofol, has eliminated 90 percent of the greenhouse gas emissions in the anesthesiology department since 2016. Young anesthesiologists are still taught to use desflurane as the standard because desflurane is absorbed less into patients’ bodies, and they wake up faster. However, Koch, who has worked as an anesthesiologist since 2006, says that with a little experience, you can learn to stop giving propofol so that it is timed just as well with a person’s wake-up process. In addition, “patients are less likely to feel nauseous after being given propofol,” Koch says. Intravenous drugs might require more skill, she adds, "but there is nothing unique to the drug desflurane that cannot be accomplished with other medications.”
Desflurane isn’t the only gas to be concerned about. Nitrous oxide is the second most damaging because it’s extremely long-lived in the environment, and it depletes the ozone layer. Climate-conscious anesthesiologists are phasing out this gas, too, or have implemented measures to decrease leaks.
Internationally, under the Kyoto Protocol, which entered into force in 2005, 192 governments agreed to reduce greenhouse gases including halogenated hydrocarbons, such as desflurane, and nitrous oxide because of their immense climate-warming potential, and in 2016, they pledged to phase them down by 2035. However, the use of inhalation anesthetics continues to increase worldwide, not least because more people have access to health care in developing countries, and because people in industrialized countries live longer and therefore need more surgeries. Significant traces of inhalation gases have been found in Antarctica and the Himalayas, far from the vast majority of surgery rooms.
Certain companies are now pushing new technology to capture inhalation gases before they are released into the atmosphere, but both Sherman and Koch believe marketing claims of 99 percent efficiency amount to greenwashing. After investigating the technology first-hand and visiting the company that is producing such filters in Germany, Koch concluded that such technology only reduces emissions by 25 percent. And Sherman believes such initiatives are akin to the fallacy of recycling plastic. In addition to questioning their efficiency, Sherman fears such technology “gives the illusion there is a magical solution that means I don’t need to change my behavior, reduce my waste and choose less harmful options.”
Financial interests are at play, too. “Desflurane is the most expensive inhalation gas, and some think the most expensive must be the best,” Koch says. Both Koch and Sherman lament that efforts to increase sustainability in the medical sector are entirely voluntary in their countries, led by a few dedicated individual professionals, while industry-wide standards and transparency are needed, a notion expressed in the American Hospital Association’s Sustainability Roadmap.
Susanne Koch, an anesthesiologist in Berlin, wants her colleagues to stop using a gas called desflurane, which is by far the most damaging of all inhalation gases to the climate.
Other countries have done more. The European Union recommends reducing inhalation gases and has even contemplated a ban on desflurane, except in medical emergencies. In 2008, the U.K.'s National Health Service (NHS) created a Sustainable Development Unit, which measures CO2 emissions in the U.K. health sector. The NHS is the first national health service to pledge to reach net-zero carbon by 2040. The carbon footprint of the NHS fell by 26 percent from 1990 to 2019, mostly due to reduced use of certain inhalers and the switch to renewable energy for heat and power. “The evidence that the climate emergency is a health emergency is overwhelming,” said Nick Watts, the NHS Chief Sustainability Officer, in a press release, “with health professionals already needing to manage its symptoms.”
Sherman is a leading voice in demanding action in the U.S. To her, comprehensive solutions start with the mandatory, transparent measurement of emissions in the health sector to tackle the biggest sources of pollution. The Biden administration highlighted its efforts to reduce these kinds of emissions during the United Nations Climate Conference (COP27) in November 2022, where U.S. delegates announced that more than 100 health care organizations had signed the voluntary Health Sector Climate Pledge, aiming to cut emissions by 50 percent in the next eight years. But Sherman is convinced that such commitments fall short. “Voluntary measures are insufficient,” she testified before Congress. “The vast majority of U.S. health care organizations remain uncommitted to timely action. Those that are committed lack policies and knowledge to support necessary changes; even worse, existing policies drive inappropriate consumption of resources and pollution.”
Both Sherman and Koch look at the larger picture. “Health care organizations have an obligation to their communities to protect public health,” Sherman says. “We must lead by example. That includes setting ambitious, science-based carbon reduction targets to achieve net zero emissions before 2050. We must quantify current emissions and their sources, particularly throughout the health care supply chains.”