With Lab-Grown Chicken Nuggets, Dumplings, and Burgers, Futuristic Foods Aim to Seem Familiar
Sandhya Sriram is at the forefront of the expanding lab-grown meat industry in more ways than one.
She's the CEO and co-founder of one of the fewer than 30 companies that are even in this game in the first place. Her Singapore-based company, Shiok Meats, is the only one to pop up in Southeast Asia. And it's the only company in the world attempting to grow crustaceans in a lab, starting with shrimp. This spring, the company debuted a prototype of its shrimp and completed a seed funding round of $4.6 million.
Yet despite all of these wins, Sriram's own mother won't try the company's shrimp. She's a staunch, lifelong vegetarian, adhering to a strict definition of what that means.
"[Lab-grown meat] is kind of a brave new world for a lot of people, and food isn't something people like being brave about. It's really a rather hard-wired thing," says Kate Krueger, the research director at New Harvest, a non-profit accelerator for cellular agriculture (the umbrella field that studies how to grow animal products in the lab, including meat, dairy, and eggs).
It's so hard-wired, in fact, that trends in food inform our species' origin story. In 2017, a group of paleoanthropologists caused an upset when they unearthed fossils in present-day Morocco showing that our earliest human ancestors lived much farther north and 100,000 years earlier than expected -- the remains date back 300,000 years. But the excavation not only included bones and tools, it also painted a clear picture of the prevailing menu at the time: The oldest humans were apparently chomping on tons of gazelle, as well as wildebeest and zebra when they could find them, plus the occasional seasonal ostrich egg.
These were people with a diet shaped by available resources, but also by the ability to cook in the first place. In his book Catching Fire: How Cooking Made Us Human, Harvard primatologist Richard Wrangham writes that the very thing that allowed for the evolution of Homo sapiens was the ability to transform raw ingredients into edible nutrients through cooking.
Today, our behavior and feelings around food are the product of local climate, crops, animal populations, and tools, but also religion, tradition, and superstition. So what happens when you add science to the mix? Turns out, we still trend toward the familiar. The innovations in lab-grown meat that are picking up the most steam are foods like burgers, not meat chips, and salmon, not salmon-cod-tilapia hybrids. It's not for lack of imagination; it's because the industry's practitioners know that a lifetime of food memories is a hard thing to contend with. So far, the nascent lab-grown meat industry is not so much disrupting as being shaped by the oldest culture we have.
Not a single piece of lab-grown meat is commercially available to consumers yet, and already so much ink has been spilled debating if it's really meat, if it's kosher, if it's vegetarian, if it's ethical, if it's sustainable. But whether or not the industry succeeds and sticks around is almost moot -- watching these conversations and innovations unfold serves as a mirror reflecting back who we are, what concerns us, and what we aspire to.
The More Things Change, the More They Stay the Same
The building blocks for making lab-grown meat right now are remarkably similar, no matter what type of animal protein a company is aiming to produce.
First, a small biopsy, about the size of a sesame seed, is taken from a single animal. Then, the muscle cells are isolated and added to a nutrient-dense culture medium in a bioreactor -- the same kind of vessel used to make beer -- where the cells can multiply, grow, and form muscle tissue. This tissue can then be mixed with additives like nutrients, seasonings, binders, and sometimes colors to form a food product. Whether a company is attempting to make chicken, fish, beef, shrimp, or any other animal protein in a lab, the basic steps remain similar. Cells from various animals do behave differently, though, and each company has its own proprietary techniques and tools. Some, for example, use fetal calf serum in their culture medium, while others, aiming for a more vegan approach, eschew it.
According to Mark Post, who made the first lab-grown hamburger at Maastricht University in the Netherlands in 2013, the cells of just one cow can give rise to 175 million four-ounce burgers. By today's available burger-making methods, you'd need to slaughter 440,000 cows for the same result. The projected difference in the purely material efficiency between the two systems is staggering. The environmental impact is hard to predict, though. Some companies claim that their lab-grown meat requires 99 percent less land and 96 percent less water than traditional farming methods -- and that rearing fewer cows, specifically, would reduce methane emissions -- but the energy cost of running a lab-grown-meat production facility at an industrial scale, especially as compared to small-scale, pasture-raised farming, could be problematic. It's difficult to truly measure any of this in a burgeoning industry.
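Taking Post's figures at face value (both constants below come straight from the paragraph above, not from independent data), the implied arithmetic can be sanity-checked in a few lines of Python:

```python
# Back-of-envelope check of the burger figures above; both constants
# are the article's numbers, not measurements.
LAB_GROWN_BURGERS_PER_COW = 175_000_000  # Post's estimate: burgers grown from one cow's cells
CONVENTIONAL_COWS_NEEDED = 440_000       # cows slaughtered for the same number of burgers

# Implied yield of a single conventionally raised cow
burgers_per_conventional_cow = LAB_GROWN_BURGERS_PER_COW / CONVENTIONAL_COWS_NEEDED
print(round(burgers_per_conventional_cow))  # roughly 400 burgers per cow
```

At roughly 400 four-ounce burgers per slaughtered cow, or about 100 pounds of ground beef, the figure is at least plausible, which is the only kind of check a sketch like this can offer.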
At this point, growing something like an intact shrimp tail or a marbled steak in a lab is still a Holy Grail. It would require reproducing the complex musculo-skeletal and vascular structure of meat, not just the cellular basis, and no one's successfully done it yet. Until then, many companies working on lab-grown meat are perfecting mince. Each new company's demo of a prototype food feels distinctly regional, though: At the Disruption in Food and Sustainability Summit in March, Shiok (which is pronounced "shook," and is Singaporean slang for "very tasty and delicious") first shared a prototype of its shrimp as an ingredient in siu-mai, a dumpling of Chinese origin and a fixture at dim sum. JUST, a company based in the U.S., produced a demo chicken nugget.
As Jean Anthelme Brillat-Savarin, the 18th-century French gastronome who founded the gastronomic essay, famously said, "Show me what you eat, and I'll tell you who you are."
For many of these companies, the baseline animal protein they are trying to innovate also feels tied to place and culture: When meat comes from a bioreactor, not a farm, the world's largest exporter of seafood could be a landlocked region, and beef could be "reared" in a bayou, yet the handful of lab-grown fish companies, like Finless Foods and BlueNalu, hug the American coasts; VOW, based in Australia, started making lab-grown kangaroo meat in August; and of course the world's first lab-grown shrimp is in Singapore.
"In the U.S., shrimps are either seen in shrimp cocktail, shrimp sushi, and so on, but [in Singapore] we have everything from shrimp paste to shrimp oil," Sriram says. "It's used in noodles and rice, as flavoring in cup noodles, and in biscuits and crackers as well. It's seen in every form, shape, and size. It just made sense for us to go after a protein that was widely used."
It's tempting to assume that innovating on pillars of cultural significance might be easier if the focus were on a whole new kind of food to begin with, not your popular dim sum items or fast food offerings. But it's proving to be quite the opposite.
"That could have been one direction where [researchers] just said, 'Look, it's really hard to reproduce raw ground beef. Why don't we just make something completely new, like meat chips?'" says Mike Lee, co-founder and co-CEO of Alpha Food Labs, which works on food innovation more broadly. "While that strategy's interesting, I think we've got so many new things to explain to people that I don't know if you want to also explain this new format of food that you've never, ever seen before."
We've seen this same cautious approach to change before in other ways that relate to cooking. Perhaps the most obvious example is the kitchen range. As Bee Wilson writes in her book Consider the Fork: A History of How We Cook and Eat, in the 1880s, convincing ardent coal-range users to switch to newfangled gas was a hard sell. To win them over, inventor William Sugg designed a range that used gas, but aesthetically looked like the coal ones already in fashion at the time -- and which in some visual ways harkened even further back to the days of open-hearth cooking. Over time, gas range designs moved further away from those of the past, but the initial jump was only made possible through familiarity. There's a cleverness to meeting people where they are.
"New gadgets feel safest when they remind us of other objects that we already know," writes Wilson. "It is far harder to accept a technology that is entirely new."
Maybe someday we won't want anything other than meat chips, but not today.
Measuring Success
A 2018 Gallup poll shows that in the U.S., rates of true vegetarianism and veganism have been stagnant for as long as they've been measured. When the poll began in 1999, six percent of Americans were vegetarian, a figure that held steady until 2012, when it dropped one point. As of 2018, it remained at five percent.
In 2012, when Gallup first measured the percentage of vegans, the rate was two percent. By 2018 it had gone up just one point, to three percent. Increasing awareness of animal welfare, health, and environmental concerns doesn't seem to be incentive enough to convince Americans, en masse, to completely slam the door on a food culture characterized in many ways by its emphasis on traditional meat consumption.
Wilson writes that "experimenting with new foods has always been a dangerous business. In the wild, trying out some tempting new berries might lead to death. A lingering sense of this danger may make us risk-averse in the kitchen."
That might be one psychologically deep-seated reason that Americans are so resistant to ditching meat altogether. But a middle ground is emerging with the rise of flexitarianism, which aims to reduce reliance on traditional animal products. "Americans are eager to include alternatives to animal products in their diets, but are not willing to give up animal products completely," the same 2018 Gallup poll reported. This may represent the best opportunity for lab-grown meat to wedge itself into the culture.
Quantitatively measuring a population's willingness to try a lab-grown version of its favorite protein is proving difficult, however, because to the average consumer it's still science fiction. Gauging popular opinion of something that doesn't really exist yet is a dubious pastime.
In 2015, University of Wisconsin School of Public Health researchers Linnea Laestadius and Mark Caldwell conducted a study using online comments on articles about lab-grown meat to suss out public response to the food. The results showed a mostly negative attitude, but that was only two years into a field that is six years old today; public opinion may already have shifted.
Shiok Meats' Sriram and her co-founder Ka Yi Ling have used online surveys to get a sense of the landscape, but sometimes they also take a more direct approach. Every time they give a public talk about their company and their shrimp, they poll the audience before and after, asking, "How many of you are willing to try, and pay, to eat lab-grown meat?"
They consistently find that the percentage of people willing to try goes up from 50 to 90 percent after hearing their talk, which includes information about the downsides of traditional shrimp farming (for one thing, many shrimp are raised in sewage, and peeled and deveined by slaves) and a bit about how lab-grown animal protein is currently made. I saw this play out myself when Ling spoke at a New Harvest conference in Cambridge, Massachusetts in July.
"A lot of consumers get over the ick factor when you tell them that most of the food that you're eating right now has entered the lab at some point," Sriram says. "We're not going to grow our meat in the lab always. It's in the lab right now, because we're in R&D. Once we go into manufacturing ... it's going to be a food manufacturing facility, where a lot of food comes from."
The downside of both the University of Wisconsin's and Shiok Meats' approaches to capturing public opinion is that each looks at a self-selecting group: Online commenters are often fueled by a need to complain, and anyone attending a talk by the co-founders of a lab-grown meat company likely already has some level of open-mindedness.
So Sriram says that she and Ling are also using another method to assess the landscape, one somewhere in the middle: watching public responses to the closest thing to lab-grown meat on the market, the Impossible Burger. As a 100 percent plant-based burger, it's not quite the same, but this bleedable, searable patty is still very much the product of science and laboratory work. Its remarkable similarity to beef comes courtesy of yeast that have been genetically engineered with DNA from soy plant roots; as the yeast multiply, they produce soy leghemoglobin, a heme-containing protein. This plant-derived heme can look and act like the heme found in animal muscle.
So far, the sciencey underpinnings of the burger don't seem to be turning people off. In just four years, it's already found its place within other American food icons. It's readily available everywhere from nationwide Burger Kings to Boston's Warren Tavern, which has been in operation since 1780, is one of the oldest pubs in America, and is even named after the man who sent Paul Revere on his midnight ride. Some people have already grown so attached to the Impossible Burger that they will actually walk out of a restaurant that's out of stock. Demand for the burger is outpacing production.
"Even though [Impossible] doesn't consider their product cellular agriculture, it's part of a spectrum of innovation," Krueger says. "There are novel proteins that you're not going to find in your average food, and there's some cool tech there. So to me, that does show a lot of willingness on people's part to think about trying something new."
The message for those working on animal-based lab-grown meat is clear: People will accept innovation on their favorite food if it tastes good enough and evokes the same emotional connection as the real deal.
"How people talk about lab-grown meat now, it's still a conversation about science, not about culture and emotion," Lee says. But he's confident that the conversation will start to shift in that direction if the companies doing this work can nail the flavor memory, above all.
And then, proving how much power flavor holds over us, we quickly derail into a conversation about Doritos, which he calls "maniacally delicious." The chips carry no nutritional value whatsoever and are purely a product of food engineering and manufacturing — just watch how hard it is for Bon Appétit associate food editor Claire Saffitz to recreate them in the magazine's test kitchen — yet devotees remain unfazed and crunch on.
"It's funny because it shows you that people don't ask questions about how [some foods] are made, so why are they asking so many questions about how lab-grown meat is made?" Lee asks.
For all the hype around the Impossible Burger, there are still controversies and hand-wringing around lab-grown meat. Some people are grossed out by the idea, some are confused, and if you're the U.S. Cattlemen's Association (USCA), you're territorial. Last year, the group sent a petition to the USDA to "exclude products not derived directly from animals raised and slaughtered from the definition of 'beef' and 'meat.'"
"I have this working hypothesis that if you look at the nation in 50-year spurts, we revolve back and forth between artisanal, all-natural food that's unadulterated and pure, and food that's empowered by science," Lee says. "Maybe we've only had one lap around the track on that, but I think we are probably three or four big food safety scares away from everyone, especially younger generations, embracing lab-grown meat as like, 'Science is good; nature is dirty, and can kill you.'"
Food culture goes beyond just the ingredients we know and love — it's also about how we interact with them, produce them, and expect them to taste and feel when we bite down. We accept a margin of difference among a fast food burger, a backyard burger from the grill, and a gourmet burger. Maybe someday we'll accept the difference between a burger created by killing a cow and a burger created by biopsying one.
Looking to the Future
Every time we engage with food, "we are enacting a ritual that binds us to the place we live and to those in our family, both living and dead," Wilson writes in Consider the Fork. "Such things are not easily shrugged off. Every time a new cooking technology has been introduced, however useful … it has been greeted in some quarters with hostility and protestations that the old ways were better and safer."
This is why it might be hard for a vegetarian mother to try her daughter's lab-grown shrimp, no matter how ethically it was produced or how awe-inspiring the invention is. Yet food cultures can and do change. "They're not these static things," says Benjamin Wurgaft, a historian whose book Meat Planet: Artificial Flesh and the Future of Food comes out this month. "The real tension seems to be between slow change and fast change."
In fact, the very definition of the word "meat" has never exclusively meant what the USCA wants it to mean. Before the 12th century, when it first appeared in Old English as "mete," it wasn't very specific at all and could be used to describe anything from "nourishment," to "food item," to "fodder," to "sustenance." By the 13th century it had been narrowed down to mean "flesh of warm-blooded animals killed and used as food." And yet the British mincemeat pie lives on as a sweet Christmas treat full of -- to the surprise of many non-Brits -- spiced, dried fruit. Since 1901, we've also used this word with ease as a general term for anything that's substantive -- as in, "the meat of the matter." There is room for yet more definitions to pile on.
"The conversation [about lab-grown meat] has changed remarkably in the last six years," Wurgaft says. "It has become a conversation about whether or not specific companies will bring a product to market, and that's a really different conversation than asking, 'Should we produce meat in the lab?'"
As part of the field research for his book, Wurgaft visited the Rijksmuseum Boerhaave, a Dutch museum that specializes in the history of science and medicine. It was 2015, and he was there to see an exhibit on the future of food. Just two years earlier, Mark Post had made that first lab-grown hamburger about a two-and-a-half hour drive south of the museum. When Wurgaft arrived, he found the novel invention, which Post had donated to the museum, already preserved and served up on a dinner plate, the whole outfit protected by plexiglass.
"They put this in the exhibit as if it were already part of the historical records, which to a historian looked really weird," Wurgaft says. "It looked like somebody taking the most recent supercomputer and putting it in a museum exhibit saying, 'This is the supercomputer that changed everything,' as if you were already 100 years in the future, looking back."
It seemed to symbolize an effort to codify a lab-grown hamburger as a matter of Dutch pride, perhaps someday occupying a place in people's hearts right next to the stroopwafel.
Lee likes to imagine that part of the legacy of lab-grown meat, if it succeeds, will be to inspire entirely new fads in cooking -- a step beyond ones like the crab-filled avocado of the 1960s or the pesto of the 1980s in the U.S.
"[Lab-grown meat] is inherently going to be a different quality than anything we've done with an animal," he says. "Look at every cut [of meat] on the sphere today -- each requires a slightly different cooking method to optimize the flavor of that cut. Who's to say that we couldn't get a whole school of how to cook with lab-grown meat?"
At this point, most of us have no way of trying lab-grown meat. It remains exclusively available through sometimes gimmicky demos reserved for investors and the media. But Wurgaft says the stories we tell about this innovation, the articles we write, the films we make, and yes, even the museum exhibits we curate, all hold as much cultural significance as the product itself might someday.
Could a tiny fern change the world — again?
More than 50 million years ago, the Arctic Ocean was the opposite of a frigid wasteland. It was a gigantic lake surrounded by lush greenery brimming with flora and fauna, thanks to the humidity and warm temperatures. Giant tortoises, alligators, rhinoceros-like animals, primates, and tapirs roamed through nearby forests in the Arctic.
This greenhouse utopia abruptly changed in the early Eocene period, when the Arctic Ocean became landlocked. A channel that connected the Arctic to the greater oceans got blocked. This provided a tiny fern called Azolla the perfect opportunity to colonize the layer of freshwater that formed on the surface of the Arctic Ocean. The floating plants rapidly covered the water body in thick layers that resembled green blankets.
Gradually, Azolla colonies migrated to every continent with the help of repeated flooding events. Over the course of around a million years, as billions of Azolla plants perished and sank, they drew down more than 80 percent of atmospheric carbon dioxide, burying it at the bottom of the Arctic Ocean.
This “Arctic Azolla event” had devastating impacts on marine life. To date, scientists are trying to figure out how it ended. But they documented that the extraordinary event cooled down the Arctic by at least 40 degrees Fahrenheit — effectively freezing the poles and triggering several cycles of ice ages. “This carbon dioxide sequestration changed the climate from greenhouse to white house,” says Jonathan Bujak, a paleontologist who has researched the Arctic through expeditions since 1973.
Some farmers and scientists, such as Bujak, are looking to this ancient fern, which manipulated the Earth’s climate around 49 million years ago with its insatiable appetite for carbon dioxide, as a potential solution to our modern-day agricultural and environmental challenges. “There is no other plant like Azolla in the world,” says Bujak.
Decoding the Azolla plant
Azolla lives in symbiosis with a cyanobacterium called Anabaena that made the plant’s leaf cavities its permanent home at an early stage in Earth's history. This close relationship with Anabaena enables Azolla to accomplish a feat that is impossible for most plants: directly splitting dinitrogen molecules that make up 78 percent of the Earth’s atmosphere.
A dinitrogen molecule consists of two nitrogen atoms tightly locked together in one of the strongest bonds in nature. The semi-aquatic fern's ability to split nitrogen, called nitrogen fixation, made it a highly revered plant in East Asia; rice farmers in Vietnam and China have used Azolla as a biofertilizer since the 11th century.
For decades, scientists have attempted to decode Azolla’s evolution. Cell biologist Francisco Carrapico, who worked at the University of Lisbon, has analyzed this distinctive symbiosis since the 1980s. To his amazement, in 1991, he found that bacteria are the third partner of the Azolla-Anabaena symbiosis.
“Azolla and Anabaena cannot survive without each other. They have co-evolved for 80 million years, continuously exchanging their genetic material with each other,” says Bujak, co-author of The Azolla Story, which he published with his daughter, Alexandra Bujak, an environmental scientist. Three different levels of nitrogen fixation take place within the plant, as Anabaena draws down as much as 2,200 pounds of atmospheric nitrogen per acre annually.
“Using Azolla to mitigate climate change might sound a bit too simple. But that is not the case,” Bujak says. “At a microscopic level, extremely complicated biochemical reactions are constantly occurring inside the plant’s cells that machines or technology cannot replicate yet.”
In 2018, researchers based in the U.S. managed to sequence Azolla’s complete genome — which is four times larger than the human genome — through a crowdfunded study, further increasing our understanding of this plant. “Azolla is a superorganism that works efficiently as a natural biotechnology system that makes it capable of doubling in size within three to five days,” says Carrapico.
Making Azolla mainstream again in agriculture
While scientific groups in the Global North have been working towards unraveling the tiny fern’s inner workings, communities in the Global South are busy devising creative ways to return to their traditional agricultural roots by tapping into Azolla’s full potential.
Pham Gia Minh, an entrepreneur living in Hanoi, Vietnam, is one such citizen scientist who believes that Azolla could be a climate savior. After more than two decades in finance and business development, Minh is now focused on continuing the legacy of his grandfather, an agricultural scientist who conducted Azolla research until the 1950s. "Azolla is our family's heritage," says Minh.
After the advent of chemical fertilizers in the early 1900s, farmers across Asia abandoned Azolla to save on time and labor costs. But rice farmers in Vietnam went back to cultivating it during the Vietnam War, after trade embargoes made chemical fertilizers far too expensive and inaccessible.
By 1973, Azolla cultivation in rice paddy fields was established on half a million hectares in Vietnam. By injecting nitrogen into the soil, Azolla improves soil fertility and also increases rice yields by at least 27 percent compared to urea. The plants can also reduce a farm’s methane emissions by 40 percent.
“Unfortunately, after 1985, chemical fertilizers became cheap and widely available in Vietnam again. So, farmers stopped growing Azolla because of the time-consuming and labor-intensive cultivation process,” says Minh.
Minh has invested in a rural farm where he is proving that modern technology can make the process less burdensome. He uses a pump and drying equipment for harvesting Azolla in a small pond, and he deploys a drone for spraying insecticides and fertilizers on the pond at regular intervals.
Because Azolla supplies nitrogen but not phosphorus, farmers in developing countries still find it challenging to let go of chemical fertilizers completely. Still, Minh and Bujak say that farmers can replace chemical fertilizers with Azolla after mixing it with dung.
In the last few years, the fern’s popularity has been growing in other developing countries like India, Palestine, Indonesia, the Philippines, and Bangladesh, where local governments and citizens are trying to re-introduce Azolla integrated farming by growing the ferns in small ponds.
Replacing soybeans with Azolla
In Ecuador, Mariano Montano Armijos, a former chemical engineer, has worked with Azolla for more than 20 years. Since 2008, he has shared resources and information for growing Azolla with 3,000 farmers in Ecuador. The farmers use the harvested plants as a bio-fertilizer and feed for livestock.
“The farmers do not use urea anymore,” says Armijos. “This goes against the conventional agricultural practices of using huge amounts of synthetic nitrogen on a hectare of rice or corn fields.”
He insists that Azolla’s greatest strength is that it is a rich source of proteins, making it highly nutritious for human beings as well. After growing Azolla on a small scale in ponds, Armijos and his business partner, Ivan Noboa, are now building a facility for cultivating the ferns as a superfood on an industrial scale.
According to Armijos, one hectare of Azolla in Ecuador can produce seven tons of protein, whereas a hectare of soybeans produces only one ton. "If we switch to Azolla, it could help in reducing deforestation in the Amazon. But taming Azolla and turning it into a crop is not easy," he adds.
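Armijos's numbers imply a sevenfold land advantage for Azolla. A minimal sketch of that arithmetic, using only the per-hectare figures quoted above (the 70-ton target is a hypothetical chosen purely for illustration):

```python
# Protein-yield comparison using Armijos's per-hectare estimates (illustrative only).
AZOLLA_PROTEIN_TONS_PER_HA = 7.0  # Armijos's estimate for Ecuador
SOY_PROTEIN_TONS_PER_HA = 1.0

# Hectares of soybeans needed to match the protein from one hectare of Azolla
soy_hectares_per_azolla_hectare = AZOLLA_PROTEIN_TONS_PER_HA / SOY_PROTEIN_TONS_PER_HA
print(soy_hectares_per_azolla_hectare)  # 7.0

# A hypothetical 70-ton protein target, each way
target_tons = 70.0
print(target_tons / AZOLLA_PROTEIN_TONS_PER_HA)  # 10.0 hectares of Azolla
print(target_tons / SOY_PROTEIN_TONS_PER_HA)     # 70.0 hectares of soybeans
```

If the estimates hold, every hectare converted to Azolla would free six hectares that soybeans would otherwise occupy, which is the basis of the deforestation argument above.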
Henriette Schluepmann, a molecular plant biologist at Utrecht University in the Netherlands, believes that Azolla could replace soybeans and chemical fertilizers someday, but only if researchers can achieve yield stability in controlled environments over long durations.
“In a country like the Netherlands that is surrounded by water with high levels of phosphates, it makes sense to grow Azolla as a substitute for soybeans,” says Schluepmann. “For that to happen, we need massive investments to understand these ferns’ reproductive system and how to replicate that within aquaculture systems on a large scale.”
Pollution control and carbon sequestration
Currently, Schluepmann and her team are growing Azolla in a plant nursery or closed system before transferring the ferns to flooded fields. So far, they have been able to continuously grow Azolla without any major setbacks for a total of 155 days. Taking care of these plants’ well-being is an uphill struggle.
Unlike most plants, Azolla does not grow from seeds because it contains female and male spores that tend to split instead of reproducing. To add to that, growing Azolla on a large scale in controlled environments makes the floating plants extremely vulnerable to insect infestations and fungi attacks.
"Even though it is easier to grow Azolla on a non-industrial scale, the long and tedious cultivation process is often in conflict with human rights," she says. Farms in developing countries such as Indonesia sometimes use child labor to cultivate Azolla.
Schluepmann discovered an insecticide that can control Azolla blooms. But in the wild, this aquatic fern grows relentlessly in polluted rivers and lakes and has gained a notorious reputation as an invasive weed. Countries like Portugal and the UK banned Azolla after severe blooms in rivers snuffed out local aquatic life.
“Azolla has been misunderstood as a nuisance. But in reality, it is highly beneficial for purifying water,” says Bujak. Through a process called phytoremediation, Azolla locks up pollutants like excess nitrogen and phosphorus and stops toxic algal blooms from occurring in rivers and lakes.
A 2018 study found that Azolla can decrease nitrogen and phosphorus levels in wastewater by 33 percent and 40.5 percent, respectively. And while harmful algal blooms produce toxins and release noxious gases, Azolla keeps in check any toxins that its resident cyanobacterium, Anabaena, might produce.
“In our labs, we observed that Azolla works effectively in treating wastewater,” explains Schluepmann. “Once we gain a better understanding of Azolla aquaculture, we can also use it for carbon capture and storage. But in Europe, we would have to use the entire Baltic Sea to make a difference.”
Planting massive amounts of these prehistoric ferns in any of the Northern great water bodies is out of the question. After all, history has taught us that the uncontrolled growth of Azolla deprives aquatic ecosystems of sunlight and chokes the life underneath. But researchers like Schluepmann and Bujak are optimistic that even on a much smaller scale, Azolla can put up a fight against human-driven climate change.
Traditional carbon capture and storage methods are not only expensive but also inefficient and could increase air pollution. According to Bujak’s estimates, Azolla can sequester 10 metric tonnes of carbon dioxide per hectare annually, which is 10 times the average capacity of grasslands.
“Anyone can set up their own DIY carbon capture and storage system by growing Azolla in shallow water. After harvesting and compressing the plants, carbon dioxide gets stored permanently,” says Bujak.
He envisions scaling up this process by setting up “Azolla hubs” in mega-cities, where the plants are grown in shallow trays stacked on top of each other in vertical farming systems built into multi-story buildings. The compressed Azolla can then be converted into biofuel, fertilizer, livestock feed, or biochar for sequestering carbon dioxide.
“Using Azolla to mitigate climate change might sound a bit too simple. But that is not the case,” Bujak adds. “At a microscopic level, extremely complicated biochemical reactions are constantly occurring inside the plant’s cells that machines or technology cannot replicate yet.”
Through Azolla, scientists hope to work with nature by tapping into four billion years of evolution.
A new virus has emerged and stoked fears of another pandemic: monkeypox. Since May 2022, it has been detected in 29 U.S. states, the District of Columbia, and Puerto Rico among international travelers and their close contacts. On a worldwide scale, as of June 30, there have been 5,323 cases in 52 countries.
The good news: An existing vaccine can go a long way toward preventing a catastrophic outbreak. Because monkeypox is a close relative of smallpox, the same vaccine can be used—and it is about 85 percent effective against the virus, according to the World Health Organization (WHO).
Also on the plus side, monkeypox is less contagious than smallpox, causes milder illness, and, compared with COVID-19, produces more telltale signs. Scientists think a “ring” vaccination strategy, deployed when these signs appear, can help squelch this alarming outbreak.
How it’s transmitted
Monkeypox spreads between people primarily through direct contact with infectious sores, scabs, or bodily fluids. People also can catch it through respiratory secretions during prolonged, face-to-face contact, according to the Centers for Disease Control and Prevention (CDC).
As of June 30, there have been 396 documented monkeypox cases in the U.S., and the CDC has activated its Emergency Operations Center to mobilize additional personnel and resources. The U.S. Department of Health and Human Services is aiming to boost testing capacity and accessibility. No Americans have died from monkeypox during this outbreak, but since February 2020 Africa has documented 12,141 cases and 363 deaths from the disease.
A person infected with monkeypox typically shows symptoms—fever and chills, for instance—while contagious, so it is easier than with COVID-19 to know when to avoid close contact with others.
Advantages of ring vaccination
For this reason, it’s feasible to vaccinate a “ring” of people around the infected individual rather than inoculating large swaths of the population. Ring vaccination proved effective in curbing the smallpox and Ebola outbreaks. As the monkeypox threat continues to loom, scientists view this as the best vaccine approach.
With many infections, “it normally would make sense to everyone to vaccinate more widely,” says Wesley C. Van Voorhis, a professor and director of the Center for Emerging and Re-emerging Infectious Diseases at the University of Washington School of Medicine in Seattle. However, “in this case, ring vaccination may be sufficient to contain the outbreak and also minimize the rare, but potentially serious side effects of the smallpox/monkeypox vaccine.”
There are two licensed smallpox vaccines in the United States: ACAM2000 (live Vaccinia virus) and JYNNEOS (live virus non-replicating). ACAM2000, Van Voorhis says, is the old smallpox vaccine that, in rare instances, could spread diffusely within the body and cause heart problems, as well as severe rash in people with eczema or serious infection in immunocompromised patients.
To avoid these complications, the current recommendation is to use the JYNNEOS vaccine, says Phyllis Kanki, a professor of health sciences in the division of immunology and infectious diseases at the Harvard T.H. Chan School of Public Health. However, according to a report on the CDC’s website, people with immunocompromising conditions could have a higher risk of getting a severe case of monkeypox, despite being vaccinated, and “might be less likely to mount an effective response after any vaccination, including after JYNNEOS.”
In the late 1960s, the ring vaccination strategy became part of the WHO’s mission to globally eradicate smallpox, with the last known natural case described in Somalia in 1977. Ring vaccination can also refer to how a clinical trial is designed, as was the case in 2015, when this approach was used for researching the benefits of an investigational Ebola vaccine in Guinea, Kanki says.
“Since monkeypox spreads by close contact and we have an effective vaccine, vaccinating high-risk individuals and their contacts may be a good strategy to limit transmission,” she says. She adds that privacy is an important ethical principle here, since people with monkeypox would need to disclose their close contacts for those contacts to benefit from ring vaccination.
Rapid identification of cases and contacts—along with their cooperation—is essential for ring vaccination to be effective. Although mass vaccination also may work, the risk of infection to most of the population remains low while supply of the JYNNEOS vaccine is limited, says Stanley Deresinski, a clinical professor of medicine in the Infectious Disease Clinic at Stanford University School of Medicine.
Other strategies for preventing transmission
Ideally, the vaccine should be administered within four days of an exposure, though it can be given up to 14 days afterward. The WHO also advocates more widespread vaccination campaigns in the population segment with the most cases so far: men who engage in sex with other men.
The virus appears to be spreading in sexual networks, which differs from what was seen in previously reported outbreaks of monkeypox (outside of Africa), where risk was associated with travel to central or west Africa or various types of contact with individuals or animals from those locales. There is no evidence of transmission by food, but contaminated articles in the environment such as bedding are potential sources of the virus, Deresinski says.
Severe cases of monkeypox can occur, but “transmission of the virus requires close contact,” he says. “There is no evidence of aerosol transmission, as occurs with SARS-CoV-2, although it must be remembered that the smallpox virus, a close relative of monkeypox, was transmitted by aerosol.”
Deresinski points out that in 2003, monkeypox was introduced into the U.S. via infected small mammals, such as Gambian giant rats, imported from Ghana as pets. They infected prairie dogs, which were also sold as pets, ultimately resulting in 37 confirmed and 10 probable human cases. A CDC investigation identified no cases of human-to-human transmission. Then, in 2021, a traveler flew from Nigeria to Dallas through Atlanta and developed skin lesions several days after arrival. Another CDC investigation identified 223 contacts, although 85 percent were deemed to be at only minimal risk and the remainder at intermediate risk. No new cases were identified.
How worried should we be
But how serious a threat is monkeypox this time around? “Right now, the risk to the general public is very low,” says Scott Roberts, an assistant professor and associate medical director of infection prevention at Yale School of Medicine. “Monkeypox is spread through direct contact with infected skin lesions or through close contact for a prolonged period of time with an infected person. It is much less transmissible than COVID-19.”
The monkeypox incubation period—the time from infection until the onset of symptoms—is typically seven to 14 days but can range from five to 21 days, compared with only three days for the Omicron variant of COVID-19. With such a long incubation, there is a larger window to conduct contact tracing and vaccinate people before symptoms appear, which can prevent infection or lessen the severity.
But symptoms may present atypically or recognition may be delayed. “Ring vaccination works best with 100 percent adherence, and in the absence of a mandate, this is not achievable,” Roberts says.
At the outset of infection, symptoms include fever, chills, and fatigue. Several days later, a rash becomes noticeable, usually beginning on the face and spreading to other parts of the body, he says. The rash starts as flat lesions that become raised and fill with fluid, similar to the manifestations of chickenpox. Once the rash scabs over and falls off, a person is no longer contagious.
“It's an uncomfortable infection,” says Van Voorhis, the University of Washington School of Medicine professor. There may be swollen lymph nodes. Sores and rash are often limited to the genitals and areas around the mouth or rectum, suggesting intimate contact as the source of spread.
Symptoms of monkeypox usually last from two to four weeks. The WHO estimates the fatality rate at 3 to 6 percent. Although the virus is believed to infect various animal species, including rodents and monkeys in west and central Africa, “the animal reservoir for the virus is unknown,” says Kanki, the Harvard T.H. Chan School of Public Health professor.
Too often, viruses originate in parts of the world that are too poor to grapple with them and may lack the resources to invest in vaccines and treatments. “This disease is endemic in central and west Africa, and it has basically been ignored until it jumped to the north and infected Europeans, Americans, and Canadians,” Van Voorhis says. “We have to do a better job in health care and prevention all over the world. This is the kind of thing that comes back to bite us.”