With Lab-Grown Chicken Nuggets, Dumplings, and Burgers, Futuristic Foods Aim to Seem Familiar
Sandhya Sriram is at the forefront of the expanding lab-grown meat industry in more ways than one.
"[Lab-grown meat] is kind of a brave new world for a lot of people, and food isn't something people like being brave about."
She's the CEO and co-founder of one of fewer than 30 companies that are even in this game in the first place. Her Singapore-based company, Shiok Meats, is the only one to pop up in Southeast Asia. And it's the only company in the world attempting to grow crustaceans in a lab, starting with shrimp. This spring, the company debuted a prototype of its shrimp and completed a $4.6 million seed funding round.
Yet despite all of these wins, Sriram's own mother won't try the company's shrimp. She's a staunch, lifelong vegetarian, adhering to a strict definition of what that means.
"[Lab-grown meat] is kind of a brave new world for a lot of people, and food isn't something people like being brave about. It's really a rather hard-wired thing," says Kate Krueger, the research director at New Harvest, a non-profit accelerator for cellular agriculture (the umbrella field that studies how to grow animal products in the lab, including meat, dairy, and eggs).
It's so hard-wired, in fact, that trends in food inform our species' origin story. In 2017, a group of paleoanthropologists caused an upset when they unearthed fossils in present-day Morocco showing that our earliest human ancestors lived much farther north, and 100,000 years earlier, than expected -- the remains date back 300,000 years. But the excavation turned up more than bones and tools; it also painted a clear picture of the prevailing menu of the time: The oldest humans were apparently chomping on tons of gazelle, as well as wildebeest and zebra when they could find them, plus the occasional seasonal ostrich egg.
These were people with a diet shaped by available resources, but also by the ability to cook in the first place. In his book Catching Fire: How Cooking Made Us Human, Harvard primatologist Richard Wrangham writes that the very thing that allowed for the evolution of Homo sapiens was the ability to transform raw ingredients into edible nutrients through cooking.
Today, our behavior and feelings around food are the product of local climate, crops, animal populations, and tools, but also religion, tradition, and superstition. So what happens when you add science to the mix? Turns out, we still trend toward the familiar. The innovations in lab-grown meat that are picking up the most steam are foods like burgers, not meat chips, and salmon, not salmon-cod-tilapia hybrids. It's not for lack of imagination; it's because the industry's practitioners know that a lifetime of food memories is a hard thing to contend with. So far, the nascent lab-grown meat industry is not so much disrupting as being shaped by the oldest culture we have.
Not a single piece of lab-grown meat is commercially available to consumers yet, and already so much ink has been spilled debating if it's really meat, if it's kosher, if it's vegetarian, if it's ethical, if it's sustainable. But whether or not the industry succeeds and sticks around is almost moot -- watching these conversations and innovations unfold serves as a mirror reflecting back who we are, what concerns us, and what we aspire to.
The More Things Change, the More They Stay the Same
The building blocks for making lab-grown meat right now are remarkably similar, no matter what type of animal protein a company is aiming to produce.
First, a small biopsy, about the size of a sesame seed, is taken from a single animal. Then the muscle cells are isolated and added to a nutrient-dense culture medium in a bioreactor -- the same kind of vessel used to brew beer -- where the cells can multiply, grow, and form muscle tissue. This tissue can then be mixed with additives like nutrients, seasonings, binders, and sometimes colorings to form a food product. Whether a company is attempting to make chicken, fish, beef, shrimp, or any other animal protein in a lab, the basic steps remain similar. Cells from different animals do behave differently, though, and each company has its own proprietary techniques and tools. Some, for example, use fetal calf serum in their culture medium, while others, aiming for a more vegan approach, eschew it.
"New gadgets feel safest when they remind us of other objects that we already know."
According to Mark Post, who made the first lab-grown hamburger at Maastricht University in the Netherlands in 2013, the cells of just one cow can yield 175 million four-ounce burgers. By conventional methods, you'd need to slaughter 440,000 cows for the same result. The projected difference in purely material efficiency between the two systems is staggering. The environmental impact is harder to predict, though. Some companies claim that their lab-grown meat requires 99 percent less land and 96 percent less water than traditional farming methods -- and that rearing fewer cows, specifically, would reduce methane emissions -- but the energy cost of running a lab-grown-meat production facility at industrial scale, especially compared to small-scale, pasture-raised farming, could be problematic. It's difficult to truly measure any of this in a burgeoning industry.
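Those two numbers imply a stark per-animal gap. As a quick back-of-the-envelope check -- using only the figures quoted above, not any company's actual yield data -- the conventional system works out to roughly 400 burgers per slaughtered cow, versus 175 million from a single biopsy:

```python
# Back-of-the-envelope arithmetic using the figures quoted above
# (a sketch of the comparison, not any company's actual yield data).
lab_burgers_per_biopsied_cow = 175_000_000  # Post's projection from one cow's cells
cows_slaughtered_for_same_output = 440_000  # conventional equivalent, per the article

burgers_per_slaughtered_cow = lab_burgers_per_biopsied_cow / cows_slaughtered_for_same_output
print(round(burgers_per_slaughtered_cow))  # prints 398
```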
At this point, growing something like an intact shrimp tail or a marbled steak in a lab is still a Holy Grail. It would require reproducing the complex musculoskeletal and vascular structure of meat, not just the cellular basis, and no one's successfully done it yet. Until then, many companies working on lab-grown meat are perfecting mince. Each new company's demo of a prototype food feels distinctly regional, though: At the Disruption in Food and Sustainability Summit in March, Shiok (which is pronounced "shook," and is Singaporean slang for "very tasty and delicious") first shared a prototype of its shrimp as an ingredient in siu-mai, a dumpling of Chinese origin and a fixture at dim sum. JUST, a company based in the U.S., produced a demo chicken nugget.
As Jean Anthelme Brillat-Savarin, the 18th-century French gastronome who founded the gastronomic essay, famously said, "Show me what you eat, and I'll tell you who you are."
For many of these companies, the baseline animal protein they are trying to innovate on also feels tied to place and culture. When meat comes from a bioreactor rather than a farm, the world's largest exporter of seafood could be a landlocked region, and beef could be "reared" in a bayou. Yet the handful of lab-grown fish companies, like Finless Foods and BlueNalu, hug the American coasts; VOW, based in Australia, started making lab-grown kangaroo meat in August; and, of course, the world's first lab-grown shrimp is in Singapore.
"In the U.S., shrimps are either seen in shrimp cocktail, shrimp sushi, and so on, but [in Singapore] we have everything from shrimp paste to shrimp oil," Sriram says. "It's used in noodles and rice, as flavoring in cup noodles, and in biscuits and crackers as well. It's seen in every form, shape, and size. It just made sense for us to go after a protein that was widely used."
It's tempting to assume that innovating on pillars of cultural significance might be easier if the focus were on a whole new kind of food to begin with, not your popular dim sum items or fast food offerings. But it's proving to be quite the opposite.
"That could have been one direction where [researchers] just said, 'Look, it's really hard to reproduce raw ground beef. Why don't we just make something completely new, like meat chips?'" says Mike Lee, co-founder and co-CEO of Alpha Food Labs, which works on food innovation more broadly. "While that strategy's interesting, I think we've got so many new things to explain to people that I don't know if you want to also explain this new format of food that you've never, ever seen before."
We've seen this same cautious approach to change before in other aspects of cooking. Perhaps the most obvious example is the kitchen range. As Bee Wilson writes in her book Consider the Fork: A History of How We Cook and Eat, in the 1880s, convincing ardent coal-range users to switch to newfangled gas was a hard sell. To win them over, inventor William Sugg designed a range that ran on gas but looked like the coal ranges already in fashion at the time -- and which in some visual ways harked even further back to the days of open-hearth cooking. Over time, gas range designs moved further away from those of the past, but the initial jump was only made possible through familiarity. There's a cleverness to meeting people where they are.
"New gadgets feel safest when they remind us of other objects that we already know," writes Wilson. "It is far harder to accept a technology that is entirely new."
Maybe someday we won't want anything other than meat chips, but not today.
Measuring Success
A 2018 Gallup poll shows that in the U.S., rates of true vegetarianism and veganism have been stagnant for as long as they've been measured. When the poll began in 1999, six percent of Americans were vegetarian, a figure that remained steady until 2012, when it dropped one point. As of 2018, it remained at five percent.
In 2012, when Gallup first measured the percentage of vegans, the rate was two percent. By 2018 it had gone up just one point, to three percent. Increasing awareness of animal welfare, health, and environmental concerns doesn't seem to be incentive enough to convince Americans, en masse, to completely slam the door on a food culture characterized in many ways by its emphasis on traditional meat consumption.
"A lot of consumers get over the ick factor when you tell them that most of the food that you're eating right now has entered the lab at some point."
Wilson writes that "experimenting with new foods has always been a dangerous business. In the wild, trying out some tempting new berries might lead to death. A lingering sense of this danger may make us risk-averse in the kitchen."
That might be one psychologically deep-seated reason that Americans are so resistant to ditching meat altogether. But a middle ground is emerging with the rise of flexitarianism, which aims to reduce reliance on traditional animal products. "Americans are eager to include alternatives to animal products in their diets, but are not willing to give up animal products completely," the same 2018 Gallup poll reported. This may represent the best opportunity for lab-grown meat to wedge itself into the culture.
Quantitatively predicting a population's willingness to try a lab-grown version of its favorite protein is proving difficult, however, because the product is still science fiction to the average consumer. Measuring popular opinion of something that doesn't really exist yet is a dubious pastime.
In 2015, University of Wisconsin School of Public Health researchers Linnea Laestadius and Mark Caldwell conducted a study using online comments on articles about lab-grown meat to suss out public response to the food. The results showed a mostly negative attitude, but that was only two years into a field that is six years old today. Already public opinion may have shifted.
Shiok Meats' Sriram and her co-founder Ka Yi Ling have used online surveys to get a sense of the landscape, but they also sometimes take a more direct approach. Every time they give a public talk about their company and their shrimp, they poll the audience before and after, asking, "How many of you are willing to try, and pay, to eat lab-grown meat?"
They consistently find that the percentage of people willing to try goes up from 50 to 90 percent after hearing their talk, which includes information about the downsides of traditional shrimp farming (for one thing, many shrimp are raised in sewage, and peeled and deveined by slaves) and a bit about how lab-grown animal protein is currently made. I saw this play out myself when Ling spoke at a New Harvest conference in Cambridge, Massachusetts in July.
"A lot of consumers get over the ick factor when you tell them that most of the food that you're eating right now has entered the lab at some point," Sriram says. "We're not going to grow our meat in the lab always. It's in the lab right now, because we're in R&D. Once we go into manufacturing ... it's going to be a food manufacturing facility, where a lot of food comes from."
The downside of the University of Wisconsin's and Shiok Meats' approaches to capturing public opinion is that each looks at a self-selecting group: Online commenters are often fueled by a need to complain, and anyone attending a talk by the co-founders of a lab-grown meat company likely already has some level of open-mindedness.
So Sriram says that she and Ling are also using another method to assess the landscape, one somewhere in the middle: watching public responses to the closest product to lab-grown meat on the market, the Impossible Burger. As a 100 percent plant-based burger, it's not quite the same, but this bleedable, searable patty is still very much a product of science and laboratory work. Its remarkable similarity to beef is courtesy of yeast genetically engineered with DNA from soy plant roots; as they multiply, the yeast produce soy leghemoglobin, a protein that carries heme, the same iron-containing molecule that gives animal muscle its meaty look and taste.
So far, the sciencey underpinnings of the burger don't seem to be turning people off. In just four years, it's already found its place among other American food icons. It's readily available everywhere from nationwide Burger Kings to Boston's Warren Tavern, which has been in operation since 1780, is one of the oldest pubs in America, and is even named after the man who sent Paul Revere on his midnight ride. Some people have already grown so attached to the Impossible Burger that they will actually walk out of a restaurant that's out of stock. Demand for the burger is outpacing production.
"Even though [Impossible] doesn't consider their product cellular agriculture, it's part of a spectrum of innovation," Krueger says. "There are novel proteins that you're not going to find in your average food, and there's some cool tech there. So to me, that does show a lot of willingness on people's part to think about trying something new."
The message for those working on animal-based lab-grown meat is clear: People will accept innovation on their favorite food if it tastes good enough and evokes the same emotional connection as the real deal.
"How people talk about lab-grown meat now, it's still a conversation about science, not about culture and emotion," Lee says. But he's confident that the conversation will start to shift in that direction if the companies doing this work can nail the flavor memory, above all.
And then, proving how much power flavor holds over us, we quickly derail into a conversation about Doritos, which he calls "maniacally delicious." The chips have no nutritional value whatsoever and are purely a product of food engineering and manufacturing -- just watch how hard Bon Appétit associate food editor Claire Saffitz has to work to recreate them in the magazine's test kitchen -- yet devotees remain unfazed and crunch on.
"It's funny because it shows you that people don't ask questions about how [some foods] are made, so why are they asking so many questions about how lab-grown meat is made?" Lee asks.
For all the hype around the Impossible Burger, there are still controversies and hand-wringing around lab-grown meat. Some people are grossed out by the idea, some are confused, and if you're the U.S. Cattlemen's Association (USCA), you're territorial. Last year, the group sent a petition to the USDA to "exclude products not derived directly from animals raised and slaughtered from the definition of 'beef' and 'meat.'"
"I think we are probably three or four big food safety scares away from everyone, especially younger generations, embracing lab-grown meat as like, 'Science is good; nature is dirty, and can kill you.'"
"I have this working hypothesis that if you look at the nation in 50-year spurts, we revolve back and forth between artisanal, all-natural food that's unadulterated and pure, and food that's empowered by science," Lee says. "Maybe we've only had one lap around the track on that, but I think we are probably three or four big food safety scares away from everyone, especially younger generations, embracing lab-grown meat as like, 'Science is good; nature is dirty, and can kill you.'"
Food culture goes beyond just the ingredients we know and love -- it's also about how we interact with them, produce them, and expect them to taste and feel when we bite down. We accept a margin of difference among a fast food burger, a backyard burger from the grill, and a gourmet burger. Maybe someday we'll accept the difference between a burger created by killing a cow and a burger created by biopsying one.
Looking to the Future
Every time we engage with food, "we are enacting a ritual that binds us to the place we live and to those in our family, both living and dead," Wilson writes in Consider the Fork. "Such things are not easily shrugged off. Every time a new cooking technology has been introduced, however useful … it has been greeted in some quarters with hostility and protestations that the old ways were better and safer."
This is why it might be hard for a vegetarian mother to try her daughter's lab-grown shrimp, no matter how ethically it was produced or how awe-inspiring the invention is. Yet food cultures can and do change. "They're not these static things," says Benjamin Wurgaft, a historian whose book Meat Planet: Artificial Flesh and the Future of Food comes out this month. "The real tension seems to be between slow change and fast change."
In fact, the very definition of the word "meat" has never exclusively meant what the USCA wants it to mean. Before the 12th century, when it first appeared in Old English as "mete," it wasn't very specific at all and could be used to describe anything from "nourishment," to "food item," to "fodder," to "sustenance." By the 13th century it had been narrowed down to mean "flesh of warm-blooded animals killed and used as food." And yet the British mincemeat pie lives on as a sweet Christmas treat full of -- to the surprise of many non-Brits -- spiced, dried fruit. Since 1901, we've also used this word with ease as a general term for anything that's substantive -- as in, "the meat of the matter." There is room for yet more definitions to pile on.
"The conversation [about lab-ground meat] has changed remarkably in the last six years," Wurgaft says. "It has become a conversation about whether or not specific companies will bring a product to market, and that's a really different conversation than asking, 'Should we produce meat in the lab?'"
As part of the field research for his book, Wurgaft visited the Rijksmuseum Boerhaave, a Dutch museum that specializes in the history of science and medicine. It was 2015, and he was there to see an exhibit on the future of food. Just two years earlier, Mark Post had made that first lab-grown hamburger about a two-and-a-half hour drive south of the museum. When Wurgaft arrived, he found the novel invention, which Post had donated to the museum, already preserved and served up on a dinner plate, the whole outfit protected by plexiglass.
"They put this in the exhibit as if it were already part of the historical records, which to a historian looked really weird," Wurgaft says. "It looked like somebody taking the most recent supercomputer and putting it in a museum exhibit saying, 'This is the supercomputer that changed everything,' as if you were already 100 years in the future, looking back."
It seemed to symbolize an effort to codify a lab-grown hamburger as a matter of Dutch pride, perhaps someday occupying a place in people's hearts right next to the stroopwafel.
"Who's to say that we couldn't get a whole school of how to cook with lab-grown meat?"
Lee likes to imagine that part of the legacy of lab-grown meat, if it succeeds, will be to inspire entirely new fads in cooking -- a step beyond ones like the crab-filled avocado of the 1960s or the pesto of the 1980s in the U.S.
"[Lab-grown meat] is inherently going to be a different quality than anything we've done with an animal," he says. "Look at every cut [of meat] on the sphere today -- each requires a slightly different cooking method to optimize the flavor of that cut. Who's to say that we couldn't get a whole school of how to cook with lab-grown meat?"
At this point, most of us have no way of trying lab-grown meat. It remains exclusively available through sometimes gimmicky demos reserved for investors and the media. But Wurgaft says the stories we tell about this innovation, the articles we write, the films we make, and yes, even the museum exhibits we curate, all hold as much cultural significance as the product itself might someday.
New study: Hotter nights from climate change cause sleep loss, with some people affected more than others
Data from the National Sleep Foundation suggest that the optimal bedroom temperature for sleep is around 65 degrees Fahrenheit. But we may be getting fewer hours of "good sleepin' weather" as the climate warms, according to a recent paper from researchers at the University of Copenhagen in Denmark.
Published in One Earth, the study finds that heat related to climate change could provide a "pathway" to sleep deprivation. The authors say the effect is "substantially larger" for those in lower-income countries. Hours of sleep decline when nighttime temperatures exceed 50 degrees Fahrenheit, and temperatures above 77 degrees reduce the chances of getting seven hours of sleep by 3.5 percent. Even small losses associated with rising temperatures contribute significantly to people not getting enough sleep.
We’re affected by high temperatures at night because body temperature becomes more sensitive to the environment when slumbering. “Mechanisms that control for thermal regulation become more disordered during sleep,” explains Clete Kushida, a neurologist, professor of psychiatry at Stanford University and sleep medicine clinician.
The study finds that women and older adults are especially vulnerable. Worldwide, the elderly lost over twice as much sleep per degree of warming compared to younger people. This phenomenon was apparent between the ages of 60 and 70, and it increased beyond age 70. “The mechanism for balancing temperatures appears to be more affected with age,” Kushida adds.
Others disproportionately affected include people who live in regions with higher greenhouse gas (GHG) emissions, which accelerate climate change, and people in hotter locales, who will lose more sleep per degree of warming, according to the study. One might think that those in warmer countries could adapt to the heat, but the researchers found no evidence of such adjustment. "We actually found those living in the warmest climate regions were impacted over twice as much as those in the coldest climate regions," says the study's lead author, Kelton Minor, a Ph.D. candidate at the University of Copenhagen's Center for Social Data Science.
Short sleep can reduce cognitive performance and productivity, increase absenteeism from work or school, and lead to a host of other physical and psychosocial problems. These issues include a compromised immune system, hypertension, depression, anger and suicide, say the study’s authors. According to a fact sheet by the U.S. Centers for Disease Control and Prevention, a third of U.S. adults already report sleeping fewer hours than the recommended amount, even though sufficient sleep “is not a luxury—it is something people need for good health.”
Beyond global health, a sleep-deprived world will impact the economy as the climate warms. “Less productivity at work, associated with sleep loss or deprivation, would result in more sick days on a global scale, not just in individual countries,” Kushida says.
Unlike previous research that measured sleep patterns with self-reported surveys and controlled lab experiments, the study in One Earth offers a global analysis based on sleep-tracking wristbands, linking more than seven million sleep records from 47,628 adults across 68 countries to local, daily meteorological data -- offering new insight into the environment's impact on human sleep. Controlling for individual, seasonal, and time-varying confounds, the researchers found that the main way higher temperatures shorten slumber is by delaying sleep onset.
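To make "controlling for individual, seasonal, and time-varying confounds" concrete, here is a minimal, hypothetical sketch of that style of analysis in Python -- a toy fixed-effects regression, not the authors' actual pipeline. The file name and column names are invented for illustration.

```python
# Toy fixed-effects regression in the spirit of the study's design
# (illustrative only; "sleep_records.csv" and its columns are hypothetical).
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("sleep_records.csv", parse_dates=["date"])
df["month"] = df["date"].dt.month

# Dummy variables for each person and calendar month absorb stable
# individual differences and seasonality; the coefficient on tmin_c is
# then the average change in nightly sleep (minutes) per degree Celsius
# of nighttime temperature.
model = smf.ols("nightly_minutes ~ tmin_c + C(person_id) + C(month)", data=df).fit()
print(model.params["tmin_c"])
```

A negative coefficient would correspond to the paper's finding that warmer nights mean less sleep; the real study works with millions of records and far richer controls than this sketch.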
Heat effects on sleep were seen in industrialized countries, including those with access to air conditioning, the study notes. Air conditioning may buffer high indoor temperatures, but it also increases GHG emissions and ambient heat displacement, thereby exacerbating the unequal burdens of global and local warming. Continued urbanization is expected to compound these problems.
Previous sleep studies have found an inverse U-shaped response to temperature in highly controlled settings, with subjects sleeping worse when room temperatures were either too cold or too warm. However, “people appear far better at adapting to colder outside temperatures than hotter conditions,” says Minor.
Although there are ways of countering the heat effect, some populations have more access to them. “Air conditioning can help with the effect of higher temperature, but not all individuals can afford air conditioners,” says Kushida. He points out that this could drive even greater inequity between higher- and lower-income countries.
Equitable policy and planning are needed to ensure equal access to cooling technologies in a warming world. “Clean and renewable energy systems and interventions will be needed to mitigate and adapt to ongoing climate warming,” Minor says. Future research should investigate “policy, planning and design innovation,” which could reduce the impact of sweltering temperatures on a good night’s sleep for the good of individuals, society and our planet, asserts the study.
If warming continues unabated on its current trajectory, suboptimal temperatures could shave 50 to 58 hours of sleep per person per year by 2099, the study authors predict. "Down the road, as technology develops, there might be ways of enabling people to adapt on a large scale to these higher temperatures," says Kushida. "Right now, it's not there."
Why we need to get serious about ending aging
It is widely acknowledged that even a small advance in anti-aging science could yield benefits, in terms of healthy years, that the traditional paradigm of targeting specific diseases is not likely to produce. A more youthful population would also be less vulnerable to epidemics: Approximately 93 percent of all COVID-19 deaths reported in the U.S. occurred among those aged 50 or older. The potential economic benefits would be tremendous. A more youthful population would consume fewer medical resources and be able to work longer. A recent study published in Nature Aging estimates that a slowdown in aging that increases life expectancy by one year would be worth $38 trillion to the U.S. alone.
A societal effort to understand, slow down, arrest, or even reverse aging, at least as large as our response to COVID-19, would therefore be a rational commitment. In fact, given that America's older population is projected to grow dramatically, and the cost of healthcare with it, it is not an overstatement to say that the future welfare of the country may depend on solving aging.
This year, the Kingdom of Saudi Arabia announced that it will spend up to $1 billion per year on science with the potential to slow the aging process. We have also seen major investments from billionaires like Google co-founder Larry Page, Amazon founder Jeff Bezos, business magnate Larry Ellison, and PayPal co-founder Peter Thiel.
The U.S. government, however, is lagging: The National Institutes of Health spent less than one percent of its $43 billion budget for fiscal year 2021 on the National Institute on Aging's Division of Aging Biology. Visit the division's webpage and you'll find a mission statement that carefully omits any mention of the possibility of slowing the aging process.
There is a lack of political will and leadership on the issue, and the idea that we should seek to do something about aging is generally met with a great deal of suspicion and trepidation. In a large representative study conducted by the Pew Research Center in 2013, only 38 percent of respondents said that they would want a treatment that could slow the aging process and allow them to live to at least 120. Apparently, most people prefer, or at least do not mind, aging and dying within a natural lifespan. This result has been confirmed by smaller studies, and it is, I think, surprising. Are we not supposed to live in a youth culture? Are people not supposed to want to stay young and alive forever? Is self-preservation not the strong drive we have always assumed it to be?
In my book, The Case against Death, I suggest that we have been culturally conditioned to think that it is virtuous to accept aging and death. We are taught to believe that although aging and death seem gruesome, they are what is best for us, all things considered. This is what we are supposed to think, and the majority accept it. I call this the Wise View because death acceptance has been the dominant view of philosophers since the beginning. Socrates compared our earthly life to an illness and a prison and described death as a healer and a liberator. The Buddha taught that life is suffering and that the way to escape suffering is to end the cycle of birth, death and rebirth. Stoic philosophers from Zeno to Marcus Aurelius believed that everything that happens in accordance with nature is good, and that therefore we should not only accept death but welcome it as an aspect of a perfect totality.
Epicureans agreed with these rival schools, famously arguing that death cannot harm us because where we are, death is not, and where death is, we are not. We cannot be harmed if we are not, so death is harmless. The simple view that death actually can harm us greatly is, by this tradition's lights, one of the least philosophical views one can hold.
Many of the stories we tell promote the Wise View. One of the earliest known pieces of literature, the Epic of Gilgamesh, follows Gilgamesh on a quest for eternal life that ends with the wisdom that death is the destiny of man. Today we learn about the tedium of immortality from the children's book Tuck Everlasting by Natalie Babbitt, and we are warned about the vice of wanting to resist death in other books and films: J.K. Rowling's Harry Potter, where Voldemort must kill Harry as a step toward his own immortality; C.S. Lewis' The Chronicles of Narnia, where the White Witch has gained immortal youth and madness in equal measure; J.R.R. Tolkien's Lord of the Rings trilogy, where the ring extends the wearer's life but can also destroy them, as exemplified by the creep Gollum; and Doctor Strange, where life extension is the one magical power that is taboo. In Star Wars, Yoda, a stereotype of the sage, teaches us the wisdom handed down by philosophers and prophets: "Death is a natural part of life. Rejoice for those around you who transform into the Force. Mourn them do not. Miss them do not."
We are inundated and saturated with an ideology of death-acceptance. Can the dear reader name one single story where the hero is pursuing anti-aging, longevity or immortality and the villain tries to stop her?
The Wise View resonates with us partly because we think that there is nothing we can do about aging and death, so we do not want to wish for what we cannot have. Youth and immortality are sour grapes to us. Believing that death is, all things considered, not such a bad thing, protects us from experiencing our aging and approaching death as a gruesome tragedy. This need to escape the thought that we are heading towards a personal catastrophe explains why many are so quick to accept arguments against radical life extension, despite their often glaring weaknesses.
One of the most common objections to radical life extension is that aging and death are natural. The problem with this argument is that many natural things are very bad, such as cancer, and other things that are not natural are very good, such as a cure for cancer. Why are we so sure that cancer is bad? Because we assume that it is bad to die. Indeed, nothing is more natural than wanting to live. We seem to need philosophers and storytellers to talk us out of it and, in the words of a distinguished bioethicist, "instruct and somewhat moderate our lust for life."
Another standard objection is that we need a deadline, and that without death we could postpone every action forever. "Death brings urgency and seriousness to life," say proponents of this view. But there are several problems with this argument. Even if our lives were endless, there would still be many things we would have to do at a certain time, and that could not be redone -- for example, saving our planet from being destroyed, or becoming the first person on Venus. And if we prefer pleasant endless lives over unpleasant ones, we will still have to exercise, eat right, keep our word, develop our talents, show up on time for work, pay our taxes by the due date, remember birthdays, and so on.
Besides, even if we succeeded in ending aging, we would still die from other causes. Given the rate of accidental deaths, we would be fortunate to live to age 2,000, all things being equal. So even if, contrary to what I have argued, we do need a deadline, we can still argue that the natural lifespan we now labor under is inhuman, and that it forces each human to limit her ambitions and become only a fragment of all that she could have been. Our tight time constraint imposes tragic choices and inflated opportunity costs. Death does not make life matter; it makes time matter.
The perhaps most awful argument against radical life extension is grounded in a pessimism that holds life in such low regard that it says the best of all is never to have been born. This view was expressed by Ecclesiastes in the Hebrew Bible, by Sophocles and several other ancient Greeks, by the German philosopher Arthur Schopenhauer, and recently by, among others, the South African philosopher David Benatar, who argues that it is wrong to bring children into the world and that it would be better if all sentient life ceased to exist. Pessimism, one suspects, largely appeals to some for reasons of personal temperament, but insofar as it is built on factual beliefs, those can be addressed: by providing a less negatively biased understanding of the world, by pointing out that curing aging would decrease the very badness pessimists are so hypersensitive to, and by reminding them that if life really becomes unbearable, they are free to quit at any time. Other means of persuasion could include recommending sleep, exercise, and long, brisk walks in nature.
The Wise View provides us with a feel-good bromide for the anxiety created by the foreknowledge of our decay and death by telling us that these are not evils, but blessings in disguise. Once perhaps an innocuous delusion, today the view stands in the way of a necessary societal commitment to research that can prolong our healthy lives. We need to abandon it and openly admit that aging is a scourge that deserves to be fought with combined energies equal to those expended on fighting COVID-19, Alzheimer's disease, cancer, stroke, and all the other illnesses for which aging is the greatest risk factor. The fight to end aging transcends ordinary political boundaries and is therefore the kind of grand unifying enterprise that could re-energize a society suffering from divisiveness and a lack of common purpose. It is hard to imagine a more worthwhile cause.