How Leqembi became the biggest news in Alzheimer’s disease in 40 years, and what comes next
A few months ago, Betsy Groves traveled less than a mile from her home in Cambridge, Mass., to give a talk to a group of scientists. The scientists, who worked for the pharmaceutical companies Biogen and Eisai, wanted to know how she lived her life, how she thought about her future, and what it was like when a doctor’s appointment in 2021 gave her the worst possible news. Groves, 73, has Alzheimer’s disease. She caught it early, through a lumbar puncture that showed evidence of amyloid, an Alzheimer’s hallmark, in her cerebrospinal fluid. As a way of dealing with her diagnosis, she joined the Alzheimer’s Association’s National Early-Stage Advisory Board, which helped her come to see her diagnosis as something she could use to help others.
After her talk, Groves stayed for lunch with the scientists, who were eager to put a face to their work. Biogen and Eisai were about to release the first drug to successfully combat Alzheimer’s in 40 years of experimental disaster. Their drug, which is known by the scientific name lecanemab and the marketing name Leqembi, was granted accelerated approval by the U.S. Food and Drug Administration last Friday, Jan. 6, after a study in 1,800 people showed that it reduced cognitive decline by 27 percent over 18 months.
It is no exaggeration to say that this result is a huge deal. The field of Alzheimer’s drug development has been absolutely littered with failures. Almost everything researchers have tried has tanked in clinical trials. “Most of the things that we've done have proven not to be effective, and it's not because we haven’t been taking a ton of shots at goal,” says Anton Porsteinsson, director of the University of Rochester Alzheimer's Disease Care, Research, and Education Program, who worked on the lecanemab trial. “I think it's fair to say you don't survive in this field unless you're an eternal optimist.”
As far back as 1984, a cure looked like it was within reach: Scientists discovered that the sticky plaques that develop in the brains of those who have Alzheimer’s are made up of a protein fragment called beta-amyloid. Buildup of beta-amyloid seemed to be sufficient to disrupt communication between, and eventually kill, memory cells. If that was true, then the cure should be straightforward: Stop the buildup of beta-amyloid; stop the Alzheimer’s disease.
It wasn’t so simple. Over the next 38 years, hundreds of drugs designed either to interfere with the production of abnormal amyloid or to clear it from the brain flamed out in trials. It got so bad that neuroscience drug divisions at major pharmaceutical companies (AstraZeneca, Pfizer, Bristol-Myers, GSK, Amgen) closed one by one, leaving the field to smaller, scrappier companies, like Cambridge-based Biogen and Tokyo-based Eisai. Some scientists began to dismiss the amyloid hypothesis altogether: If this protein fragment was so important to the disease, why didn’t ridding the brain of it do anything for patients? There was another abnormal protein that showed up in the brains of Alzheimer’s patients, called tau. Some researchers defected to the tau camp, or came to believe the proteins caused damage in combination.
The situation came to a head in 2021, when the FDA granted provisional approval to a drug called aducanumab, marketed as Aduhelm, against the advice of its own advisory council. The approval was based on proof that Aduhelm reduced beta-amyloid in the brain, even though one research trial showed it had no effect on people’s symptoms or daily life. Aduhelm could also cause serious side effects, like brain swelling and amyloid-related imaging abnormalities (known as ARIA, these are basically micro-bleeds that appear on MRI scans). Without a clear benefit to memory loss that would make these risks worth it, Medicare refused to pay for Aduhelm among the general population. Two congressional committees launched an investigation into the drug’s approval, citing corporate greed, lapses in protocol, and an unjustifiably high price. (Aduhelm was also produced by the pharmaceutical company Biogen.)
So far, Leqembi is like Aduhelm in that it has been given accelerated approval only for its ability to remove amyloid from the brain. Both are monoclonal antibodies that direct the immune system to attack and clear dysfunctional beta-amyloid. The difference is that, while that’s all Aduhelm was ever shown to do, Leqembi’s makers have already asked the FDA to give it full approval – a decision that would increase the likelihood that Medicare will cover it – based on data that show it also improves Alzheimer’s sufferers’ lives. Leqembi targets a different type of amyloid, a soluble version called “protofibrils,” and that appears to change the effect. “It can give individuals and their families three, six months longer to be participating in daily life and living independently,” says Claire Sexton, PhD, senior director of scientific programs & outreach for the Alzheimer's Association. “These types of changes matter for individuals and for their families.”
To be clear, Leqembi is not the cure Alzheimer’s researchers hope for. It does not halt or reverse the disease, and people do not get better. While the drug is the first to show clear signs of a clinical benefit, the scientific establishment is split on how much of a difference Leqembi will make in the real world. It has “a rather small effect,” wrote NIH Alzheimer’s researcher Madhav Thambisetty, MD, PhD, in an email to Leaps.org. “It is unclear how meaningful this difference will be to patients, and it is unlikely that this level of difference will be obvious to a patient (or their caregivers).” Another issue is cost: Leqembi will become available to patients later this month, but Eisai is setting the price at $26,500 per year, meaning that very few patients will be able to afford it unless Medicare chooses to reimburse them for it.
The same side effects that plagued Aduhelm are common in Leqembi treatment as well. In many patients, amyloid doesn’t just accumulate around neurons, it also forms deposits in the walls of blood vessels. Blood vessels that are shot through with amyloid are more brittle. If you infuse a drug that targets amyloid, brittle blood vessels in the brain can develop leakage that results in swelling or bleeds. Most of these come with no symptoms, and are only seen during testing, which is why they are called “imaging abnormalities.” But in situations where patients have multiple diseases or are prescribed incompatible drugs, they can be serious enough to cause death. The three deaths reported from Leqembi treatment (so far) are enough to make Thambisetty wonder “how well the drug may be tolerated in real world clinical practice where patients are likely to be sicker and have multiple other medical conditions in contrast to carefully selected patients in clinical trials.”
Still, there are reasons to be excited. A successful Alzheimer’s drug can pave the way for combination studies, in which patients try a known effective drug alongside newer, more experimental ones; or preventative studies, which take place years before symptoms occur. It also represents enormous strides in researchers’ understanding of the disease. For example, drug dosages have increased massively—in some cases quadrupling—from the early days of Alzheimer’s research. And patient selection for studies has changed drastically as well. Doctors now know that you’ve got to catch the disease early, through PET scans or CSF tests for amyloid, if you want any chance of changing its course.
Porsteinsson believes that earlier detection of Alzheimer’s disease will be the next great advance in treatment, a more important step forward than Leqembi’s approval. His lab already uses blood tests for different types of amyloid, for different types of tau, and for measures of neuroinflammation, neural damage, and synaptic health, and commercially available versions from companies like C2N, Quest, and Fujirebio are likely to hit the market in the next couple of years. “[They are] going to transform the diagnosis of Alzheimer's disease,” Porsteinsson says. “If someone is experiencing memory problems, their physicians will be able to order a blood test that will tell us if this is the result of changes in your brain due to Alzheimer's disease. It will ultimately make it much easier to identify people at a very early stage of the disease, where they are most likely to benefit from treatment.”
Learn more about new blood tests to detect Alzheimer's
Early detection can help patients for more philosophical reasons as well. Betsy Groves credits finding her Alzheimer’s early with giving her the space to understand and process the changes that were happening to her before they got so bad that she couldn’t. She has been able to update her legal documents and, through her role on the advisory board, help the Alzheimer’s Association with developing its programs and support services for people in the early stages of the disease. She still drives, and because she and her husband love to travel, they are hoping to get out of grey, rainy Cambridge and off to Texas or Arizona this spring.
Because her Alzheimer’s disease involves amyloid deposits (a “substantial portion” do not, says Claire Sexton, which is an additional complication for research), and has not yet reached an advanced stage, Groves may be a good candidate to try Leqembi. She says she’d welcome the opportunity to take it. If she can get access, Groves hopes the drug will give her more days to be fully functioning with her husband, daughters, and three grandchildren. Mostly, she avoids thinking about what the latter stages of Alzheimer’s might be like, but she knows the time will come when it will be her reality. “So whatever lecanemab can do to extend my more productive ways of engaging with relationships in the world,” she says, “I'll take that in a minute.”
Why we should put insects on the menu
I walked through the Dong Makkhai forest-products market, just outside of Vientiane, the laid-back capital of the Lao People’s Democratic Republic, or Lao PDR. Piled on rough display tables were varieties of six-legged wildlife: grasshoppers, small white crickets, house crickets, mole crickets, wasps, wasp eggs and larvae, dragonflies, and dung beetles. Some were roasted or fried, but in a few cases, still alive and scrabbling at the bottom of deep plastic bowls. I crunched on some fried crickets and larvae.
One stall offered Giant Asian hornets, both babies and adults. I suppressed my inner squirm and, in the interests of world food security and equity, accepted an offer of the soft, velvety larvae; they were smooth on the tongue and of a pleasantly cool, buttery-custard consistency. Because the seller had already given me a free sample, I felt obliged to buy a chunk of the nest with larvae and some dead adults, which the seller mixed with kaffir lime leaves.
The year was 2016 and I was in Lao PDR because Veterinarians without Borders/Vétérinaires sans Frontières-Canada had initiated a project on small-scale cricket farming. The intent was to organize and encourage rural women to grow crickets as a source of supplementary protein and sell them at the market for cash. As a veterinary epidemiologist, I had been trained to exterminate disease-spreading insects—Lyme disease-carrying ticks, kissing bugs that carry American sleeping sickness, and mosquitoes carrying malaria, West Nile and Zika. Now, as part of a global wave promoting insects as a sustainable food source, I was being asked to view arthropods as micro-livestock, and devise management methods to keep them alive and healthy. It was a bit of a mind-bender.
The 21st century wave of entomophagy, or insect eating, first surged in the early 2010s, promoted by a research centre at Wageningen University in the Netherlands, conferences organized by the Food and Agriculture Organization of the United Nations, and enthusiastic endorsements by culinary adventurers and celebrities from Europeanized cultures. Headlines announced that two billion people around the world already ate insects, and that if everyone adopted entomophagy we could reduce greenhouse gases, mitigate climate change, and rein in profligate land and water use associated with industrial livestock production.
Furthermore, eating insects was better for human health than eating beef. If we were going to feed the estimated nine billion people with whom we will share the earth in 2050, we would need to make some radical changes in our agriculture and food systems. As one author proclaimed, entomophagy presented us with a last great chance to save the planet.
More recent data suggest that the number of people who eat insects in various forms, though sizeable, may be closer to several hundred million. I knew that from several decades of international veterinary work. Sometimes, for me, insect eating has been simply a way of acknowledging cultural diversity. In 2010, in Kunming, a friend had served me deep-fried bamboo worms. I ate them to be polite. They tasted like French fries, but with heads. My friend said he preferred them chewier. I never thought about them much after that. I certainly had not thought about them as ingredients for human health.
Is consuming insects good for human health? Researchers over the past decade have begun to tease that apart. Some think it might not be useful to use the all-encompassing term “insect” at all; we don’t lump cows, pigs, and chickens into one culinary category. Which insects are we talking about? What are they fed? Were they farmed or foraged? Which stages of the insects are we eating? Do we eat them directly or roasted and ground up?
The overall research indicates that, in general, the usual farmed insects (crickets, locusts, mealworms, soldier fly larvae) have high levels of protein and other important nutrients. If insects are foraged by small groups in Laos, they provide excellent food supplements. Large-scale foraging in response to global markets can be incredibly destructive, but soldier fly larvae fed on food waste and used as a substitute for ground-up anchovies in feed for farmed fish (as Enterra Feed in Canada does) improve ecological sustainability.
Entomophagy alone might not save the planet, but it does give us an unprecedented opportunity to rethink how we produce and harvest protein.
The author enjoys insects from the Dong Makkhai forest-products market, just outside of Vientiane, the capital of the Lao Peoples Democratic Republic.
David Waltner-Toews
Between 1961 and 2018, world chicken production increased from 4 billion to 20 billion birds, pork production from 200 million to over 1 billion pigs, human populations doubled from 3.5 billion to more than 7 billion, and life expectancy rose (on average) from 52 to 72 years. These dramatic increases in food production are the result of narrowly focused scientific studies, identifying specific nutrients, antibiotics, vaccines and genetics. What has been missing is any sort of peripheral vision: what are the unintended consequences of our narrowly defined success?
If we look more broadly, we can see that this narrowly defined success led to industrial farming, which caused wealth, health and labor inequities; polluted the environment; and created grounds for disease outbreaks. Recent generations of Europeanized people inherited the ideas of eating cows, pigs and chickens, along with their products, so we were focused only on growing them as efficiently as possible. With insects, we have an exciting chance to start from scratch. Because, for Europeanized people, insect eating is so strange, we are given the chance to reimagine our whole food system in consultation with local experts in Asia and Africa (many of them villagers), and to bring together the best of both locally adapted food production and global distribution.
For this to happen, we will need to change the dietary habits of the big meat eaters. How can we get accustomed to eating bugs? There’s no one answer, but there are a few ways. In many cases, insects are ground up and added as protein supplements to foods like crackers or bars. In certain restaurants, the chefs want you to get used to seeing the bugs as you eat them. At Le Festin Nu in Paris, the Arlo Guthrie look-alike bartender poured me a beer and brought out five small plates, each featuring a different insect in a nest of figs, sun-dried tomatoes, raisins, and chopped dried tropical fruits: buffalo worms, crickets, large grasshoppers (all just crunchy and no strong flavour, maybe a little nutty), small black ants (sour bite), and fat grubs with a beak, which I later identified as palm weevil larvae, tasting a bit like dried figs.
Some entomophagy advertising has used esthetically pleasing presentations in classy restaurants. In London, at the Archipelago restaurant, I dined on Summer Nights (pan-fried chermoula crickets, quinoa, spinach and dried fruit), Love-Bug Salad (baby greens with an accompanying dish of zingy, crunchy mealworms fried in olive oil, chilis, lemon grass, and garlic), Bushman’s Cavi-Err (caramel mealworms, blinis, coconut cream and vodka jelly), and Medieval Hive (brown butter ice cream, honey and butter caramel sauce and a baby bee drone).
The Archipelago restaurant in London serves up a Love-Bug Salad: baby greens with an accompanying dish of zingy, crunchy mealworms fried in olive oil, chilis, lemon grass, and garlic.
David Waltner-Toews
Some chefs, like Tokyo-based Shoichi Uchiyama, try to entice people with sidewalk cooking lessons. Uchiyama's menu included hornet larvae, silkworm pupae, and silkworms. The silkworm cocoons were white and pink and yellow. We snipped off the ends and the pupae dropped out. My friend Zen Kawabata roasted them in a small pan over a camp stove in the street to get the "chaff" off. We made tea from the feces of worms that had fed on cherry blossoms—the tea smelled of the blossoms. One of Uchiyama-san’s assistants made noodles from buckwheat dough that included powdered whole bees.
At a book reading in a Tokyo bookstore, someone handed me a copy of the Japanese celebrity scandal magazine Friday, opened to an article celebrating the “charms of insect eating.” In a photo, scantily-clad girls were drinking vodka and nibbling giant water bugs dubbed toe-biters, along with pickled and fried locusts and butterfly larvae. If celebrities embraced bug-eating, others might follow. When asked to prepare an article on entomophagy for the high fashion Sorbet Magazine, I started by describing a clip of Nicole Kidman delicately snacking on insects.
Taking a page from the success story of McDonald’s, we might consider targeting children and school lunches. Kids don’t lug around the same dietary baggage as the grownups, and they can carry forward new eating habits for the long term. When I offered roasted crickets to my grandchildren, they scarfed them down. I asked my five-year-old granddaughter what she thought: she preferred the mealworms to the crickets – they didn’t have legs that caught in her teeth.
Entomo Farms in Ontario, the province where I live, was described in 2015 by Canadian Business magazine as North America’s largest supplier of edible insects for human consumption. When visiting, I popped some of their roasted crickets into my mouth. They were crunchy, a little nutty. Nothing to get squeamish over. Perhaps human consumption is indeed growing—my wife, at least, has joined me in my entomophagy adventures. When we celebrated our wedding anniversary at the Public Bar and Restaurant in Brisbane, Australia, the “Kang Kong Worms” and “Salmon, Manuka Honey, and Black Ants” seemed almost normal. Of course, the champagne helped.
For this podcast episode, my guest is Raina Plowright, one of the world’s leading researchers on how and why viruses sometimes jump from bats to humans. The intuition may be that bats are the bad guys in this situation, but the real culprits are more likely humans and the ways that we intrude on nature.
Plowright is a Cornell Atkinson Scholar and professor at Cornell in the Department of Public and Ecosystem Health in the College of Veterinary Medicine. Read her full bio here. For a shorter (and lightly edited) version of this conversation, you can check out my Q&A interview with Plowright in the single-issue magazine, One Health / One Planet, published earlier this month by Leaps.org in collaboration with the Aspen Institute and the Science Philanthropy Alliance.
In the episode, Plowright tells me about her global research team that is busy studying the complex chain of events between viruses originating in bats and humans getting infected with those viruses. She’s collecting samples from bats in Asia, Africa and Australia, which sounds challenging enough, but now consider the diligence required to parse out 1,400 different bat species.
We also discuss a high-profile paper that she co-authored last month arguing for greater investment in preventing pandemics in the first place instead of the current approach, which basically puts all of our eggs in the basket of trying to respond to these outbreaks after the fact. Investing in pandemic prevention is a small price to pay compared with millions of people killed and trillions of dollars spent during the response to COVID-19.
Listen to the Episode
Listen on Apple | Listen on Spotify | Listen on Stitcher | Listen on Amazon | Listen on Google
Raina Plowright, a disease ecologist at Cornell University, is taking blood and urine samples from hundreds of animals and using GPS tags to follow their movement.
Kelly Gorham