Alzheimer’s prevention may be less about new drugs, more about income, zip code and education
That your risk of Alzheimer’s disease depends on your salary, what you ate as a child, or the block where you live may seem implausible. But researchers are discovering that social determinants of health (SDOH) play an outsized role in Alzheimer’s disease and related dementias, possibly more than age, and new strategies are emerging for how to address these factors.
At the 2022 Alzheimer’s Association International Conference, a series of presentations offered evidence that a string of socioeconomic factors—such as employment status, social support networks, education and home ownership—significantly affected dementia risk, even after adjusting for genetic risk. What’s more, memory declined more rapidly in people who earned lower wages and more slowly in people whose parents had higher socioeconomic status.
In 2020, a first-of-its-kind study in JAMA linked Alzheimer’s incidence to “neighborhood disadvantage,” a measure based on SDOH indicators. Analyzing brain tissue from autopsies, researchers found an association between these indicators and markers related to Alzheimer’s. In 2022, Ryan Powell, the lead author of that study, published further findings that neighborhood disadvantage was connected with having more neurofibrillary tangles and amyloid plaques, the main pathological features of Alzheimer's disease.
As of yet, little is known about the biological processes behind this, says Powell, director of data science at the Center for Health Disparities Research at the University of Wisconsin School of Medicine and Public Health. “We know the association but not the direct causal pathway.”
The corroborative findings keep coming. In a Nature study published a few months after Powell’s, every social determinant investigated affected Alzheimer’s risk except marital status. The links were strongest for income, education, and occupational status.
The potential for prevention is significant. One in three older adults dies with Alzheimer's or another dementia, which together kill more people than breast cancer and prostate cancer combined. Further, a 2020 report from the Lancet Commission determined that about 40 percent of dementia cases could theoretically be prevented or delayed by managing the risk factors that people can modify.
Take inactivity: in a 2022 JAMA study, older adults who took 9,800 steps daily were half as likely to develop dementia over the following seven years. Hearing loss, another manageable risk factor, accounts for about 9 percent of dementia cases.
Clinical trials on new Alzheimer’s medications get all the headlines but preventing dementia through policy and public health interventions should not be underestimated. Simply slowing the course of Alzheimer’s or delaying its onset by five years would cut the incidence in half, according to the Global Council on Brain Health.
Minorities Hit the Hardest
The World Health Organization defines SDOH as “conditions in which people are born, work, live, and age, and the wider set of forces and systems shaping the conditions of daily life.”
Anyone who subsists on processed food, smokes cigarettes, or skimps on sleep has a heightened risk of dementia. But minority groups are hit harder. Older Black Americans are twice as likely to have Alzheimer’s or another form of dementia as white Americans; older Hispanics are about one and a half times more likely.
This is due in part to higher rates of diabetes, obesity, and high blood pressure in these communities. These diseases are linked to Alzheimer’s, and SDOH factors multiply the risks. Black and Hispanic Americans earn less on average than white Americans, which means they are more likely to live in neighborhoods with limited access to healthy food, medical care, and good schools, and to suffer greater exposure to noise (which impairs hearing) and air pollution—additional risk factors for dementia.
Plus, when Black people are diagnosed with dementia, their cognitive impairment and neuropsychiatric symptoms are more advanced than in white patients. Why? Some African Americans delay seeing a doctor because of perceived discrimination and a sense that they will not be heard, says Carl V. Hill, chief diversity, equity, and inclusion officer at the Alzheimer’s Association.
Misinformation about dementia is another issue in Black communities. Many assume Alzheimer’s is purely genetic or age-related, not realizing that diet and physical activity can improve brain health, Hill says.
African Americans are severely underrepresented in clinical trials for Alzheimer’s, too. So, researchers miss the opportunity to learn more about health disparities. “It’s a bioethical issue,” Hill says. “The people most likely to have Alzheimer’s aren’t included in the trials.”
The Cure: Systemic Change
People think of lifestyle as a choice but there are limitations, says Muniza Anum Majoka, a geriatric psychiatrist and assistant professor of psychiatry at Yale University, who published an overview of SDOH factors that impact dementia. “For a lot of people, those choices [to improve brain health] are not available,” she says. If you don’t live in a safe neighborhood, for example, walking for exercise is not an option.
Hill wants to see the focus of prevention shift from individual behavior change to ensuring everyone has access to the same resources. Advice about healthy eating only goes so far if someone lives in a food desert. Systemic change also means increasing the number of minority physicians and recruiting minorities in clinical drug trials so studies will be relevant to these communities, Hill says.
Based on SDOH impact research, raising education levels has the most potential to prevent dementia. One theory is that highly educated people have a greater brain reserve that enables them to tolerate pathological changes in the brain, thus delaying dementia, says Majoka. Being curious, learning new things and problem-solving also contribute to brain health, she adds. Plus, having more education may be associated with higher socioeconomic status, more access to accurate information and healthier lifestyle choices.
New Strategies
The chasm between what researchers know about brain health and how that knowledge is applied is huge. “There’s an explosion of interest in this area. We’re just in the first steps,” says Powell. He predicts that one day physicians will manage Alzheimer’s through precision medicine customized to a patient’s specific risk factors and needs.
Raina Croff, assistant professor of neurology at Oregon Health & Science University School of Medicine, created the SHARP (Sharing History through Active Reminiscence and Photo-imagery) walking program to forestall memory loss in African Americans with mild cognitive impairment or early dementia.
Participants and their caregivers walk in historically Black neighborhoods three times a week over six months. A smart tablet provides information about “Memory Markers” they pass, such as the route of a civil rights march. People celebrate their community and culture while “brain health is running in the background,” Croff says.
Photos and memory prompts engage participants in the SHARP program. (Photo: OHSU/Kristyna Wentz-Graff)
The project began in 2015 as a pilot study in Croff’s hometown of Portland, Ore., expanded to Seattle, and will soon start in Oakland, Calif. “Walking is good for slowing [brain] decline,” she says. A post-study assessment of 40 participants in 2017 showed that half had higher cognitive scores after the program; 78 percent had lower blood pressure; and 44 percent lost weight. Those with mild cognitive impairment showed the most gains. The walkers also reported improved mood and energy along with increased involvement in other activities.
It’s never too late to reap the benefits of working your brain and being socially engaged, Majoka says.
In Milwaukee, the Wisconsin Alzheimer’s Institute launched The Amazing Grace Chorus® to stave off cognitive decline in seniors. People in the early stages of Alzheimer’s practice and perform six concerts each year. The activity provides opportunities for social engagement, mental stimulation, and a support network. Among the benefits: 55 percent of participants reported better communication at home, and nearly half said they became involved in more activities after joining the chorus.
Private companies are offering intervention services to healthcare providers and insurers to manage SDOH, too. One such service, MyHello, makes calls to at-risk people to assess their needs—be it food, transportation or simply a friendly voice. Having a social support network is critical for seniors, says Majoka, noting there was a steep decline in cognitive function among isolated elders during Covid lockdowns.
About 1 in 9 Americans age 65 or older lives with Alzheimer’s today. With a surge in cases predicted, public health professionals have to think more broadly about resource targets and effective intervention points, Powell says.
Beyond breakthrough pills, that is. Like Dorothy discovering that happiness was in her own backyard all along, we are beginning to learn that preventing Alzheimer’s may be within our reach, if only we recognize it.
Podcast: A Nasal Spray COVID Booster Shot, With Dr. Akiko Iwasaki
The "Making Sense of Science" podcast features interviews with leading medical and scientific experts about the latest developments and the big ethical and societal questions they raise. This monthly podcast is hosted by journalist Kira Peikoff, founding editor of the award-winning science outlet Leaps.org.
Real-world data show that protection against Covid-19 infection wanes a few months after two or three shots of mRNA vaccines (while protection against severe disease remains high). But what if another kind of booster could shore up the immune response in your nose, the "door" to your body? Like bouncers at a club, a better-prepared nasal defense system could stop the virus in its tracks, mitigating illness as well as community spread. Dr. Akiko Iwasaki, an immunologist at Yale, is working on such a booster, with fantastic results recently reported in mice. In this episode, she shares the details of this important work.
Kira Peikoff was the editor-in-chief of Leaps.org from 2017 to 2021. As a journalist, her work has appeared in The New York Times, Newsweek, Nautilus, Popular Mechanics, The New York Academy of Sciences, and other outlets. She is also the author of four suspense novels that explore controversial issues arising from scientific innovation: Living Proof, No Time to Die, Die Again Tomorrow, and Mother Knows Best. Peikoff holds a B.A. in Journalism from New York University and an M.S. in Bioethics from Columbia University. She lives in New Jersey with her husband and two young sons. Follow her on Twitter @KiraPeikoff.
Technology is Redefining the Age of 'Older Mothers'
In October 2021, a woman from Gujarat, India, stunned the world when it was revealed she had her first child through in vitro fertilization (IVF) at age 70. She had been preceded by a compatriot who, two years earlier, gave birth to twins at age 73, again with the help of IVF treatment. The oldest known mother to conceive naturally lived in the UK; in 1997, Dawn Brooke conceived a son at age 59.
These women may seem like extreme outliers, almost freaks of nature; in the US, for example, the average age of first-time mothers is 26. A few decades from now, though, the sight of 70-year-old first-time mothers may not even raise eyebrows, say futurists.
“We could absolutely have more 70-year-old mothers because we are learning how to regulate the aging process better,” says Andrew Hessel, a microbiologist and geneticist, who cowrote "The Genesis Machine," a book about “rewriting life in the age of synthetic biology,” with Amy Webb, the futurist who recently wondered why 70-year-old women shouldn’t give birth.
Technically, we're already doing this, says Hessel, pointing to a technique known as in vitro gametogenesis (IVG). IVG refers to turning adult cells into sperm or egg cells. “You can think of it as the upgrade to IVF,” Hessel says. These vanguard stem cell research technologies can take even skin cells and turn them into induced pluripotent stem cells (iPSCs), which are basically master cells capable of maturing into any human cell, be it kidney cells, liver cells, brain cells or gametes, aka eggs and sperm, says Henry T. “Hank” Greely, a Stanford law professor who specializes in ethical, legal, and social issues in biosciences.
In 2016, Greely wrote "The End of Sex," a book in which he described the science of making gametes out of iPSCs in detail. Greely says science will indeed enable 70-year-old new mothers to mingle with mothers several decades younger at kindergartens in the not-so-distant future. And it won’t be that big of a deal.
“An awful lot of children all around the world have been raised by grandmothers for millennia. To have 70-year-olds and 30-year-olds mingling in maternal roles is not new,” he says. That said, he doubts that many women will want to have a baby in the eighth decade of their life, even if science allows it. “Having a baby and raising a child is hard work. Even if 1% of all mothers are over 65, they aren’t going to change the world,” Greely says. Mothers over 70 will be a minor blip, statistically speaking, he predicts. But one thing is certain: the technology is here.
And more technologies for the same purpose could be on the way. In March 2021, researchers from Monash University in Melbourne, Australia, published research in Nature in which they successfully reprogrammed skin cells into a three-dimensional cellular structure that was morphologically and molecularly similar to a human embryo: the iBlastoid. In compliance with Australian law and international guidelines referencing the “primitive streak rule,” which bans the use of embryos older than 14 days in scientific research, the Monash scientists stopped growing their iBlastoids in vitro on day 11.
“The research was both cutting-edge and controversial, because it essentially created a new human life, not for the purpose of a patient who's wanting to conceive, but for basic research,” says Lindsay Wu, a senior lecturer in the School of Medical Sciences at the University of New South Wales (UNSW), in Kensington, Australia. If you really want to prove that what you have created is an embryo, you need to let it develop into a viable baby. “This is the real proof in the pudding,” says Wu, who runs UNSW’s Laboratory for Ageing Research. But then you reach a stage where, for ethical reasons, you have to abort it. “Fiddling here a bit too much?” he asks. Wu believes there are less morally troubling ways to tackle the decline in fertility that comes with age.
He is actually working on them. Why is it that women in their mid- to late thirties, at peak physical health in almost every other regard, have trouble conceiving? Wu and his team posed that question in a research paper published in 2020 in Cell Reports. The simple answer is the egg cell. An average girl at puberty has between 300,000 and 400,000 eggs, while at around age 37 the same woman has only 25,000 eggs left. Things only go downhill from there. So, what torments the egg cells?
The UNSW team found that the levels of key molecules called NAD+ precursors, which are essential to the metabolism and genome stability of egg cells, decline with age. The team proceeded to add these vitamin-like substances back into the drinking water of reproductively aged, infertile lab mice, which then had babies.
“It's an important proof of concept,” says Wu. He is investigating how safe it is to replicate the experiment with humans in two ongoing studies. The ultimate goal is to restore the quality of egg cells that are left in patients in their late 30s and early- to mid-40s, says Wu. He sees the goal of getting pregnant for this age group as less ethically troubling, compared to 70-year-olds.
But what is ethical, anyway? “It is a tricky word,” says Hessel. He differentiates between ethics, which represent a personal position and may thus be more transient, and morality, the longer-lasting principles embraced across a society, such as “Thou shalt not kill.” Unprecedented advances often provoke fear and antagonism until time passes and they just become…ordinary. When IVF pioneer Landrum Shettles tried to perform IVF in 1973, the chairman of Columbia’s College of Physicians and Surgeons blocked the procedure at the last moment. Almost all countries in the world have IVF clinics today, and the global IVF services market is clearly a growth industry.
Besides, you don’t have a baby at 70 by accident: you really want it, Greely and Hessel agree. And by that age, mothers may be wiser and more financially secure, Hessel says (though he is quick to add that even the pregnancy of his own wife, who had her child at 40, was a high-risk one).
As a research question, figuring out whether older mothers are better than younger ones, or vice versa, entails too many confounding variables, says Greely. And why should we focus on who’s the better mother anyway? “We've had 70-year-old and 80-year-old fathers forever—why should people have that much trouble getting used to mothers doing the same?” Greely wonders. For some women, having a child at an old(er) age would be comforting; maybe that’s what matters.
And the technology to enable older women to have children is already here or coming very soon. That, perhaps, matters even more. Researchers have already created mice–and their offspring–entirely from scratch in the lab. “Doing this to produce human eggs is similar," says Hessel. "It is harder to collect tissues, and the inducing cocktails are different, but steady advances are being made." He predicts that the demand for fertility treatments will keep financing research and development in the area. He says that big leaps will be made if ethical concerns don’t block them: it is not far-fetched to believe that the first baby produced from lab-grown eggs will be born within the next decade.
In a 2020 op-ed in Stat, Greely argued that we’ve already overcome the technical barriers to human cloning, but no one's really talking about it. Likewise, scientists are working on enabling 70-year-old women to have babies, says Hessel, yet most commentators are keeping quiet about it. At least so far.