Alzheimer’s prevention may be less about new drugs, more about income, zip code and education
That your risk of Alzheimer’s disease depends on your salary, what you ate as a child, or the block where you live may seem implausible. But researchers are discovering that social determinants of health (SDOH) play an outsized role in Alzheimer’s disease and related dementias, possibly more than age, and new strategies are emerging for how to address these factors.
At the 2022 Alzheimer’s Association International Conference, a series of presentations offered evidence that a string of socioeconomic factors—such as employment status, social support networks, education and home ownership—significantly affected dementia risk, even when the data were adjusted for genetic risk. What’s more, memory declined more rapidly in people who earned lower wages and more slowly in people whose parents had higher socioeconomic status.
In 2020, a first-of-its-kind study in JAMA linked Alzheimer’s incidence to “neighborhood disadvantage,” a measure based on SDOH indicators. Analyzing brain tissue from autopsies, researchers found an association between these indicators and markers related to Alzheimer’s. In 2022, Ryan Powell, the lead author of that study, published further findings that neighborhood disadvantage was connected with having more neurofibrillary tangles and amyloid plaques, the main pathological features of Alzheimer’s disease.
As yet, little is known about the biological processes behind this, says Powell, director of data science at the Center for Health Disparities Research at the University of Wisconsin School of Medicine and Public Health. “We know the association but not the direct causal pathway.”
The corroborative findings keep coming. In a Nature study published a few months after Powell’s, every social determinant investigated affected Alzheimer’s risk except marital status. The associations were strongest for income, education, and occupational status.
The potential for prevention is significant. One in three older adults dies with Alzheimer’s or another dementia—conditions that kill more people than breast and prostate cancers combined. Further, a 2020 report from the Lancet Commission determined that about 40 percent of dementia cases could theoretically be prevented or delayed by managing the risk factors that people can modify.
Take inactivity. In a 2022 JAMA study, older adults who took 9,800 steps daily were half as likely to develop dementia over the following seven years. Hearing loss, another risk factor that can be managed, accounts for about 9 percent of dementia cases.
Clinical trials on new Alzheimer’s medications get all the headlines but preventing dementia through policy and public health interventions should not be underestimated. Simply slowing the course of Alzheimer’s or delaying its onset by five years would cut the incidence in half, according to the Global Council on Brain Health.
Minorities Hit the Hardest
The World Health Organization defines SDOH as the “conditions in which people are born, grow, live, work and age, and the wider set of forces and systems shaping the conditions of daily life.”
Anyone who exists on processed food, smokes cigarettes, or skimps on sleep has a heightened risk of dementia. But minority groups are hit harder. Older Black Americans are twice as likely as older white Americans to have Alzheimer’s or another form of dementia; older Hispanics are about one and a half times more likely.
This is due in part to higher rates of diabetes, obesity, and high blood pressure within these communities. These diseases are linked to Alzheimer’s, and SDOH factors multiply the risks. Black and Hispanic Americans earn less on average than white Americans, which means they are more likely to live in neighborhoods with limited access to healthy food, medical care, and good schools, and to suffer greater exposure to noise (which impairs hearing) and air pollution—additional risk factors for dementia.
Plus, when Black people are diagnosed with dementia, their cognitive impairment and neuropsychiatric symptoms are more advanced than in white patients. Why? Some African Americans delay seeing a doctor because of perceived discrimination and a sense that they will not be heard, says Carl V. Hill, chief diversity, equity, and inclusion officer at the Alzheimer’s Association.
Misinformation about dementia is another issue in Black communities. Many assume that Alzheimer’s is genetic or age-related, not realizing that diet and physical activity can improve brain health, Hill says.
African Americans are also severely underrepresented in clinical trials for Alzheimer’s, so researchers miss the opportunity to learn more about health disparities. “It’s a bioethical issue,” Hill says. “The people most likely to have Alzheimer’s aren’t included in the trials.”
The Cure: Systemic Change
People think of lifestyle as a choice, but there are limitations, says Muniza Anum Majoka, a geriatric psychiatrist and assistant professor of psychiatry at Yale University, who published an overview of SDOH factors that impact dementia. “For a lot of people, those choices [to improve brain health] are not available,” she says. If you don’t live in a safe neighborhood, for example, walking for exercise is not an option.
Hill wants to see the focus of prevention shift from individual behavior change to ensuring everyone has access to the same resources. Advice about healthy eating only goes so far if someone lives in a food desert. Systemic change also means increasing the number of minority physicians and recruiting minorities in clinical drug trials so studies will be relevant to these communities, Hill says.
Research on the impact of SDOH suggests that raising education levels has the most potential to prevent dementia. One theory is that highly educated people have a greater brain reserve that enables them to tolerate pathological changes in the brain, thus delaying dementia, says Majoka. Being curious, learning new things and problem-solving also contribute to brain health, she adds. Plus, having more education may be associated with higher socioeconomic status, more access to accurate information and healthier lifestyle choices.
New Strategies
The chasm between what researchers know about brain health and how that knowledge is being applied is huge. “There’s an explosion of interest in this area. We’re just in the first steps,” says Powell. He predicts that one day physicians will manage Alzheimer’s through precision medicine customized to each patient’s specific risk factors and needs.
Raina Croff, assistant professor of neurology at Oregon Health & Science University School of Medicine, created the SHARP (Sharing History through Active Reminiscence and Photo-imagery) walking program to forestall memory loss in African Americans with mild cognitive impairment or early dementia.
Participants and their caregivers walk in historically Black neighborhoods three times a week over six months. A smart tablet provides information about “Memory Markers” they pass, such as the route of a civil rights march. People celebrate their community and culture while “brain health is running in the background,” Croff says.
Photos and memory prompts engage participants in the SHARP program. (Photo: OHSU/Kristyna Wentz-Graff)
The project began in 2015 as a pilot study in Croff’s hometown of Portland, Ore., expanded to Seattle, and will soon start in Oakland, Calif. “Walking is good for slowing [brain] decline,” she says. A post-study assessment of 40 participants in 2017 showed that half had higher cognitive scores after the program; 78 percent had lower blood pressure; and 44 percent lost weight. Those with mild cognitive impairment showed the most gains. The walkers also reported improved mood and energy along with increased involvement in other activities.
It’s never too late to reap the benefits of working your brain and being socially engaged, Majoka says.
In Milwaukee, the Wisconsin Alzheimer’s Institute launched The Amazing Grace Chorus® to stave off cognitive decline in seniors. People in the early stages of Alzheimer’s practice and perform six concerts each year. The activity provides opportunities for social engagement, mental stimulation, and a support network. Among the benefits, 55 percent of participants reported better communication at home, and nearly half said they became involved in more activities after joining the chorus.
Private companies are offering intervention services to healthcare providers and insurers to manage SDOH, too. One such service, MyHello, makes calls to at-risk people to assess their needs—be it food, transportation or simply a friendly voice. Having a social support network is critical for seniors, says Majoka, noting there was a steep decline in cognitive function among isolated elders during Covid lockdowns.
About 1 in 9 Americans aged 65 or older lives with Alzheimer’s today. With a surge in people with the disease predicted, public health professionals have to think more broadly about resource targets and effective intervention points, Powell says.
Beyond breakthrough pills, that is. Like Dorothy discovering that what she sought was in her own backyard all along, we are beginning to learn that preventing Alzheimer’s is within our reach, if only we recognize it.
Today’s Focus on STEM Education Is Missing A Crucial Point
I once saw a fascinating TED talk on 3D printing. As I watched the presenter discuss the custom fabrication, not of plastic gears or figurines, but of living, implantable kidneys, I thought I was finally living in the world of Star Trek, and I experienced a flush of that eager, expectant enthusiasm I felt as a child looking toward the future. I looked at my current career and felt a rejuvenation of my commitment to teach young people the power of science.
Whether we are teachers or not, those of us who admire technology and innovation, and who wish to support progress, usually embrace the importance of educating the next generation of scientists and inventors. Growing a healthy technological civilization takes a lot of work, skill, and wisdom, and its continued health depends on future generations of competent thinkers. Thus, we may find it encouraging that there is currently an abundance of interest in STEM--the common acronym for the study of science, technology, engineering, and math.
But education is as challenging an endeavor as science itself. Educating youth--if we want to do it right--requires as much thought, work, and expertise as discovering a cure or pioneering regenerative medicine. Before we give our money, time, or support to any particular school or policy, let's give some thought to the details of the educational process.
A Well-Balanced Diet
For one thing, STEM education cannot stand in isolation. The well-rounded education of human beings needs to include lessons learned both from a study of the physical world, and from a study of humanity. This is especially true for the basic education of children, but it is true even for college students. And even for those in science and engineering, there are important lessons to be learned from the study of history, literature, and art.
Scientists have their own emotions and values, and also need financial support. The fruits of their labor ultimately benefit other people. How are we all to function together in our division-of-labor society, without some knowledge of the way societies work? How are we to fully thrive and enjoy life, without some understanding of ourselves, our motives, our moral values, and our relationships to others? STEM education needs the humanities as a partner. That flourishing civilization we dream of requires both technical competence and informed life-choices.
Think for Yourself (Even in Science)
Perhaps even more important than what is taught is how things are taught. We want our children to learn the skill of thinking independently, but even in the sciences, we often completely fail to demonstrate how. Instead of teaching science as a thinking process, we indoctrinate, using the grand discoveries of the great scientists as our sacred texts. But consider the words of Isaac Newton himself, regarding rote learning:
A Vulgar Mechanick can practice what he has been taught or seen done, but if he is in an error he knows not how to find it out and correct it, and if you put him out of his road he is at a stand. Whereas he that is able to reason nimbly and judiciously about figure, force, and motion, is never at rest till he gets over every rub.
If our goal is to help students "reason nimbly" about the world around them, as the great scientists themselves did, are we succeeding? When we "teach" middle school students about DNA or cellular respiration by presenting cartoon pictures as our only supporting evidence, are we showing them a process of discovery based on evidence and hard work? Or are we just training them to memorize and repeat what the authorities say?
A useful education needs to give students the skill of following a line of reasoning, of asking rational questions, and of chewing things through in their minds--even if we regard the material as beyond question. Besides feeding students a well-balanced diet of knowledge, healthy schooling needs to teach them to digest this information thoroughly.
Thinking Training
Now step back for a moment and think about the purpose of education. What's the point of all this formal schooling in the first place? Is it, as many of the proponents of STEM education might argue, to train students for a "good" career? That view may have some validity for young adults, who are beginning to choose electives in favored subjects and to set a direction for their careers.
But for the basic education of children, this way of thinking is presumptuous and disastrous. I would argue that the central purpose of a basic education is not to teach children how to perform this or that particular skill, but simply to teach them to think clearly. We should not be aiming to provide job training, but thinking training. We should be helping children learn how to "reason nimbly" about the world around them, and breathing life into their thinking processes, by which they will grapple with the events and circumstances of their lives.
So as we admire innovation, dream of a wonderful future, and attempt to nurture the next generation of scientists and engineers, instead of obsessing over STEM education, let us focus on rational education. Let's worry about showing children how to think--about all the important things in life. Let's give them the basic facts of human existence--physical and humanistic--and show them how to fluently and logically understand them.
Some students will become the next generation of creators, and some will follow other careers, but together -- if they are educated properly -- they will continue to grow their inheritance, and to keep our civilization healthy and flourishing, in body and in mind.
Do New Tools Need New Ethics?
Scarcely a week goes by without the announcement of another breakthrough owing to advancing biotechnology. Recent examples include the use of gene editing tools to successfully alter human embryos and the cloning of monkeys; new immunotherapy-based treatments offering longer lives or even potential cures for previously deadly cancers; and the creation of genetically altered mosquitoes that use "gene drives" to quickly spread changes through a population in an ecosystem, altering its capacity to carry disease.
Each of these examples puts pressure on current policy guidelines and approaches, some in place since the late 1970s, which were created to help guide the introduction of controversial new life sciences technologies. But do the policies that made sense decades ago continue to make sense today, or do the tools created during different eras in science demand new ethics guidelines and policies?
Advances in biotechnology aren't new, of course; they have been the hallmark of science since the creation of the modern U.S. National Institutes of Health in the 1940s and of similar government agencies elsewhere. Funding agencies focused on health sciences research in the hope of creating breakthroughs in human health, and along the way, basic science discoveries led to new scientific tools that offered the ability to approach life, death, and disease in fundamentally new ways.
For example, take the discovery in the 1970s of restriction enzymes, the "chemical scissors" in living cells that could be controlled and used to introduce cuts at predictable locations in a strand of DNA. This led to tools that, for the first time, allowed genetic modification of any organism with DNA. Bacteria, plants, animals, and even humans could in theory have harmful mutations repaired, but changes could also be made to alter or even add genetic traits, with potentially ominous implications.
The scientists involved in that early research convened a small conference to discuss not only the science, but how to responsibly control its potential uses and their implications. The meeting became known as the Asilomar Conference for the meeting center where it was held, and is often noted as the prime example of the scientific community policing itself. While the Asilomar recommendations were not sufficient from a policy standpoint, they offered a blueprint on which policies could be based and presented a model of the scientific community setting responsible controls for itself.
But the environment for conducting science has changed over the succeeding decades, and it is dramatically different today than it was in the 1970s, 80s, or even the early 2000s. The regime of oversight and regulation, in place since the mid-1970s, that has provided controls for the introduction of so-called "gene therapy" in humans is beginning to show signs of fraying. The vast majority of such research was performed in the U.S., U.K., and Europe, where policies were largely harmonized. But as the tools for manipulating humans at the molecular level advanced, they also became more reliable and more precise, as well as cheaper and easier to use—think CRISPR—and therefore more accessible to more people in many more countries, many without clear oversight or policies laying out responsible controls.
As if to make the point through news headlines, scientists in China announced in 2017 that they had attempted to perform gene editing on in vitro human embryos to repair an inherited mutation for beta thalassemia--research that would not be permitted in the U.S. and most European countries and that at the time was also banned in the U.K. Similarly, specialists from a U.S. reproductive medicine clinic announced in 2016 that they had performed a highly controversial reproductive procedure in which DNA from two women is combined (so-called "three-parent babies"), at a satellite clinic they had opened in Mexico to avoid prohibitions on the technique passed by the U.S. Congress in 2015.
In both cases, genetic changes were introduced into human embryos that, if successful, would lead to the birth of a child with genetically modified germline cells—the sperm in boys or the eggs in girls—with those genetic changes passed on to all future generations of related offspring. Those are just two very recent examples, and it doesn't require much imagination to predict the list of controversial possible applications of advancing biotechnologies: attempts at genetic augmentation or even cloning in humans, and alterations of the natural environment with genetically engineered mosquitoes or other insects in areas with endemic disease. In fact, as soon as this month, scientists in Africa may release genetically modified mosquitoes for the first time.
The technical barriers are falling at a dramatic pace, but policy hasn't kept up, both in terms of what controls make sense and how to address what is an increasingly global challenge. There is no precedent for global-scale science policy, though that is exactly what this moment seems to demand. Mechanisms for policy at global scale are limited—think UN declarations, signatory countries, and sometimes international treaties—but all are slow, cumbersome, and have limited track records of success.
But not all the news is bad. There are ongoing efforts at international discussion, such as an international summit on human genome editing convened in 2015 by the National Academies of Sciences and Medicine (U.S.), the Royal Society (U.K.), and the Chinese Academy of Sciences (China); a follow-on international consensus committee, whose report was issued in 2017; and a second international summit upcoming in Hong Kong in November of this year.
These efforts need to continue, focusing less on common regulatory policies, which will be elusive if not impossible to create and implement, and more on common ground for the principles that ought to guide country-level rules. Such principles might include those proposed by the international consensus committee: transparency, due care, responsible science adhering to professional norms, promoting the wellbeing of those affected, and transnational cooperation. Work to create a set of shared norms is ongoing and worth continued effort as the relevant stakeholders attempt to navigate what can only be called a brave new world.