Alzheimer’s prevention may be less about new drugs, more about income, zip code and education
That your risk of Alzheimer’s disease depends on your salary, what you ate as a child, or the block where you live may seem implausible. But researchers are discovering that social determinants of health (SDOH) play an outsized role in Alzheimer’s disease and related dementias, possibly a greater one than age, and new strategies are emerging for how to address these factors.
At the 2022 Alzheimer’s Association International Conference, a series of presentations offered evidence that a string of socioeconomic factors—such as employment status, social support networks, education and home ownership—significantly affected dementia risk, even after adjusting the data for genetic risk. What’s more, memory declined more rapidly in people who earned lower wages and more slowly in people whose parents had higher socioeconomic status.
In 2020, a first-of-its-kind study in JAMA linked Alzheimer’s incidence to “neighborhood disadvantage,” a measure based on SDOH indicators. Analyzing autopsied brain tissue, researchers found an association between these indicators and markers related to Alzheimer’s. In 2022, Ryan Powell, the lead author of that study, published further findings that neighborhood disadvantage was connected with having more neurofibrillary tangles and amyloid plaques, the main pathological features of Alzheimer’s disease.
As yet, little is known about the biological processes behind this, says Powell, director of data science at the Center for Health Disparities Research at the University of Wisconsin School of Medicine and Public Health. “We know the association but not the direct causal pathway.”
The corroborating findings keep coming. In a Nature study published a few months after Powell’s, every social determinant investigated affected Alzheimer’s risk except marital status. The links were strongest for income, education, and occupational status.
The potential for prevention is significant. One in three older adults dies with Alzheimer’s or another dementia, and these diseases kill more people than breast and prostate cancers combined. Further, a 2020 report from the Lancet Commission determined that about 40 percent of dementia cases could theoretically be prevented or delayed by managing the risk factors that people can modify.
Take inactivity. In a 2022 JAMA study, older adults who took 9,800 steps daily were half as likely to develop dementia over the next seven years. Hearing loss, another risk factor that can be managed, accounts for about 9 percent of dementia cases.
Clinical trials on new Alzheimer’s medications get all the headlines, but preventing dementia through policy and public health interventions should not be underestimated. Simply slowing the course of Alzheimer’s or delaying its onset by five years would cut its incidence in half, according to the Global Council on Brain Health.
Minorities Hit the Hardest
The World Health Organization defines SDOH as the “conditions in which people are born, grow, work, live, and age, and the wider set of forces and systems shaping the conditions of daily life.”
Anyone who lives on processed food, smokes cigarettes, or skimps on sleep has a heightened risk of dementia. But minority groups are hit harder. Older Black Americans are twice as likely as white Americans to have Alzheimer’s or another form of dementia; older Hispanics are about one and a half times as likely.
This is due in part to higher rates of diabetes, obesity, and high blood pressure within these communities. These diseases are linked to Alzheimer’s, and SDOH factors multiply the risks. Black and Hispanic Americans earn less on average than white Americans, which means they are more likely to live in neighborhoods with limited access to healthy food, medical care, and good schools, and to suffer greater exposure to noise (which impairs hearing) and air pollution—additional risk factors for dementia.
Plus, when Black people are diagnosed with dementia, their cognitive impairment and neuropsychiatric symptoms are more advanced than in white patients. Why? Some African Americans delay seeing a doctor because of perceived discrimination and a sense that they will not be heard, says Carl V. Hill, chief diversity, equity, and inclusion officer at the Alzheimer’s Association.
Misinformation about dementia is another issue in Black communities. Many assume that Alzheimer’s is genetic or age-related, not realizing that diet and physical activity can improve brain health, Hill says.
African Americans are also severely underrepresented in clinical trials for Alzheimer’s, so researchers miss the opportunity to learn more about health disparities. “It’s a bioethical issue,” Hill says. “The people most likely to have Alzheimer’s aren’t included in the trials.”
The Cure: Systemic Change
People think of lifestyle as a choice, but there are limitations, says Muniza Anum Majoka, a geriatric psychiatrist and assistant professor of psychiatry at Yale University who published an overview of SDOH factors that impact dementia. “For a lot of people, those choices [to improve brain health] are not available,” she says. If you don’t live in a safe neighborhood, for example, walking for exercise is not an option.
Hill wants to see the focus of prevention shift from individual behavior change to ensuring everyone has access to the same resources. Advice about healthy eating only goes so far if someone lives in a food desert. Systemic change also means increasing the number of minority physicians and recruiting minorities in clinical drug trials so studies will be relevant to these communities, Hill says.
Based on SDOH impact research, raising education levels has the most potential to prevent dementia. One theory is that highly educated people have a greater brain reserve that enables them to tolerate pathological changes in the brain, thus delaying dementia, says Majoka. Being curious, learning new things and problem-solving also contribute to brain health, she adds. Plus, having more education may be associated with higher socioeconomic status, more access to accurate information and healthier lifestyle choices.
New Strategies
The chasm between what researchers know about brain health and how that knowledge is being applied is huge. “There’s an explosion of interest in this area. We’re just in the first steps,” says Powell. One day, he predicts, physicians will manage Alzheimer’s through precision medicine customized to the patient’s specific risk factors and needs.
Raina Croff, assistant professor of neurology at Oregon Health & Science University School of Medicine, created the SHARP (Sharing History through Active Reminiscence and Photo-imagery) walking program to forestall memory loss in African Americans with mild cognitive impairment or early dementia.
Participants and their caregivers walk in historically Black neighborhoods three times a week over six months. A smart tablet provides information about “Memory Markers” they pass, such as the route of a civil rights march. People celebrate their community and culture while “brain health is running in the background,” Croff says.
Photos and memory prompts engage participants in the SHARP program. (OHSU/Kristyna Wentz-Graff)
The project began in 2015 as a pilot study in Croff’s hometown of Portland, Ore., expanded to Seattle, and will soon start in Oakland, Calif. “Walking is good for slowing [brain] decline,” she says. A post-study assessment of 40 participants in 2017 showed that half had higher cognitive scores after the program; 78 percent had lower blood pressure; and 44 percent lost weight. Those with mild cognitive impairment showed the most gains. The walkers also reported improved mood and energy along with increased involvement in other activities.
It’s never too late to reap the benefits of working your brain and being socially engaged, Majoka says.
In Milwaukee, the Wisconsin Alzheimer’s Institute launched The Amazing Grace Chorus® to stave off cognitive decline in seniors. People in the early stages of Alzheimer’s practice and perform six concerts each year. The activity provides opportunities for social engagement, mental stimulation, and a support network. Among the benefits, 55 percent of participants reported better communication at home, and nearly half said they got involved with more activities after joining the chorus.
Private companies are offering intervention services to healthcare providers and insurers to manage SDOH, too. One such service, MyHello, makes calls to at-risk people to assess their needs—be it food, transportation or simply a friendly voice. Having a social support network is critical for seniors, says Majoka, noting there was a steep decline in cognitive function among isolated elders during Covid lockdowns.
About 1 in 9 Americans age 65 or older lives with Alzheimer’s today. With a surge in people with the disease predicted, public health professionals have to think more broadly about resource targets and effective intervention points, Powell says.
Beyond breakthrough pills, that is. Like Dorothy discovering that happiness was in her own backyard all along, we are beginning to learn that preventing Alzheimer’s is within our reach, if only we recognize it.
Matt Trau, a professor of chemistry at the University of Queensland, stunned the science world back in December when the prestigious journal Nature Communications published his lab's discovery about a unique property of cancer DNA that could lead to a simple, cheap, and accurate test to detect any type of cancer in under 10 minutes.
Trau granted very few interviews in the wake of the news, but he recently opened up to leapsmag about the significance of this promising early research. Here is his story in his own words, as told to Editor-in-Chief Kira Peikoff.
There's been an incredible explosion of knowledge over the past 20 years, particularly since the genome was sequenced. The area of diagnostics holds a tremendous amount of promise and has caught our lab's interest. If you catch cancer early, you can improve survival rates to as high as 98 percent, sometimes even higher.
My lab is interested in devices to improve the trajectory of cancer patients. So, once people get diagnosed, can we get really sophisticated information about the molecular origins of the disease, and can we measure it in real time? And then can we match that with the best treatment and monitor it in real time, too?
I think those approaches, coupled with immunotherapy, where one dreams of monitoring the immune system simultaneously with the progress of the disease, will be the future.
But currently, the methodologies for cancer are still pretty old. So, for example, let's talk about biopsies in general. Liquid biopsy just means using a blood test or a urine test, rather than extracting a piece of solid tissue. Now consider breast cancer: the cutting-edge screening method is still mammography or physical examination for lumps. This has had a big impact in terms of early detection and awareness, but it's still primitive compared to forensically interrogating blood samples to look for traces of DNA.
Large machines like CAT, PET, and MRI scanners are very expensive, and their results depend heavily on the operator. They don't look at the root causes of the cancer. Cancer is caused by changes in DNA. These can be changes in the hard drive of the DNA (the genomic changes) or changes in the apps that the DNA is running (the epigenetics and the transcriptomics).
We don't look at that now, even though we have, emerging, all of these technologies to do it, and those technologies are getting so much cheaper. I saw some statistics at a conference just a few months ago that, in the United States, less than 1 percent of cancer patients have their DNA interrogated. That's the current state-of-the-art in the modern medical system.
Professor Matt Trau, a cancer researcher at the University of Queensland in Australia. (Courtesy)
Blood, as the highway of the body, carries all of this information. Cancer cells, if they are present in the body, are constantly getting turned over. When they die, they release their contents into the blood, and many of these traces end up in the urine and saliva. Having technologies that can forensically scan these highways looking for evidence of cancer is a little bit like looking for explosives at the airport. That's very valuable as a security tool.
The trouble is that there are thousands of different types of cancer. Going back to breast cancer, there are at least a dozen different types, probably more, and each of them changes the DNA (the hard drive of the disease) and the epigenetics (the RAM memory) in its own way. So one of the problems for diagnostics in cancer is to find something that is a signature of all cancers. That's been a really, really, really difficult problem.
Ours was a completely serendipitous discovery. What we found in the lab was this one marker that just kept coming up in all of the types of breast cancers we were studying.
No one believed it. I didn't believe it. I thought, "Gosh, okay, maybe it's a fluke, maybe it works just for breast cancer." So we went on to test it in prostate cancer, which is also many different types of diseases, and it seemed to be working in all of those. We then tested it further in lymphoma. Again, many different types of lymphoma. It worked across all of those. We tested it in gastrointestinal cancer. Again, many different types, and still, it worked, but we were skeptical.
Then we looked at cell lines, which are cells that came from previous cancer patients and that we grow in the lab as model experimental systems. We have many of those cell lines, both cancerous ones and healthy ones. It was quite remarkable that the marker worked in all of the cancer cell lines and didn't work in the healthy cell lines.
What could possibly be going on?
Well, imagine DNA as a piece of string, that's your hard drive. Epigenetics is like the beads that you put on that string. Those beads you can take on and off as you wish and they control which apps are run, meaning which genetic programs the cell runs. We hypothesized that for cancer, those beads cluster together, rather than being randomly distributed across the string.
The implications of this are profound. It means that DNA from cancer folds in water into three-dimensional structures that are very different from healthy cells' DNA. It's the proverbial needle in a haystack. When you do a liquid biopsy for early detection of cancer, the DNA from blood is overwhelmingly healthy DNA, and that's not of interest. What's of interest is to find the cancerous DNA, which is there only in trace amounts.
Once we figured out what was going on, we could easily set up a system to detect the trace cancerous DNA. It binds to gold nanoparticles in water and changes color. The test takes 10 minutes, and you can detect it by eye. Red indicates cancer and blue doesn't.
We're very, very excited about where we go from here. We're starting to test the test on a greater number of cancers, in thousands of patient samples. We're looking to the scientific community to engage with us, and we're getting a really good response from groups around the world who are supplying more samples to us so we can test this more broadly.
We also are very interested in testing how early we can go with this test. Can we detect cancer through a simple blood test even before there are any symptoms whatsoever? If so, we might be able to convert a cancer diagnosis to something almost as good as a vaccine.
Of course, we have to watch what are called false positives. We don't want to be detecting people as positives when they don't have cancer, and so the technology needs to improve there. We see this version as the iPhone 1. We're interested in the iPhone 2, 3, 4, getting better and better.
Ultimately, I see this as something that would be like a pregnancy test you could take at your doctor's office. If it came back positive, your doctor could say, "Look, there's some news here, but actually, it's not bad news, it's good news. We've caught this so early that we will be able to manage this, and this won't be a problem for you."
If this were to be in routine use in the medical system, countless lives could be saved. Cancer is now becoming one of the biggest killers in the world. We're talking millions upon millions upon millions of people who are affected. This really motivates our work. We might make a difference there.
Kira Peikoff was the editor-in-chief of Leaps.org from 2017 to 2021. As a journalist, her work has appeared in The New York Times, Newsweek, Nautilus, Popular Mechanics, The New York Academy of Sciences, and other outlets. She is also the author of four suspense novels that explore controversial issues arising from scientific innovation: Living Proof, No Time to Die, Die Again Tomorrow, and Mother Knows Best. Peikoff holds a B.A. in Journalism from New York University and an M.S. in Bioethics from Columbia University. She lives in New Jersey with her husband and two young sons. Follow her on Twitter @KiraPeikoff.
Ethan Lindenberger, the Ohio teenager who sought out vaccinations after he was denied them as a child, recently testified before Congress about why his parents became anti-vaxxers. The trouble, he believes, stems from the pervasiveness of misinformation online.
"For my mother, her love and affection and care as a parent was used to push an agenda to create a false distress," he told the Senate Committee. His mother read posts on social media saying vaccines are dangerous, and that was enough to persuade her against them.
His story is an example of how widespread and harmful the current discourse on vaccinations is—and more importantly—how traditional strategies to convince people about the merits of vaccination have largely failed.
As responsible members of society, all of us have implicitly signed on to what ethicists call the "Social Contract"—we agree to abide by certain moral and political rules of behavior. This is what our societal values, norms, and often governments are based upon. However, with the unprecedented rise of social media, alternative facts, and fake news, it is evident that our understanding—and application—of the social contract must also evolve.
Nowhere is this breakdown of societal norms more visible than in the failure to contain the spread of vaccine-preventable diseases like measles. What started off as unexplained episodes in New York City last October, mostly in under-vaccinated communities, has exploded into a national epidemic: 880 cases of measles across 24 states in 2019, according to the CDC (as of May 17, 2019). In fact, the United States is only eight months away from losing its "measles free" status, which would make it, after Venezuela, the second country in the Americas to lose that status.
The U.S. is not the only country facing this growing problem. Such constant and perilous reemergence of measles and other vaccine-preventable diseases in various parts of the world raises doubts about the efficacy of current vaccination policies. In addition to the loss of valuable life, these outbreaks lead to loss of millions of dollars in unnecessary expenditure of scarce healthcare resources. While we may be living through an age of information, we are also navigating an era whose hallmark is a massive onslaught on truth.
There is ample evidence on how these outbreaks start: low vaccination rates. At the same time, there is evidence that 'educating' people with facts about the benefits of vaccination may not be effective. Indeed, human reasoning has its limits, and facts alone rarely change a person's opinion. In a fascinating report by researchers from the University of Pennsylvania, a small experiment revealed how "behavioral nudges" could inform policy decisions around vaccination.
In the reported experiment, the vaccination rate for employees of a company increased by 1.5 percent when they were prompted to name the date when they planned to get their flu shot. In the same experiment, when employees were prompted to name both a date and a time for their planned flu shot, vaccination rate increased by 4 percent.
This experiment is a part of an emerging field of behavioral economics—a scientific undertaking that uses insights from psychology to understand human decision-making. The field was born from a humbling realization that humans probably do not possess an unlimited capacity for processing information. Work in this field could inform how we can formulate vaccination policy that is effective, conserves healthcare resources, and is applicable to current societal norms.
Take, for instance, the human papillomavirus (HPV), which can cause several types of cancer in both men and women. Research into the quality of physician communication has repeatedly revealed that lukewarm recommendations for HPV vaccination by primary care physicians likely contribute to under-immunization of eligible adolescents and can cause confusion for parents.
A randomized trial revealed the subtle power of "announcements" – direct, brief, assertive statements by physicians that assumed parents were ready to vaccinate their children. These announcements increased vaccination rates by 5.4 percent. Lengthy, open-ended dialogues demonstrated no benefit in vaccination rates. It seems that uncertainty from the physician translates to unwillingness from a parent.
Choice architecture is another compelling concept. The premise is simple: we hardly make any of our decisions in a vacuum; the environment in which a decision is made influences the outcome. If health systems were designed with these insights in mind, people would be more likely to make better choices—without being forced.
This theory, proposed by Richard Thaler, who won the 2017 Nobel Prize in Economics, was put to the test by physicians at the University of Pennsylvania. In their study, flu vaccination rates at primary care practices increased by 9.5 percent all because the staff implemented "active choice intervention" in their electronic health records—a prompt that nudged doctors and nurses to ask patients if they'd gotten the vaccine yet. This study illustrated how an intervention as simple as a reminder can save lives.
To be sure, some bioethicists do worry about implementing these policies. Are behavioral nudges akin to increased scrutiny or a burden for the disadvantaged? For example, would incentives to quit smoking unfairly target the poor, who are more likely to receive criticism for bad choices?
While this is a valid concern, behavioral economics offers one of the only ethical solutions to increasing vaccination rates by addressing the most critical—and often legal—challenge to universal vaccinations: mandates. Choice architecture and other interventions encourage and inform a choice, allowing an individual to retain his or her right to refuse unwanted treatment. This distinction is especially important, as evidence suggests that people who refuse vaccinations often do so as a result of cognitive biases – systematic errors in thinking resulting from emotional attachment or a lack of information.
For instance, people are prone to "confirmation bias," or a tendency to selectively believe in information that confirms their preexisting theories, rather than the available evidence. At the same time, people do not like mandates. In such situations, choice architecture provides a useful option: people are nudged to make the right choice via the design of health delivery systems, without needing policies that rely on force.
The measles outbreak is a sober reminder of how devastating it can be when the social contract breaks down and people fall prey to misinformation. But all is not lost. As we fight a larger societal battle against alternative facts, we now have another option in the trenches to subtly encourage people to make better choices.
Using insights from research in decision-making, we can all contribute meaningfully to controversial conversations with family, friends, neighbors, colleagues, and our representatives — and push for policies that protect those we care about. A little more than a hundred years ago, thousands of lives were routinely lost to preventable illnesses. We've come too far to let ignorance destroy us now.