Alzheimer’s prevention may be less about new drugs, more about income, zip code and education
That your risk of Alzheimer’s disease depends on your salary, what you ate as a child, or the block where you live may seem implausible. But researchers are discovering that social determinants of health (SDOH) play an outsized role in Alzheimer’s disease and related dementias, possibly more than age, and new strategies are emerging for how to address these factors.
At the 2022 Alzheimer’s Association International Conference, a series of presentations offered evidence that a string of socioeconomic factors—such as employment status, social support networks, education and home ownership—significantly affected dementia risk, even when the data were adjusted for genetic risk. What’s more, memory declined more rapidly in people who earned lower wages and more slowly in people whose parents had higher socioeconomic status.
In 2020, a first-of-its-kind study in JAMA linked Alzheimer’s incidence to “neighborhood disadvantage,” a measure based on SDOH indicators. Analyzing brain tissue from autopsies, researchers found an association between these indicators and tissue markers related to Alzheimer’s. In 2022, Ryan Powell, the lead author of that study, published further findings that neighborhood disadvantage was associated with having more neurofibrillary tangles and amyloid plaques, the main pathological features of Alzheimer’s disease.
As yet, little is known about the biological processes behind this, says Powell, director of data science at the Center for Health Disparities Research at the University of Wisconsin School of Medicine and Public Health. “We know the association but not the direct causal pathway.”
The corroborative findings keep coming. In a Nature study published a few months after Powell’s study, every social determinant investigated affected Alzheimer’s risk except for marital status. The links were highest for income, education, and occupational status.
The potential for prevention is significant. One in three older adults dies with Alzheimer's or another dementia—which together kill more people than breast and prostate cancers combined. Further, a 2020 report from the Lancet Commission determined that about 40 percent of dementia cases could theoretically be prevented or delayed by managing the risk factors that people can modify.
Take inactivity. In a 2022 JAMA study, older adults who took 9,800 steps daily were half as likely to develop dementia over the following seven years. Hearing loss, another risk factor that can be managed, accounts for about 9 percent of dementia cases.
Clinical trials on new Alzheimer’s medications get all the headlines but preventing dementia through policy and public health interventions should not be underestimated. Simply slowing the course of Alzheimer’s or delaying its onset by five years would cut the incidence in half, according to the Global Council on Brain Health.
Minorities Hit the Hardest
The World Health Organization defines SDOH as the “conditions in which people are born, grow, work, live, and age, and the wider set of forces and systems shaping the conditions of daily life.”
Anyone who lives on processed food, smokes cigarettes, or skimps on sleep has a heightened risk of dementia. But minority groups are hit harder. Older Black Americans are twice as likely to have Alzheimer’s or another form of dementia as white Americans; older Hispanics are about one and a half times more likely.
This is due in part to higher rates of diabetes, obesity, and high blood pressure within these communities. These diseases are linked to Alzheimer’s, and SDOH factors multiply the risks. Black and Hispanic Americans earn less on average than white Americans, which means they are more likely to live in neighborhoods with limited access to healthy food, medical care, and good schools, and to suffer greater exposure to noise (which impairs hearing) and air pollution—additional risk factors for dementia.
Plus, when Black people are diagnosed with dementia, their cognitive impairment and neuropsychiatric symptoms are more advanced than in white patients. Why? Some African Americans delay seeing a doctor because of perceived discrimination and a sense that they will not be heard, says Carl V. Hill, chief diversity, equity, and inclusion officer at the Alzheimer’s Association.
Misinformation about dementia is another issue in Black communities. Many assume that Alzheimer’s is purely genetic or a normal part of aging, not realizing that diet and physical activity can improve brain health, Hill says.
African Americans are severely underrepresented in clinical trials for Alzheimer’s, too. So, researchers miss the opportunity to learn more about health disparities. “It’s a bioethical issue,” Hill says. “The people most likely to have Alzheimer’s aren’t included in the trials.”
The Cure: Systemic Change
People think of lifestyle as a choice but there are limitations, says Muniza Anum Majoka, a geriatric psychiatrist and assistant professor of psychiatry at Yale University, who published an overview of SDOH factors that impact dementia. “For a lot of people, those choices [to improve brain health] are not available,” she says. If you don’t live in a safe neighborhood, for example, walking for exercise is not an option.
Hill wants to see the focus of prevention shift from individual behavior change to ensuring everyone has access to the same resources. Advice about healthy eating only goes so far if someone lives in a food desert. Systemic change also means increasing the number of minority physicians and recruiting minorities in clinical drug trials so studies will be relevant to these communities, Hill says.
Based on SDOH impact research, raising education levels has the most potential to prevent dementia. One theory is that highly educated people have a greater brain reserve that enables them to tolerate pathological changes in the brain, thus delaying dementia, says Majoka. Being curious, learning new things and problem-solving also contribute to brain health, she adds. Plus, having more education may be associated with higher socioeconomic status, more access to accurate information and healthier lifestyle choices.
New Strategies
The chasm between what researchers know about brain health and how that knowledge is being applied is huge. “There’s an explosion of interest in this area. We’re just in the first steps,” says Powell. One day, he predicts, physicians will manage Alzheimer’s through precision medicine customized to each patient’s specific risk factors and needs.
Raina Croff, assistant professor of neurology at Oregon Health & Science University School of Medicine, created the SHARP (Sharing History through Active Reminiscence and Photo-imagery) walking program to forestall memory loss in African Americans with mild cognitive impairment or early dementia.
Participants and their caregivers walk in historically Black neighborhoods three times a week over six months. A smart tablet provides information about “Memory Markers” they pass, such as the route of a civil rights march. People celebrate their community and culture while “brain health is running in the background,” Croff says.
[Photo: Photos and memory prompts engage participants in the SHARP program. Credit: OHSU/Kristyna Wentz-Graff]
The project began in 2015 as a pilot study in Croff’s hometown of Portland, Ore., expanded to Seattle, and will soon start in Oakland, Calif. “Walking is good for slowing [brain] decline,” she says. A post-study assessment of 40 participants in 2017 showed that half had higher cognitive scores after the program; 78 percent had lower blood pressure; and 44 percent lost weight. Those with mild cognitive impairment showed the most gains. The walkers also reported improved mood and energy along with increased involvement in other activities.
It’s never too late to reap the benefits of working your brain and being socially engaged, Majoka says.
In Milwaukee, the Wisconsin Alzheimer’s Institute launched The Amazing Grace Chorus® to stave off cognitive decline in seniors. People in the early stages of Alzheimer’s practice and perform six concerts each year. The activity provides opportunities for social engagement, mental stimulation, and a support network. Among the benefits, 55 percent of participants reported better communication at home, and nearly half said they got involved in more activities after joining the chorus.
Private companies are offering intervention services to healthcare providers and insurers to manage SDOH, too. One such service, MyHello, makes calls to at-risk people to assess their needs—be it food, transportation or simply a friendly voice. Having a social support network is critical for seniors, says Majoka, noting there was a steep decline in cognitive function among isolated elders during Covid lockdowns.
About 1 in 9 Americans age 65 or older lives with Alzheimer’s today. With a surge in people with the disease predicted, public health professionals have to think more broadly about resource targets and effective intervention points, Powell says.
Beyond breakthrough pills, that is. Like Dorothy in Kansas discovering that happiness was always in her own backyard, we are beginning to learn that preventing Alzheimer’s is within our reach, if only we recognize it.
The Internet has made it easier than ever to mislead people. The anti-vaxx movement, climate change denial, protests against stem cell research, and other movements like these are rooted in the spread of misinformation and a distrust of science.
"I had been taught intelligent design and young-earth creationism instead of evolution, geology, and biology."
Science illiteracy is pervasive in the communities responsible for these movements. For the mainstream, the challenge lies not in sharing the facts, but in combating the spread of misinformation and facilitating an open dialogue between experts and nonexperts.
I grew up in a household that was deeply skeptical of science and medicine. My parents are evangelical Christians who believe the word of the Bible is law. To protect my four siblings and me from secular influence, they homeschooled some of us and put the others in private Christian schools. When my oldest brother left for a Christian college and the tuition began to add up, I was placed in a public charter school to offset the costs.
There, I became acutely aware of my ignorant upbringing. I had been taught intelligent design and young-earth creationism instead of evolution, geology, and biology. My mother skipped over world religions, and much of my history curriculum was more Bible-based than factual. She warned me that stem cell research, vaccines, genetic modification of crops, and other areas of biological research were examples of humans trying to be like God. At the time, biologist Richard Dawkins' The God Delusion was a bestseller and science seemed like an excuse not to believe in God, so she and my father discouraged me from studying it.
The gaps in my knowledge left me feeling frustrated and embarrassed. The solution was to learn about the things that had been censored from my education, but several obstacles stood in the way.
"When I first learned about fundamentalism, my parents' behavior finally made sense."
I lacked a good foundation in basic mathematics after being taught by my mother, who never graduated from college. My father, who holds a graduate degree in computer science, repeatedly told me that I had inherited my mother's "bad math genes" and was therefore ill-equipped for science. While my brothers excelled at math under his supervision and were even encouraged toward careers in engineering and psychology, I was expected to do well in other subjects, such as literature. When I tried to change this by enrolling in honors math and science classes, my parents scolded me, and I reluctantly dropped math. By the time I graduated from high school, I was convinced that math and science were beyond me.
When I look back at my high school transcripts, I can see that my sense of failure was unfounded: my grades were mostly A's and B's, and I excelled in honors biology. Even my standardized test scores from elementary school don't reflect a student disinclined toward STEM; I consistently scored in the top percentile for sciences. Teachers often encouraged me to consider studying science in college. Why then, I wondered, did my parents reject that idea? Why did they work so hard to sway me from that path? It wasn't until I moved away from my parents' home and started working to put myself through community college that I discovered my passion for both biology and science writing.
As a young adult venturing into the field of science communication, I've become fascinated with understanding communities that foster antagonistic views toward science. When I first learned about fundamentalism, my parents' behavior finally made sense. Fundamentalism is the foundation of the Religious Right, a right-wing Christian movement that heavily influences the Republican Party in the United States. The Religious Right crusades against secular education, stem cell research, abortion, evolution, and other controversial issues in science and medicine on the basis that they contradict Christian beliefs. It is quietly working to overturn the separation of church and state in order to enforce its religion as policy, at the expense of science and progress.
Growing up in this community, I learned that strong feelings about these issues arise from both a lack of science literacy and a distrust of experts. Those who are against genetic modification of crops don't understand that GMO research aims to produce more, and longer-lasting, food for a growing planet. The anti-vaxx movement is still relying on a deeply flawed study that was ultimately retracted. Those who are against stem cell research don't understand how it works or the important benefits it provides the field of medicine, such as discovering new treatment methods.
In fact, at one point the famous Christian radio show Focus on the Family spread anti-vaxx sentiment when it discussed vaccines that, long ago, were derived from aborted fetal cells. Although Focus on the Family now endorses vaccines, at the time the discussion was enough to convince my own mother, who listened to the show every morning, not to vaccinate us unless the law required it.
"In everyday interactions with skeptics, science communicators need to shift their focus from convincing to discussing."
We can help clear up misunderstandings by sharing the facts, but the real challenge lies in willful ignorance. It was hard for me to accept, but I've come to understand that I'm not going to change anyone's mind. It's up to an individual to evaluate the facts, consider the arguments for and against, and make his or her own decision.
As my parents grew older and my siblings and I introduced them to basic concepts in science, they came around to trusting the experts a little more. They now see real doctors instead of homeopathic practitioners. They acknowledge our world's changing climate instead of denying it. And they even applaud two of their children for pursuing careers in science. Although they have held on to their fundamentalism and we still disagree on many issues, these basic changes give me hope that people in deeply skeptical communities are not entirely out of reach.
In everyday interactions with skeptics, science communicators need to shift their focus from convincing to discussing. This means creating an open dialogue with the intention of being understanding and helpful, not persuasive. This approach can be beneficial in both personal and online interactions. There are people within these movements who have doubts, and those doubts will grow as we continue to nurture them through discussion.
People will only change their minds when it is the right time for them to do so. We need to be there ready to hold their hand and lead them toward truth when they reach out. Until then, all we can do is keep the channels of communication open, keep sharing the facts, and fight the spread of misinformation. Science is the pursuit of truth, and as scientists and science communicators, sometimes we need to let the truth speak for itself. We're just there to hold the megaphone.
Like any life-threatening medical condition that affects children, food allergies can traumatize more than just the patient. My wife and I learned this one summer afternoon when our daughter was three years old.
At an ice cream parlor, I gave Samantha a lick of my pistachio cone; within seconds, red blotches erupted on her skin, her lips began to swell, and she complained that her throat felt funny. We rushed her to the nearest emergency room, where a doctor injected her with epinephrine. Explaining that the reaction, known as anaphylaxis, could have been fatal if left unchecked, he advised us to have her tested for nut allergies—and to start carrying an injector of our own.
After an allergist confirmed Sam's vulnerability to tree nuts and peanuts, we figured that keeping her safe would be relatively simple. But food allergies often come in bunches. Over the next year, she wound up back in the ER after eating bread with sesame seeds at an Italian restaurant, and again after slurping buckwheat noodles at our neighborhood Japanese restaurant. She hated eggs, so we discovered that (less severe) allergy only when she vomited after eating a variety of products containing them.
In recent years, a growing number of families have had to grapple with such challenges. An estimated 32 million Americans have food allergies, or nearly 10 percent of the population—10 times the prevalence reported 35 years ago. The severity of symptoms seems to be increasing, too. According to a study released in January by Food Allergy Research & Education (FARE), a Virginia-based nonprofit, insurance claims for anaphylactic food reactions rose 377 percent in the U.S. from 2007 to 2016.
Because food allergies most commonly emerge in childhood, these trends are largely driven by the young. An insurance-industry study found that emergency room visits for anaphylaxis in children more than doubled from 2010 to 2016. Peanut allergies, once rare, tripled in kids between 1997 and 2008. "The first year, it was 1 in 250," says Scott Sicherer, chief of pediatric allergy and immunology at New York City's Mount Sinai Hospital, who led that study. "When we did the next round of research, in 2002, it was 1 in 125. I thought there must be a mistake. But by 2008, it was 1 in 70."
The forces behind these dire statistics—as well as similar numbers throughout the developed world—have yet to be positively identified. But the leading suspects are elements of our modern lifestyle that can throw the immune system out of whack, prompting potentially deadly overreactions to harmless proteins. Although parents can take a few steps that might lessen their children's risk, societal changes may be needed to brighten the larger epidemiological picture.
Meanwhile, scientists are racing to develop therapies that can induce patients' hyped-up immune defenses to chill. And lately, they've made some big strides toward that goal.
A Variety of Culprits
In the United States, about 90 percent of allergic reactions come from eight foods: milk, eggs, peanuts, tree nuts, soy, wheat, fish, and shellfish. The list varies from country to country, depending on dietary customs, but what the trigger foods all have in common is proteins that can survive breakdown in the stomach and enter the bloodstream more or less intact.
"When we were kids, we played in the dirt. Today, children tend to be on their screens, inside sealed buildings."
A food allergy results from a chain of biochemical misunderstandings. The first time the immune system encounters an allergen (as a protein that triggers an allergy is known), it mistakes the substance for a hostile invader—perhaps a parasite with a similar molecular profile. In response, it produces an antibody called immunoglobulin E (IgE), which is designed to bind to a specific protein and flag it for attack. These antibodies circulate through the bloodstream and attach to immune-system foot soldiers known as mast cells and basophils, which congregate in the nose, throat, lungs, skin, and gastrointestinal tract.
The next time the person is exposed to the allergen, the IgE antibodies signal the warrior cells to blast the intruder with histamines and other chemical weapons. Tissues in the affected areas swell and leak fluid; blood pressure may fall. Depending on the strength of the reaction, collateral damage to the patient can range from unpleasant—itching, runny nose, nausea—to catastrophic.
This kind of immunological glitchiness runs in families. Genome-wide association studies have identified a dozen genes linked to allergies of all types, and twin studies suggest that about 80 percent of the risk of food allergies is heritable. But why one family member shows symptoms while another doesn't remains unknown. Nor can genetics explain why food allergy rates have skyrocketed in such a brief period. For that, we must turn to the environment.
First, it's important to note that rates of all allergies are rising—including skin and respiratory afflictions—though none as rapidly or with as much risk of anaphylaxis as those involving food. The takeoff was already underway in the late 1980s, when British epidemiologist David P. Strachan found that children in larger households had fewer instances of hay fever. The reason, he suggested, was that their immune systems were strengthened by exposure to their siblings' germs. Since then, other researchers have discerned more evidence for Strachan's "hygiene hypothesis": higher rates of allergy (as well as autoimmune disorders) in cities versus rural areas, in industrialized countries versus developing ones, in lab animals raised under sterile conditions versus those exposed to germs.
Fending off a variety of pathogens, experts theorize, helps train the immune system to better distinguish friend from foe, and to respond to threats in a more nuanced manner. In an era of increasing urbanization, shrinking family sizes, and more sheltered lifestyles, such conditioning may be harder to come by. "When we were kids, we played in the dirt," observes Cathryn R. Nagler, a professor and food allergy researcher at the University of Chicago. "Today, children tend to be on their screens, inside sealed buildings."
But other factors may be driving the allergy epidemic as well. More time indoors, for example, means less exposure to sunlight, which can lead to a deficiency in vitamin D—a nutrient crucial to immune system regulation. The growing popularity of processed foods filled with refined fats and sugars may play a role, along with rising rates of obesity, by promoting tissue inflammation that could increase some people's risk of immunological mayhem. And the surge in allergies also correlates with several trends that may be altering the human microbiome, the community of microbes (including bacteria, viruses, and fungi, among others) that inhabits our guts, skin, and bodily orifices.
The microbiome connection may be particularly relevant to food allergies. In 2014, a team led by Nagler published a landmark study showing that Clostridia, a common class of gut bacteria, protects against these allergies. When the researchers fed peanut allergens to germ-free mice (born and raised in sterile conditions) and to mice treated with antibiotics as newborns (reducing their gut bacteria), the animals showed a strong immunological response. This sensitization could be reversed, however, by reintroducing Clostridia—but not another class of bacteria, Bacteroides—into the mice. Further experiments revealed that Clostridia caused immune cells to produce high levels of interleukin-22 (IL-22), a signaling molecule known to decrease the permeability of the intestinal lining.
"In simple terms," Nagler says, "what we found is that these bacteria prevent food allergens from gaining access to the blood in an intact form that elicits an allergic reaction."
A growing body of evidence suggests that our eating habits are throwing our gut microbiota off-balance, in part by depriving helpful species of the dietary fiber they feed on. Our increasing exposure to antibiotics and antimicrobial compounds may be harming our beneficial bugs as well. These depletions could affect kids from the moment they enter the world: Because babies are seeded with their mothers' microbiota as they pass through the birth canal, they may be inheriting a less diverse microbiome than did previous generations. And the rising rate of cesarean deliveries may be further depriving our children of the bugs they need.
So which culprit is most responsible for the food allergy upsurge? "The illnesses that we're measuring are complex," says Sicherer. "There are multiple genetic inputs, which interact with one another, and there are multiple environmental inputs, which interact with each other and with the genes. There's not one single thing that's causing this. It's a conglomeration."
What Parents Can Do
For anyone hoping to reduce their child's or their own odds of developing a food allergy (rates of adult onset are also increasing), the current state of science offers few guideposts. As with many other areas of health research, it's hard to know when the data is solid enough to warrant a particular course of action. A case in point: the American Academy of Pediatrics once recommended that children at risk of allergy to peanuts (as evidenced by family history, other food allergies, or eczema) wait to eat them until age three; now, the AAP advises those parents to start their babies at four months, citing epidemiological evidence that early exposure may prevent peanut allergies.
And it's all too easy for a layperson to draw mistaken conclusions from media coverage of such research—inferring, for instance, that taking commercially available probiotics might have a protective effect. Unfortunately, says Nagler, none of those products even contain the relevant kind of bacteria.
Although, as a research scientist, she refrains from giving medical advice, Nagler does suggest (based on a large body of academic literature) that two measures are worth a try: increasing consumption of fiber, and reducing use of antimicrobial agents, from antibacterial cleaners to antibiotics. Yet she acknowledges that it's not always possible to avoid the suspected risk factors for food allergies. Sometimes an antibiotic is a lifesaving necessity, for example—and it's tough to avoid exposure to such drugs altogether, due to their use in animal feed and their consequent presence in many foods and in the water supply. If these chemicals are contributing to the food allergy epidemic, protecting ourselves will require action from farmers, doctors, manufacturers, and policymakers.
My family's experience illustrates the limits of healthy lifestyle choices in mitigating allergy risk. My daughter and son were born without C-sections; both were breastfed as well, receiving maximum microbial seeding from their mother. As a family, we eat exemplary diets, and no one could describe our home as excessively clean. Yet one child can't taste nuts, sesame, or buckwheat without becoming dangerously ill. "You can do everything right and still have allergies," says Ian A. Myles, a staff clinician at the National Institute of Allergy and Infectious Diseases. "You can do everything wrong and not have allergies. The two groups overlap."
The Latest Science Shows Promise
But while preventing all food allergies is clearly unrealistic, researchers are making remarkable progress in developing better treatments—therapies that, instead of combating symptoms after they've started (like epinephrine or antihistamines), aim to make patients less sensitive to allergens in the first place. One promising approach is oral immunotherapy (OIT), in which patients consume small but slowly increasing amounts of an allergen, gradually reducing their sensitivity. A study published last year in the New England Journal of Medicine showed that an experimental OIT called AR101, consisting of a standardized peanut powder mixed into food, enabled 67 percent of participants to tolerate a dose equivalent to two peanut kernels—a potential lifesaver if they were accidentally exposed to the real thing.
Because OIT itself can trigger troublesome reactions in some patients, however, it's not for everyone. Another experimental treatment, sublingual immunotherapy (SLIT), uses an allergen solution or dissolving tablet placed beneath the tongue; although its results are less robust than OIT's, it seems to generate milder side effects. Epicutaneous immunotherapy (EPIT) avoids the mouth entirely, using a technology similar to a nicotine patch to deliver allergens through the skin. Researchers are also exploring the use of medications known as biologics, aiming to speed up the action of immunotherapies by suppressing IgE or targeting other immune-system molecules.
One downside of the immunotherapy approach is that in most cases the allergen must be taken indefinitely to maintain desensitization. To provide a potentially permanent fix, scientists are working on vaccines that use DNA or peptides (protein fragments) from allergens to reset patients' immune systems.
Nagler is attacking the problem from a different angle—one that starts with the microbiome. In a recent study, a follow-up to her peanut-allergy investigation, she and her colleagues found that Clostridia bacteria protect mice against milk allergy as well; they also identified a particular species responsible, known as Anaerostipes caccae. The bugs, the team determined, produce a short-chain fatty acid called butyrate, which modulates many immune activities crucial to maintaining a well-sealed gut.
These findings suggest that drugs based on microbial metabolites could help protect vulnerable individuals against a wide range of allergies. Nagler has launched a company, ClostraBio, to develop biotherapeutics based on this notion; she expects its first product, using synthetic butyrate, to be ready for clinical trials within the next two years.
My daughter could well be a candidate for such a medication. Sam, now 15, is a vibrant, resilient kid who handles her allergies with confidence and humor. Thanks to vigilance and luck (on her part as well as her parents'), she hasn't had another food-related ER visit in more than a decade; she's never had to use her EpiPen. Still, she says, she would welcome the arrival of a pill that could reduce the danger. "I've learned how to watch out for myself," she says. "But it would be nice not to have to be so careful."