Alzheimer’s prevention may be less about new drugs, more about income, zip code and education
That your risk of Alzheimer’s disease depends on your salary, what you ate as a child, or the block where you live may seem implausible. But researchers are discovering that social determinants of health (SDOH) play an outsized role in Alzheimer’s disease and related dementias, possibly a greater one than age, and new strategies are emerging to address these factors.
At the 2022 Alzheimer’s Association International Conference, a series of presentations offered evidence that a string of socioeconomic factors—such as employment status, social support networks, education and home ownership—significantly affected dementia risk, even when the data were adjusted for genetic risk. What’s more, memory declined more rapidly in people who earned lower wages and more slowly in people whose parents had higher socioeconomic status.
In 2020, a first-of-its-kind study in JAMA linked Alzheimer’s incidence to “neighborhood disadvantage,” a measure based on SDOH indicators. Analyzing brain tissue from autopsies, researchers found an association between these indicators and tissue markers related to Alzheimer’s. In 2022, Ryan Powell, the lead author of that study, published further findings that neighborhood disadvantage was associated with more neurofibrillary tangles and amyloid plaques, the main pathological features of Alzheimer’s disease.
So far, little is known about the biological processes behind this, says Powell, director of data science at the Center for Health Disparities Research at the University of Wisconsin School of Medicine and Public Health. “We know the association but not the direct causal pathway.”
The corroborating findings keep coming. In a Nature study published a few months after Powell’s, every social determinant investigated affected Alzheimer’s risk except marital status. The links were strongest for income, education, and occupational status.
The potential for prevention is significant. One in three older adults dies with Alzheimer’s or another dementia; together, these diseases kill more older adults than breast and prostate cancers combined. Further, a 2020 report from the Lancet Commission estimated that about 40 percent of dementia cases could theoretically be prevented or delayed by managing the risk factors that people can modify.
Take inactivity: in a 2022 JAMA study, older adults who took 9,800 steps daily were half as likely to develop dementia over the following seven years. Hearing loss, another risk factor that can be managed, accounts for about 9 percent of dementia cases.
Clinical trials on new Alzheimer’s medications get all the headlines, but preventing dementia through policy and public health interventions should not be underestimated. Simply slowing the course of Alzheimer’s or delaying its onset by five years would cut its incidence in half, according to the Global Council on Brain Health.
Minorities Hit the Hardest
The World Health Organization defines SDOH as the “conditions in which people are born, grow, work, live, and age, and the wider set of forces and systems shaping the conditions of daily life.”
Anyone who subsists on processed food, smokes cigarettes, or skimps on sleep has a heightened risk of dementia. But minority groups are hit harder. Older Black Americans are twice as likely to have Alzheimer’s or another form of dementia as white Americans; older Hispanics are about one and a half times more likely.
This is due in part to higher rates of diabetes, obesity, and high blood pressure within these communities. These diseases are linked to Alzheimer’s, and SDOH factors multiply the risks. Black and Hispanic people earn less on average than white people, which means they are more likely to live in neighborhoods with limited access to healthy food, medical care, and good schools, and to suffer greater exposure to noise (which impairs hearing) and air pollution—additional risk factors for dementia.
Plus, when Black people are diagnosed with dementia, their cognitive impairment and neuropsychiatric symptoms are more advanced than those of white patients. Why? Some African Americans delay seeing a doctor because of perceived discrimination and a sense they will not be heard, says Carl V. Hill, chief diversity, equity, and inclusion officer at the Alzheimer’s Association.
Misinformation about dementia is another issue in Black communities. Many assume Alzheimer’s is purely genetic or an inevitable part of aging, not realizing that diet and physical activity can improve brain health, Hill says.
African Americans are severely underrepresented in clinical trials for Alzheimer’s, too, so researchers miss the opportunity to learn more about health disparities. “It’s a bioethical issue,” Hill says. “The people most likely to have Alzheimer’s aren’t included in the trials.”
The Cure: Systemic Change
People think of lifestyle as a choice but there are limitations, says Muniza Anum Majoka, a geriatric psychiatrist and assistant professor of psychiatry at Yale University, who published an overview of SDOH factors that impact dementia. “For a lot of people, those choices [to improve brain health] are not available,” she says. If you don’t live in a safe neighborhood, for example, walking for exercise is not an option.
Hill wants to see the focus of prevention shift from individual behavior change to ensuring everyone has access to the same resources. Advice about healthy eating only goes so far if someone lives in a food desert. Systemic change also means increasing the number of minority physicians and recruiting minorities in clinical drug trials so studies will be relevant to these communities, Hill says.
Based on SDOH impact research, raising education levels has the most potential to prevent dementia. One theory is that highly educated people have a greater brain reserve that enables them to tolerate pathological changes in the brain, thus delaying dementia, says Majoka. Being curious, learning new things and problem-solving also contribute to brain health, she adds. Plus, having more education may be associated with higher socioeconomic status, more access to accurate information and healthier lifestyle choices.
New Strategies
The chasm between what researchers know about brain health and how that knowledge is being applied is huge. “There’s an explosion of interest in this area. We’re just in the first steps,” says Powell. He predicts that one day physicians will manage Alzheimer’s through precision medicine customized to the patient’s specific risk factors and needs.
Raina Croff, assistant professor of neurology at Oregon Health & Science University School of Medicine, created the SHARP (Sharing History through Active Reminiscence and Photo-imagery) walking program to forestall memory loss in African Americans with mild cognitive impairment or early dementia.
Participants and their caregivers walk in historically Black neighborhoods three times a week over six months. A smart tablet provides information about “Memory Markers” they pass, such as the route of a civil rights march. People celebrate their community and culture while “brain health is running in the background,” Croff says.
Photos and memory prompts engage participants in the SHARP program. (Photo: OHSU/Kristyna Wentz-Graff)
The project began in 2015 as a pilot study in Croff’s hometown of Portland, Ore., expanded to Seattle, and will soon start in Oakland, Calif. “Walking is good for slowing [brain] decline,” she says. A post-study assessment of 40 participants in 2017 showed that half had higher cognitive scores after the program; 78 percent had lower blood pressure; and 44 percent lost weight. Those with mild cognitive impairment showed the most gains. The walkers also reported improved mood and energy along with increased involvement in other activities.
It’s never too late to reap the benefits of working your brain and being socially engaged, Majoka says.
In Milwaukee, the Wisconsin Alzheimer’s Institute launched The Amazing Grace Chorus® to stave off cognitive decline in seniors. People in the early stages of Alzheimer’s practice and perform six concerts each year. The activity provides opportunities for social engagement, mental stimulation, and a support network. Among the benefits: 55 percent of participants reported better communication at home, and nearly half said they became involved in more activities after joining the chorus.
Private companies are offering intervention services to healthcare providers and insurers to manage SDOH, too. One such service, MyHello, makes calls to at-risk people to assess their needs—be it food, transportation or simply a friendly voice. Having a social support network is critical for seniors, says Majoka, noting there was a steep decline in cognitive function among isolated elders during Covid lockdowns.
About 1 in 9 Americans age 65 or older lives with Alzheimer’s today. With a surge in cases predicted, public health professionals have to think more broadly about where to target resources and where to intervene, Powell says.
Beyond breakthrough pills, that is. Like Dorothy discovering that happiness was in her own backyard all along, we are beginning to learn that preventing Alzheimer’s is within our reach, if only we recognize it.
There's no shortage of fake news going around the internet these days, but how do we, as consumers, become more aware of what's real and what's not?
"We are hoping to create what you might call a general 'vaccine' against fake news, rather than trying to counter each specific conspiracy or falsehood."
Researchers at the University of Cambridge may have answered just that by developing an online game designed to expose participants to, and educate them about, the tactics used by those spreading false information.
"We wanted to see if we could preemptively debunk, or 'pre-bunk', fake news by exposing people to a weak dose of the methods used to create and spread disinformation, so they have a better understanding of how they might be deceived," Dr Sander van der Linden, Director of the Cambridge Social Decision-Making Lab, said in a statement.
"This is a version of what psychologists call 'inoculation theory', with our game working like a psychological vaccination."
In February 2018, van der Linden and his coauthor, Jon Roozenbeek, helped launch the browser game "Bad News," in which players take on the role of "Disinformation and Fake News Tycoon."
They can manipulate news and social media within the game by several different methods, including deploying Twitter bots, photoshopping evidence, creating fake accounts, and inciting conspiracy theories, with the goal of attracting followers and maintaining a "credibility score" for persuasiveness.
To gauge the game's effectiveness, players were asked to rate the reliability of a number of real and fake news headlines and tweets both before and after playing. Data from 15,000 players were evaluated, with the results published June 25 in the journal Palgrave Communications.
The study concluded that "the perceived reliability of fake news before playing the game had reduced by an average of 21% after completing it. Yet the game made no difference to how users ranked real news."
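To make that average concrete, here is a minimal sketch, in Python, of the kind of before-and-after comparison the study describes. The ratings below are hypothetical numbers invented for illustration, not the paper's data, and the published analysis is more involved.

    # Hypothetical mean reliability ratings for a set of fake headlines,
    # as judged by players before and after a session (1 = unreliable, 7 = reliable).
    fake_pre = [4.2, 3.8, 4.5, 4.0]    # before playing
    fake_post = [3.3, 3.0, 3.6, 3.1]   # after playing

    def mean(values):
        return sum(values) / len(values)

    # Relative drop in perceived reliability of fake news after playing.
    drop = (mean(fake_pre) - mean(fake_post)) / mean(fake_pre)
    print(f"Perceived reliability of fake news fell by {drop:.0%}")  # prints 21%

At much larger scale, this style of comparison is what produced the study's 21 percent average drop for fake news, alongside no change for real news.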
Additionally, participants who "registered as most susceptible to fake news headlines at the outset benefited most from the 'inoculation,'" according to the study.
Just 15 minutes of playing the game can have a moderate effect on people, which could play a major role on a larger scale when it comes to "building a societal resistance to fake news," according to Dr. van der Linden.
"Research suggests that fake news spreads faster and deeper than the truth, so combating disinformation after-the-fact can be like fighting a losing battle," he said.
"We are hoping to create what you might call a general 'vaccine' against fake news, rather than trying to counter each specific conspiracy or falsehood," Roozenbeek added.
Van der Linden and Roozenbeek's work is an early example of how people might be protected against deception by training them to be more attuned to the techniques used to spread fake news.
"I hope that the positive results give further credence to the new science of prebunking rather than only thinking about traditional debunking. On a larger level, I also hope the game and results inspire a new kind of behavioral science research where we actively engage with people and apply insights from psychological science in the public interest," van der Linden told leapsmag.
"I like the idea that the end result of a scientific theory is a real-world partnership and practical tool that organizations and people can use to guard themselves against online manipulation techniques in a novel and hopefully fun and engaging manner."
Ready to be "inoculated" against fake news? Then play the game for yourself.
What if people could just survive on sunlight like plants?
The admittedly outlandish question occurred to me after reading about how climate change will exacerbate drought, flooding, and worldwide food shortages. Many of these problems could be eliminated if human photosynthesis were possible. Had anyone ever tried it?
I emailed Sidney Pierce, professor emeritus in the Department of Integrative Biology at the University of South Florida, who studies a type of sea slug, Elysia chlorotica, that eats photosynthetic algae and incorporates the algae's key cell structure into itself. It's still a mystery exactly how a slug can operate the part of the cell that converts sunlight into energy, which needs gene-encoded proteins to keep functioning, but the upshot is that the slugs can (and do) live on sunlight between feedings.
Pierce says he gets questions about human photosynthesis a couple of times a year, but it almost certainly wouldn't be worth it to try to develop the process in a human. "A high-metabolic rate, large animal like a human could probably not survive on photosynthesis," he wrote to me in an email. "The main reason is a lack of surface area. They would either have to grow leaves or pull a trailer covered with them."
In short: Plants have already exploited the best tricks for subsisting on photosynthesis, and unless we want to look and act like plants, we won't have much success ourselves. Not that it stopped Pierce from trying to develop human photosynthesis technology anyway: "I even tried to sell it to the Navy back in the day," he told me. "Imagine photosynthetic SEALS."
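Pierce's surface-area point also survives a quick back-of-envelope check. The sketch below, in Python, assumes a resting adult burns about 100 watts, presents roughly one square meter of skin to the sun, receives a daytime average of around 250 watts per square meter, and converts sunlight at a plant-like efficiency of about 1 percent; every one of those figures is an assumption chosen for illustration.

    # Could a photosynthetic human power itself on sunlight alone?
    # All constants are assumed ballpark figures, not measurements.
    METABOLIC_DEMAND_W = 100.0   # resting adult: ~2,000 kcal/day is roughly 100 watts
    PROJECTED_AREA_M2 = 1.0      # skin facing the sun (about half of ~2 m^2 total)
    AVG_SUNLIGHT_W_M2 = 250.0    # daytime average, well below the ~1,000 W/m^2 peak
    PHOTOSYNTHETIC_EFF = 0.01    # ~1%, a typical real-world efficiency for plants

    captured_w = PROJECTED_AREA_M2 * AVG_SUNLIGHT_W_M2 * PHOTOSYNTHETIC_EFF
    print(f"Energy captured: {captured_w:.1f} W")         # ~2.5 W
    share = captured_w / METABOLIC_DEMAND_W
    print(f"Share of metabolic demand: {share:.1%}")      # ~2.5%

Even under these generous assumptions, full-time sunbathing covers only a few percent of a person's energy budget, which is why Pierce's quip about growing leaves is barely an exaggeration.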
It turns out, however, that while no one is actively trying to create photosynthetic humans, scientists are considering the ways humans might need to change to adapt to future environments, either here on the rapidly changing Earth or on another planet. Rice University biologist Scott Solomon has written an entire book, Future Humans, in which he explores the environmental pressures that are likely to influence human evolution from this point forward. On Earth, Solomon says, infectious disease will remain a major driver of change. As for Mars, the big two are lower gravity and radiation, the latter of which bombards the Martian surface constantly because the planet has no magnetosphere.
Although he considers this example "pretty out there," Solomon says one possible solution to the Martian radiation problem could leave humans not photosynthetic green, but orange, thanks to pigments called carotenoids that are responsible for the bright hues of pumpkins and carrots.
"Carotenoids protect against radiation," he says. "Usually only plants and microbes can produce carotenoids, but there's at least one kind of insect, a particular type of aphid, that somehow acquired the gene for making carotenoids from a fungus. We don't exactly know how that happened, but now they're orange... I view that as an example of, hey, maybe humans on Mars will evolve new kinds of pigmentation that will protect us from the radiation there."
We could wait for an orange human-producing genetic variation to occur naturally, or with new gene editing techniques such as CRISPR-Cas9, we could just directly give astronauts genetic advantages such as carotenoid-producing skin. This may not be as far-off as it sounds: Extreme space travel exists at an ethically unique spot that makes human experimentation much more palatable. If an astronaut already plans to subject herself to the enormous experiment of traveling to, and maybe living out her days on, a dangerous and faraway planet, do we have any obligation to provide all the protection we can?
Probably the most vocal person trying to figure out what genetic protections might help astronauts is Cornell geneticist Chris Mason. His lab has outlined a 10-phase, 500-year plan for human survival, starting with the comparatively modest goal of establishing which human genes are not amenable to change and should be marked with a "Do not disturb" sign.
To be clear, Mason is not actually modifying human beings. Instead, his lab has studied genes in radiation-resistant bacteria, such as those of the Deinococcus genus. They've expressed Dsup, a protective protein from tardigrades, the tiny water bears that can survive in space, in human cells. They've looked into p53, a gene that is overexpressed in elephants and seems to protect them from cancer. They also developed a protocol for the NASA twin study comparing astronauts Scott Kelly, who spent a year aboard the International Space Station, and his brother Mark, who did not, to find out what effects space tends to have on genes in the first place.
In a talk he gave in December, Mason reported that the expression of 8.7 percent of Scott Kelly's genes—mostly those associated with immune function, DNA repair, and bone formation—did not return to normal after the astronaut had been home for six months. "Some of these space genes, we could engineer them, activate them, have them be hyperactive when you go to space," he said in that same talk. "When we think about having the hubris to go to a faraway planet... it seems like an almost impossible idea... but I really like people and I want us to survive for a long time, and this is the first step on the stairwell to survive out of the solar system."
Others are performing studies to figure out what capabilities we might bestow on the future-proof superhuman, though none are quite as extreme as photosynthesis (although all of them are useful). At Harvard, geneticist George Church wants to engineer cells to be resistant to viruses, such as the common cold and HIV. At Columbia, synthetic biologist Harris Wang is addressing self-sufficient humans more directly, trying to spur kidney cells to produce amino acids that are normally available only from the diet.
But perhaps Future Humans author Scott Solomon has the most radical idea. I asked him a version of the classic What would be your superhero power? question: What does he see as the most important ability we could give our future selves through science?
"The empathy gene," he said. "The ability to put yourself in someone else's shoes and see the world as they see it. I think it would solve a lot of our problems."