AI and you: Is the promise of personalized nutrition apps worth the hype?
As a type 2 diabetic, Michael Snyder has long been interested in how blood sugar levels vary from one person to another in response to the same food, and whether a more personalized approach to nutrition could help tackle the rapidly rising rates of diabetes and obesity in much of the Western world.
Eight years ago, Snyder, who directs the Center for Genomics and Personalized Medicine at Stanford University, decided to put his theories to the test. In the 2000s, continuous glucose monitoring, or CGM, had begun to revolutionize the lives of diabetics, both type 1 and type 2. Using small sensors which sit on the upper arm or abdomen – with a tiny wire that pierces the skin – the technology allowed patients to gain real-time updates on their blood sugar levels, transmitted directly to their phones.
It gave Snyder an idea for his research at Stanford. Applying the same technology to a group of apparently healthy people, and looking for ‘spikes’ or sudden surges in blood sugar known as hyperglycemia, could provide a means of observing how their bodies reacted to an array of foods.
“We discovered that different foods spike people differently,” he says. “Some people spike to pasta, others to bread, others to bananas, and so on. It’s very personalized and our feeling was that building programs around these devices could be extremely powerful for better managing people’s glucose.”
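To make the idea concrete: a "spike" is simply a rapid excursion in a glucose time series. Below is a minimal sketch, in Python, of how such an excursion might be flagged in CGM data. The 140 mg/dL ceiling, the 30 mg/dL rise-over-baseline rule, and the one-hour baseline window are illustrative assumptions for this example, not the criteria Snyder's team actually used.

```python
# Minimal sketch of post-meal "spike" detection in a CGM trace.
# All thresholds are illustrative assumptions, not the study's criteria.
from statistics import median

def find_spikes(readings, ceiling=140, rise=30, baseline_window=12):
    """readings: glucose values in mg/dL, one every 5 minutes.
    Returns the indices where a hyperglycemic excursion begins."""
    spikes = []
    for i in range(baseline_window, len(readings)):
        # Baseline = median of the previous hour (12 five-minute readings).
        baseline = median(readings[i - baseline_window:i])
        if readings[i] > ceiling or readings[i] - baseline > rise:
            # Don't re-flag the same excursion on consecutive readings.
            if not spikes or i - spikes[-1] > baseline_window:
                spikes.append(i)
    return spikes

# Example: a flat overnight trace followed by one post-meal excursion.
trace = [95] * 12 + [100, 118, 135, 152, 160, 148, 130, 110, 100]
print(find_spikes(trace))  # -> [14], the onset of the excursion
```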
Unbeknown to Snyder at the time, thousands of miles away, a group of Israeli scientists at the Weizmann Institute of Science were doing exactly the same experiments. In 2015, they published a landmark paper which used CGM to track the blood sugar levels of 800 people over several days, showing that the biological response to identical foods can vary wildly. Like Snyder, they theorized that giving people a greater understanding of their own glucose responses, so they spend more time in the normal range, may reduce the prevalence of type 2 diabetes.
“At the moment 33 percent of the U.S. population is pre-diabetic, and 70 percent of those pre-diabetics will become diabetic,” says Snyder. “Those numbers are going up, so it’s pretty clear we need to do something about it.”
Fast forward to 2022, and both teams have converted their ideas into subscription-based dietary apps which use artificial intelligence to offer data-informed nutritional and lifestyle recommendations. Snyder's spinoff, January AI, combines CGM information with heart rate, sleep, and activity data to advise on foods to avoid and the best times to exercise. DayTwo, a start-up which builds on the findings of the Weizmann Institute of Science, obtains microbiome information by sequencing stool samples and combines this with blood glucose data to rate ‘good’ and ‘bad’ foods for a particular person.
“CGMs can be used to devise personalized diets,” says Eran Elinav, an immunology professor and microbiota researcher at the Weizmann Institute of Science, who also serves as a scientific consultant for DayTwo. “However, this process can be cumbersome. Therefore, in our lab we created an algorithm, based on data acquired from a big cohort of people, which can accurately predict post-meal glucose responses on a personal basis.”
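The Weizmann team's 2015 paper described a gradient-boosted decision-tree model trained on meal, lifestyle, and microbiome features. As a rough illustration of the shape of such a predictor, here is a toy version using scikit-learn; the four feature names and the synthetic training data are stand-ins invented for this sketch, since the real model draws on hundreds of measured inputs.

```python
# Toy sketch of a personalized glucose-response predictor, in the spirit
# of the gradient-boosted trees described in the 2015 Weizmann paper.
# Features and training data below are invented stand-ins.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(0)

# Each row: [meal carbs (g), meal fiber (g), hours of sleep,
#            microbiome summary score] for one person-meal.
X = rng.uniform([0, 0, 4, -2], [120, 15, 9, 2], size=(500, 4))

# Synthetic target: post-meal glucose rise (mg/dL). Carbs dominate,
# fiber dampens, and the microbiome score modulates the carb effect.
y = (0.5 * X[:, 0] - 1.5 * X[:, 1] + 5 * X[:, 3] * (X[:, 0] / 100)
     + rng.normal(0, 5, 500))

model = GradientBoostingRegressor().fit(X, y)

# Predict the glucose rise for one person eating a 60 g-carb meal.
meal = [[60, 3, 7.5, 0.8]]
print(f"Predicted post-meal rise: {model.predict(meal)[0]:.0f} mg/dL")
```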
The commercial potential of such apps is clear. DayTwo, which markets its product to corporate employers and health insurers rather than individual consumers, recently raised $37 million in funding. But the underlying science continues to generate intriguing findings.
Last year, Elinav and colleagues published a study of 225 individuals with pre-diabetes which found that they achieved better blood sugar control on a personalized diet based on DayTwo’s recommendations than on a Mediterranean diet. The journal Cell just released a new paper from Snyder’s group which shows that different types of fiber benefit people in different ways.
“The idea is you hear different fibers are good for you,” says Snyder. “But if you look at fibers they’re all over the map—it’s like saying all animals are the same. The responses are very individual. For a lot of people [a type of fiber called] arabinoxylan clearly reduced cholesterol while the fiber inulin had no effect. But in some people, it was the complete opposite.”
Eight years ago, Stanford's Michael Snyder began studying how continuous glucose monitors could be used by patients to gain real-time updates on their blood sugar levels, transmitted directly to their phones. (The Snyder Lab, Stanford Medicine)
Because of studies like these, interest in precision nutrition approaches has exploded in recent years. In January, the National Institutes of Health announced that it is spending $170 million on a five-year, multi-center initiative which aims to develop algorithms, based on data sources ranging from blood sugar to sleep, exercise, stress, microbiome and even genomic information, that can help predict which diets are most suitable for a particular individual.
“There's so many different factors which influence what you put into your mouth but also what happens to different types of nutrients and how that ultimately affects your health, which means you can’t have a one-size-fits-all set of nutritional guidelines for everyone,” says Bruce Y. Lee, professor of health policy and management at the City University of New York Graduate School of Public Health.
With the falling costs of genomic sequencing, other precision nutrition clinical trials are choosing to look at whether our genomes alone can yield key information about what our diets should look like, an emerging field of research known as nutrigenomics.
The ASPIRE-DNA clinical trial at Imperial College London aims to see whether particular genetic variants can be used to classify individuals into two groups: those whose glucose levels are more sensitive to fat and those who are more sensitive to carbohydrates. The trial will then test whether a diet tailored to these sensitivities can prevent people with pre-diabetes from developing the disease.
But while much hope is riding on these trials, even precision nutrition advocates caution that the field remains in the very earliest of stages. Lars-Oliver Klotz, professor of nutrigenomics at Friedrich-Schiller-University in Jena, Germany, says that while the overall goal is to identify means of avoiding nutrition-related diseases, genomic data alone is unlikely to be sufficient to prevent obesity and type 2 diabetes.
“Genome data is rather simple to acquire these days as sequencing techniques have dramatically advanced in recent years,” he says. “However, the predictive value of just genome sequencing is too low in the case of obesity and prediabetes.”
Others say that while genomic data can yield useful information in terms of how different people metabolize different types of fat and specific nutrients such as B vitamins, there is a need for more research before it can be utilized in an algorithm for making dietary recommendations.
“I think it’s a little early,” says Eileen Gibney, a professor at University College Dublin. “We’ve identified a limited number of gene-nutrient interactions so far, but we need more randomized controlled trials of people with different genetic profiles on the same diet, to see whether they respond differently, and if that can be explained by their genetic differences.”
Some start-ups have already come unstuck for promising too much, or for pushing recommendations which are not based on scientifically rigorous trials. The world of precision nutrition apps was dubbed a ‘Wild West’ by some commentators after the founders of uBiome – a start-up which offered nutritional recommendations based on information obtained from sequencing stool samples – were charged with fraud last year. The weight-loss app Noom, which was valued at $3.7 billion in May 2021, has been criticized on Twitter by a number of users who claimed that its recommendations led them to develop eating disorders.
With precision nutrition apps marketing their technology to healthy individuals, questions have also been raised about how much non-diabetics stand to gain from monitoring their blood sugar with a CGM. While some small studies have found that wearing a CGM can make overweight or obese individuals more motivated to exercise, there is still a lack of conclusive evidence showing that this translates to improved health.
However, independent researchers remain intrigued by the technology, and say that the wealth of data generated through such apps could be used to further stratify the different types of people at risk of developing type 2 diabetes.
“CGM not only enables a longer sampling time for capturing glucose levels, but will also capture lifestyle factors,” says Robert Wagner, a diabetes researcher at University Hospital Düsseldorf. “It is probable that it can be used to identify many clusters of prediabetic metabolism and predict the risk of diabetes and its complications, but maybe also specific cardiometabolic risk constellations. However, we still don’t know which forms of diabetes can be prevented by such approaches and how feasible and long-lasting such self-feedback dietary modifications are.”
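As a concrete illustration of the kind of stratification Wagner describes, one could collapse each person's CGM trace into a handful of summary features and then group the cohort. The sketch below uses k-means clustering on simulated traces; the three features, the cluster count, and the simulated data are assumptions made for this example, not Wagner's method.

```python
# Sketch of clustering "prediabetic metabolism" from CGM data:
# summarize each trace into a few features, then group people.
# Features, cluster count, and data are illustrative assumptions.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

def cgm_features(trace):
    """Collapse one CGM trace (mg/dL readings) into summary features:
    mean glucose, variability, and fraction of time above 140 mg/dL."""
    trace = np.asarray(trace, dtype=float)
    return [trace.mean(), trace.std(), (trace > 140).mean()]

rng = np.random.default_rng(1)
# Simulated cohort: 200 traces of 5-minute readings over 14 days.
cohort = rng.normal(loc=rng.uniform(90, 150, 200)[:, None],
                    scale=rng.uniform(5, 30, 200)[:, None],
                    size=(200, 4032))

X = StandardScaler().fit_transform([cgm_features(t) for t in cohort])
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X)
print(np.bincount(labels))  # number of people in each metabolic cluster
```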
Snyder himself has now been wearing a CGM for eight years, and he credits the insights it provides with helping him to manage his own diabetes. “My CGM still gives me novel insights into what foods and behaviors affect my glucose levels,” he says.
He is now looking to run clinical trials with his group at Stanford to see whether following a precision nutrition approach based on CGM and microbiome data, combined with other health information, can be used to reverse signs of pre-diabetes. If it proves successful, January AI may look to incorporate microbiome data in the future.
“Ultimately, what I want to do is be able to take people’s poop samples, maybe a blood draw, and say, ‘Alright, based on these parameters, this is what I think is going to spike you,’ and then have a CGM to test that out,” he says. “Getting very predictive about this, so right from the get-go, you can have people better manage their health and then use the glucose monitor to help follow that.”
Out of Thin Air: A Fresh Solution to Farming’s Water Shortages
California has been plagued by perilous droughts for decades. Freshwater shortages have sparked raging wildfires and killed fruit and vegetable crops. And California is not alone in its danger of running out of water for farming; parts of the Southwest, including Texas, are battling severe drought conditions, according to the North American Drought Monitor. These two states account for 316,900 of the 2 million total U.S. farms.
But even as farming becomes more vulnerable due to water shortages, the world's demand for food is projected to increase 70 percent by 2050, according to Guihua Yu, an associate professor of materials science at The University of Texas at Austin.
"Water is the most limiting natural resource for agricultural production because of the freshwater shortage and enormous water consumption needed for irrigation," Yu said.
As scientists have searched for solutions, an alternative water supply has been hiding in plain sight: water vapor in the atmosphere. It is abundant, available, and endlessly renewable, waiting only for technological innovation and necessity to converge to make it fit for use. Now, new super-moisture-absorbent gels developed by Yu and a team of researchers can pull that moisture from the air and bring it into soil, potentially expanding the map of farmable land around the globe to dry and remote regions that suffer from water shortages.
"This opens up opportunities to turn those previously poor-quality or inhospitable lands to become useable and without need of centralized water and power supplies," Yu said.
A renewable source of freshwater
The hydrogels are a gelatin-like substance made from synthetic materials. The gels activate in cooler, humid overnight periods and draw water from the air. During a four-week experiment, Yu's team observed that soil with these gels provided enough water to support seed germination and plant growth without an additional liquid water supply. And the soil was able to maintain the moist environment for more than a month, according to Yu.
The super-moisture-absorbent gels developed at The University of Texas at Austin. (Xingyi Zhou, UT Austin)
"It is promising to liberate underdeveloped and drought areas from the long-distance water and power supplies for agricultural production," Yu said.
Crops also rely on fertilizer to maintain soil fertility and increase production yields, but fertilizer is easily lost through leaching. The resulting runoff increases agricultural costs and contributes to environmental pollution. The interaction between the gels and agrochemicals offers slow, controlled fertilizer release, maintaining the balance between the plant's roots and the soil.
The possibilities are endless
Harvesting atmospheric water is exciting on multiple fronts. The super-moisture-absorbent gel can also be used for passively cooling solar panels. Solar radiation is the magic behind the process. Overnight, as temperatures cool, the gels absorb water hanging in the atmosphere. The moisture is stored inside the gels until the thermometer rises. Heat from the sun serves as the faucet that turns the gels on so they can release the stored water and cool down the panels. Effective cooling of the solar panels is important for sustainable long-term power generation.
In addition to agricultural uses and cooling for energy devices, atmospheric water harvesting technologies could even reach people's homes.
"They could be developed to enable easy access to drinking water through individual systems for household usage," Yu said.
Next steps
Yu and the team are now focused on affordability and developing practical applications for use. The goal is to optimize the gel materials to achieve higher levels of water uptake from the atmosphere.
"We are exploring different kinds of polymers and solar absorbers while exploring low-cost raw materials for production," Yu said.
The ability to transform atmospheric water vapor into a cheap and plentiful water source would be a game-changer. One day in the not-too-distant future, if climate change intensifies and droughts worsen, this innovation may become vital to our very survival.
On the morning of April 12, 1955, newsrooms across the United States inked headlines onto newsprint: the Salk polio vaccine was "safe, effective, and potent." This was long-awaited news. Americans had limped through decades of fear, unaware of what caused polio or how to cure it, faced with the disease's terrifying, visible power to paralyze and kill, particularly children.
The announcement of the polio vaccine was celebrated with noisy jubilation: church bells rang, factory whistles sounded, people wept in the streets. Within weeks, mass inoculation began as the nation put its faith in a vaccine that would end polio.
Today, most of us are blissfully ignorant of childhood polio deaths, making it easier to believe that we have not personally benefited from the development of vaccines. According to Dr. Steven Pinker, cognitive psychologist and author of the bestselling book Enlightenment Now, we've become blasé about the gifts of science. "The default expectation is not that disease is part of life and science is a godsend, but that health is the default, and any disease is some outrage," he says.
We're now in the early stages of another vaccine rollout, one we hope will end the ravages of the COVID-19 pandemic. And yet, the Pfizer, Moderna, and AstraZeneca vaccines are met with far greater hesitancy and skepticism than the polio vaccine was in the 50s.
In 2021, concerns over the speed and safety of vaccine development and technology plague this heroic global effort, but the roots of vaccine hesitancy run far deeper. Vaccine hesitancy has always existed in the U.S., even in the polio era, motivated in part by fears around "living virus" in a bad batch of vaccines produced by Cutter Laboratories in 1955. But in the last half century, we've witnessed seismic cultural shifts—loss of public trust, a rise in misinformation, heightened racial and socioeconomic inequality, and political polarization have all intensified vaccine-related fears and resistance. Making sense of how we got here may help us understand how to move forward.
The Rise and Fall of Public Trust
When the polio vaccine was released in 1955, "we were nearing an all-time high point in public trust," says Matt Baum, Harvard Kennedy School professor and lead author of several reports measuring public trust and vaccine confidence. Baum explains that the U.S. was experiencing a post-war boom following the Allied triumph in WWII, a popular Roosevelt presidency, and the rapid innovation that elevated the country to an international superpower.
The 1950s witnessed the emergence of nuclear technology, a space program, and unprecedented medical breakthroughs, adds Emily Brunson, Texas State University anthropologist and co-chair of the Working Group on Readying Populations for COVID-19 Vaccine. "Antibiotics were a game changer," she states. Whereas before, people could be sick with pneumonia for a month, suddenly they had access to pills that accelerated recovery.
During this period, science seemed to hold all the answers; people embraced the idea that we could "come to know the world with an absolute truth," Brunson explains. Doctors were portrayed as unquestioned gods, so Americans were primed to trust experts who told them the polio vaccine was safe.
That blind acceptance eroded in the 1960s and 70s as people came to understand that science can be inherently political. "Getting to an absolute truth works out great for white men, but these things affect people socially in radically different ways," Brunson says. As the culture began questioning the white, patriarchal biases of science, doctors lost their god-like status and experts were pushed off their pedestals. This trend continues with greater intensity today, as President Trump has led a campaign against experts and waged a war on science that began long before the pandemic.
The Shift in How We Consume Information
In the 1950s, the media created an informational consensus. The fundamental ideas the public consumed about the state of the world were unified. "People argued about the best solutions, but didn't fundamentally disagree on the factual baseline," says Baum. Indeed, the messaging around the polio vaccine was centralized and consistent, led by President Roosevelt's successful March of Dimes crusade. People of lower socioeconomic status with limited access to this information were less likely to have confidence in the vaccine, but most people consumed media that assured them of the vaccine's safety and mobilized them to receive it.
Today, the information we consume is no longer centralized—in fact, just the opposite. "When you take that away, it's hard for people to know what to trust and what not to trust," Baum explains. We've witnessed a rise in polarization, and in technology that makes it easier to give people what they want to hear, feeding the human tendencies to vilify the other side and confirm our preexisting ideas. When information is engineered to further an agenda, each choice and risk calculation made while navigating the COVID-19 pandemic is deeply politicized.
This polarization maps onto a rise in socioeconomic inequality and economic uncertainty. These factors, associated with a sense of lost control, prime people to embrace misinformation, explains Baum, especially when the situation is difficult to comprehend. "The beauty of conspiratorial thinking is that it provides answers to all these questions," he says. Today's insidious fragmentation of news media accelerates the circulation of mis- and disinformation, reaching more people faster, regardless of veracity or motivation. In the case of vaccines, skepticism around their origin, safety, and motivation is intensified.
Alongside the rise in polarization, Pinker says "the emotional tone of the news has gone downward since the 1940s, and journalists consider it a professional responsibility to cover the negative." Relentless focus on everything that goes wrong further erodes public trust and paints a picture of the world getting worse. "Life saved is not a news story," says Pinker, but perhaps it should be, he continues. "If people were more aware of how much better life was generally, they might be more receptive to improvements that will continue to make life better. These improvements don't happen by themselves."
The Future Depends on Vaccine Confidence
So far, the U.S. has been unable to mitigate the catastrophic effects of the pandemic through social distancing, testing, and contact tracing. President Trump has downplayed the effects and threat of the virus, censored experts and scientists, given up on containing the spread, and mobilized his base to protest masks. The Trump Administration failed to devise a national plan, so our national plan has defaulted to hoping for the "miracle" of a vaccine. And they are "something of a miracle," Pinker says, describing vaccines as "the most benevolent invention in the history of our species." In record-breaking time, three vaccines have arrived. But their impact will be weakened unless we achieve mass vaccination. As Brunson notes, "The technology isn't the fix; it's people taking the technology."
Significant challenges remain, including facilitating widespread access and supporting on-the-ground efforts to allay concerns and build trust among populations with historic reasons for distrust, says Brunson. Baum predicts continuing delays, as well as deaths from unrelated causes that will be wrongly linked to the vaccine.
Still, there's every reason for hope. The new administration "has its eyes wide open to these challenges. These are the kind of problems that are amenable to policy solutions if we have the will," Baum says. He forecasts widespread vaccination by late summer and a bounce back from the economic damage, a "Good News Story" that will bolster vaccine acceptance in the future. And Pinker reminds us that science, medicine, and public health have greatly extended our lives in the last few decades, a trend that can only continue if we're willing to roll up our sleeves.