AI and you: Is the promise of personalized nutrition apps worth the hype?
As a type 2 diabetic, Michael Snyder has long been interested in how blood sugar levels vary from one person to another in response to the same food, and whether a more personalized approach to nutrition could help tackle the rapidly rising rates of diabetes and obesity in much of the Western world.
Eight years ago, Snyder, who directs the Center for Genomics and Personalized Medicine at Stanford University, decided to put his theories to the test. In the 2000s, continuous glucose monitoring, or CGM, had begun to revolutionize the lives of diabetics, both type 1 and type 2. Using small, disc-shaped sensors which sit on the upper arm or abdomen – with tiny wires that pierce the skin – the technology allowed patients to gain real-time updates on their blood sugar levels, transmitted directly to their phone.
It gave Snyder an idea for his research at Stanford. Applying the same technology to a group of apparently healthy people, and looking for ‘spikes’ or sudden surges in blood sugar known as hyperglycemia, could provide a means of observing how their bodies reacted to an array of foods.
“We discovered that different foods spike people differently,” he says. “Some people spike to pasta, others to bread, others to bananas, and so on. It’s very personalized and our feeling was that building programs around these devices could be extremely powerful for better managing people’s glucose.”
Unbeknown to Snyder at the time, thousands of miles away, a group of Israeli scientists at the Weizmann Institute of Science were doing exactly the same experiments. In 2015, they published a landmark paper which used CGM to track the blood sugar levels of 800 people over several days, showing that the biological response to identical foods can vary wildly. Like Snyder, they theorized that giving people a greater understanding of their own glucose responses, so that they spend more time in the normal range, may reduce the prevalence of type 2 diabetes.
“At the moment 33 percent of the U.S. population is pre-diabetic, and 70 percent of those pre-diabetics will become diabetic,” says Snyder. “Those numbers are going up, so it’s pretty clear we need to do something about it.”
Fast forward to 2022, and both teams have converted their ideas into subscription-based dietary apps which use artificial intelligence to offer data-informed nutritional and lifestyle recommendations. Snyder’s spinoff, January AI, combines CGM information with heart rate, sleep, and activity data to advise on foods to avoid and the best times to exercise. DayTwo – a start-up which builds on the Weizmann Institute’s findings – obtains microbiome information by sequencing stool samples, and combines this with blood glucose data to rate ‘good’ and ‘bad’ foods for a particular person.
“CGMs can be used to devise personalized diets,” says Eran Elinav, an immunology professor and microbiota researcher at the Weizmann Institute of Science, who also serves as a scientific consultant for DayTwo. “However, this process can be cumbersome. Therefore, in our lab we created an algorithm, based on data acquired from a big cohort of people, which can accurately predict post-meal glucose responses on a personal basis.”
The commercial potential of such apps is clear. DayTwo, which markets its product to corporate employers and health insurers rather than individual consumers, recently raised $37 million in funding. But the underlying science continues to generate intriguing findings.
Last year, Elinav and colleagues published a study of 225 individuals with pre-diabetes which found that they achieved better blood sugar control when they followed a personalized diet based on DayTwo’s recommendations than when they followed a Mediterranean diet. The journal Cell recently released a new paper from Snyder’s group which shows that different types of fiber benefit people in different ways.
“The idea is you hear different fibers are good for you,” says Snyder. “But if you look at fibers they’re all over the map—it’s like saying all animals are the same. The responses are very individual. For a lot of people [a type of fiber called] arabinoxylan clearly reduced cholesterol while the fiber inulin had no effect. But in some people, it was the complete opposite.”
[Photo: Eight years ago, Stanford's Michael Snyder began studying how continuous glucose monitors could give patients real-time updates on their blood sugar levels, transmitted directly to their phones. Credit: The Snyder Lab, Stanford Medicine]
Because of studies like these, interest in precision nutrition approaches has exploded in recent years. In January, the National Institutes of Health announced that it is spending $170 million on a five-year, multi-center initiative which aims to develop algorithms that draw on a whole range of data sources, from blood sugar to sleep, exercise, stress, the microbiome and even genomic information, to help predict which diets are most suitable for a particular individual.
“There's so many different factors which influence what you put into your mouth but also what happens to different types of nutrients and how that ultimately affects your health, which means you can’t have a one-size-fits-all set of nutritional guidelines for everyone,” says Bruce Y. Lee, professor of health policy and management at the City University of New York Graduate School of Public Health.
With the falling costs of genomic sequencing, other precision nutrition clinical trials are choosing to look at whether our genomes alone can yield key information about what our diets should look like, an emerging field of research known as nutrigenomics.
The ASPIRE-DNA clinical trial at Imperial College London is testing whether particular genetic variants can be used to classify individuals into two groups: those whose blood sugar is more sensitive to fat and those whose blood sugar is more sensitive to carbohydrates. By assigning tailored diets based on these sensitivities, the trial aims to see whether people with pre-diabetes can be prevented from developing the disease.
But while much hope is riding on these trials, even precision nutrition advocates caution that the field remains in the very earliest of stages. Lars-Oliver Klotz, professor of nutrigenomics at Friedrich-Schiller-University in Jena, Germany, says that while the overall goal is to identify means of avoiding nutrition-related diseases, genomic data alone is unlikely to be sufficient to prevent obesity and type 2 diabetes.
“Genome data is rather simple to acquire these days as sequencing techniques have dramatically advanced in recent years,” he says. “However, the predictive value of just genome sequencing is too low in the case of obesity and prediabetes.”
Others say that while genomic data can yield useful information in terms of how different people metabolize different types of fat and specific nutrients such as B vitamins, there is a need for more research before it can be utilized in an algorithm for making dietary recommendations.
“I think it’s a little early,” says Eileen Gibney, a professor at University College Dublin. “We’ve identified a limited number of gene-nutrient interactions so far, but we need more randomized controlled trials of people with different genetic profiles on the same diet, to see whether they respond differently, and if that can be explained by their genetic differences.”
Some start-ups have already come unstuck for promising too much, or pushing recommendations which are not based on scientifically rigorous trials. The world of precision nutrition apps was dubbed a ‘Wild West’ by some commentators after the founders of uBiome – a start-up which offered nutritional recommendations based on information obtained from sequencing stool samples – were charged with fraud last year. The weight-loss app Noom, which was valued at $3.7 billion in May 2021, has been criticized on Twitter by a number of users who claim that its recommendations led them to develop eating disorders.
With precision nutrition apps marketing their technology to healthy individuals, questions have also been raised about how much value non-diabetics can gain from monitoring their blood sugar with a CGM. While some small studies have found that wearing a CGM can make overweight or obese individuals more motivated to exercise, there is still a lack of conclusive evidence showing that this translates into improved health.
However, independent researchers remain intrigued by the technology, and say that the wealth of data generated through such apps could be used to further stratify the different types of people who are at risk of developing type 2 diabetes.
“CGM not only enables a longer sampling time for capturing glucose levels, but will also capture lifestyle factors,” says Robert Wagner, a diabetes researcher at University Hospital Düsseldorf. “It is probable that it can be used to identify many clusters of prediabetic metabolism and predict the risk of diabetes and its complications, but maybe also specific cardiometabolic risk constellations. However, we still don’t know which forms of diabetes can be prevented by such approaches and how feasible and long-lasting such self-feedback dietary modifications are.”
Snyder himself has now been wearing a CGM for eight years, and he credits the insights it provides with helping him to manage his own diabetes. “My CGM still gives me novel insights into what foods and behaviors affect my glucose levels,” he says.
He is now looking to run clinical trials with his group at Stanford to see whether a precision nutrition approach based on CGM and microbiome data, combined with other health information, can reverse signs of pre-diabetes. If it proves successful, January AI may look to incorporate microbiome data in the future.
“Ultimately, what I want to do is be able to take people’s poop samples, maybe a blood draw, and say, ‘Alright, based on these parameters, this is what I think is going to spike you,’ and then have a CGM to test that out,” he says. “Getting very predictive about this, so right from the get-go, you can have people better manage their health and then use the glucose monitor to help follow that.”
When Wayne Jonas was in medical school 40 years ago, doctors would write out a prescription for placebos, spelling it out backwards in capital letters, O-B-E-C-A-L-P. The pharmacist would fill the prescription with a sugar pill, recalls Jonas, now director of integrative health programs at the Samueli Foundation. It fulfilled the patient's desire for the doctor to do something when perhaps no drug could help, and the sugar pills did no harm.
Today, that deception is seen as unethical. But time and time again, studies have shown that placebos can have real benefits. Now, researchers are trying to untangle the mysteries of the placebo effect in an effort to better treat patients.
The use of placebos took off in the post-WWII period, when randomized controlled clinical trials became the gold standard for medical research. One group in a study would be treated with a placebo, a supposedly inert pill or procedure that would not affect normal healing and recovery, while another group would receive an "active" component, most commonly a pill under investigation. Presumably, the group receiving the active treatment would have a better response, and the difference from the placebo group would represent the efficacy of the drug being tested. That became the basis for drug approval by the U.S. Food and Drug Administration.
"Placebo responses were marginalized," says Ted Kaptchuk, director of the Program in Placebo Studies & Therapeutic Encounters at Harvard Medical School. "Doctors were taught they have to overcome it when they were thinking about using an effective drug."
But that began to change around the turn of the 21st century. The National Institutes of Health held a series of meetings to set a research agenda and fund studies to answer some basic questions, led by Jonas, who was in charge of the Office of Alternative Medicine at the time. "People spontaneously get better all the time," says Kaptchuk. The crucial question was: is the placebo effect real? Is it more than just spontaneous healing?
Brain mechanisms
A turning point came in 2001 in a paper in Science that showed physical evidence of the placebo effect. It used positron emission tomography (PET) scans to measure release patterns of dopamine — a chemical messenger involved in how we feel pleasure — in the brains of patients with Parkinson's disease. Surprisingly, the placebo activated the same patterns that were activated by Parkinson's drugs, such as levodopa. It proved the placebo effect was real; now the search was on to better understand and control it.
A key part of the effect can be the beliefs, expectations, context, and "rituals" of the encounter between doctor and patient. Belief by both doctor and patient that the treatment will work, along with the formalized practices of administering it, can all contribute to a positive outcome.
Conditioning can be another important component in generating a response, as Pavlov demonstrated more than a century ago in his experiments with dogs. They were trained with a bell prior to feeding, such that they would begin to salivate in anticipation at the sound of the bell even with no food present.
Translating that to humans, studies with pain medications and sleeping aids showed that patients who had a positive response to a certain dose of those medications could have the same response if the dose was reduced and a dummy pill substituted, even to the point where there was no longer any active ingredient.
Those types of studies troubled Kaptchuk because they often relied on deception; patients weren't told they were receiving a placebo, or at best were told only that they might be randomized to receive one. He believed the placebo effect could work even if patients were told upfront that they were going to receive a placebo. More than a dozen so-called "open-label placebo" studies across numerous medical conditions, by Kaptchuk and others, have shown that you don't have to lie to patients for a placebo to work.
Jonas likes to tell the story of a patient who used methotrexate, a potent immunosuppressant, to control her rheumatoid arthritis. She was planning a long trip and didn't want to be bothered with the injections and monitoring required in using the drug. So she began to drink a powerful herbal extract of anise, a licorice flavor she hated, prior to each injection. She reduced the amount of methotrexate over a period of months and finally stopped, but continued to drink the anise. That process had conditioned her body "to alter her immune function and her autoimmunity" as if she were taking the drug, much like Pavlov's dogs had been trained. She has not taken methotrexate for more than a year.
An intriguing paper published in May found that mild, non-invasive electric stimulation to the brain could not only boost the placebo effect on pain but also reduce the "nocebo" effect — when patients report a negative effect to a sham treatment. While the work is very preliminary, it may open the door to directly manipulating these responses.
Researchers think placebo treatments can work particularly well in helping people deal with pain and psychological disorders, areas where drugs often are of little help. Still, placebos aren't a cure and only a portion of patients experience a placebo effect.
Nocebo
If medicine were a soap opera, the nocebo would be the evil twin of the placebo. It's what happens when patients have adverse side effects because of the expectation that they will. It's commonly seen when patients claim to experience the pain or gastric distress that can occur with a drug even when they've received a placebo. The side effects were either imagined or caused by something else.
"Up to 97% of reported pharmaceutical side effects are not caused by the drug itself but rather by nocebo effects and symptom misattribution," according to one 2019 paper.
One way to reduce a nocebo response is to simply not tell patients that specific side effects might occur. An example is a liver biopsy, in which a large-gauge needle is used to extract a tissue sample for examination. Those told ahead of time that they might experience some pain were more likely to report pain, and to report greater pain, than those who weren't offered this information.
Interestingly, a nocebo response plays out in the hippocampus, a part of the brain that is never activated in a placebo response. "I think what we are dealing with, with nocebo, is anxiety," says Kaptchuk, but he acknowledges that others disagree.
Distraction may be another way to minimize the nocebo effect. Pediatricians are using virtual reality (VR) to engage and distract children during routine procedures such as blood draws and changing wound dressings, and burn patients of all ages have found relief with specially created VR experiences.
Treatment response
Jonas argues that what we commonly call the placebo effect is misnamed and leading us astray. "The fact is people heal and that inherent healing capacity is both powerful and influenced by mental, social, and contextual factors that are embedded in every medical encounter since the idea of treatment began," he wrote in a 2019 article in the journal Frontiers in Psychiatry. "Our understanding of healing and ability to enhance it will be accelerated if we stop using the term 'placebo response' and call it what it is—the meaning response, and its special application in medicine called the healing response."
He cites evidence that "only 15% to 20% of the healing of an individual or a population comes from health care. The rest—nearly 80%—comes from other factors rarely addressed in the health care system: behavioral and lifestyle choices that people make in their daily life."
To better align treatments and maximize their effectiveness, Jonas has created the HOPE (Healing Oriented Practices & Environments) Note, "a patient-guided process designed to identify the patient's values and goals in their life and for healing." Essentially, it seeks to make clear to both doctor and patient what the patient's goals are in seeking treatment. To take the extreme example of terminal cancer, some patients may choose to extend life despite the often brutal treatments, while others might prefer to optimize quality of life in the time that remains. The process builds on practices already taught in medical schools. Jonas believes doctors and patients can use tools like these to maximize the treatment response and achieve better outcomes.
Much of the medical profession has been resistant to these approaches. Part of that is simply tradition and limited data on their effectiveness, but another very real factor is how doctors are reimbursed. Jonas says a new medical billing code added this year gives doctors another way to be compensated for the extra time and effort that a more holistic approach to medicine may initially require. Broader moves away from fee-for-service payments, toward bundled payments and payment for outcomes, along with the integrated care provided by Veterans Affairs, Kaiser Permanente and other groups, offer longer-term hope for approaches that might enhance the healing response.
The Women of RNA: Two Award-Winners Share Why They Spent Their Careers Studying DNA's Lesser-Known Cousin
When Lynne Maquat, who leads the Center for RNA Biology at the University of Rochester, became interested in the ribonucleic acid molecule in the 1970s, she was definitely in the minority. The same was true for Joan Steitz, now professor of molecular biophysics and biochemistry at Yale University, who began to study RNA a decade earlier in the 1960s.
"My first RNA experiment was a failure, because we didn't understand how things worked," Steitz recalls. In her first undergraduate experiment, she unwittingly used a lab preparation that destroyed the RNA. "Unknowingly, our preparation contained enzymes that degraded our RNA."
At the time, scientists pursuing genetic research tended to focus on DNA, or deoxyribonucleic acid — and for good reason. It was clear that the enigmatic double-helix ribbon held the answers to organisms' heredity, genetic traits, development, growth and aging. If scientists could decipher the secrets of DNA and understand how its genetic instructions translate into the body's functions in health and disease, they could develop treatments for all kinds of diseases. By contrast, the prevailing dogma of the time viewed RNA as merely a helper that passively carried out DNA's genetic instructions for protein-making — so it received much less attention.
But Maquat and Steitz weren't interested in heredity. They studied biochemistry and biophysics, so they wanted to understand how RNA functioned on the molecular level — how it carried instructions, catalyzed reactions, and helped forge the bonds that build proteins, among other things.
"I'm a mechanistic biochemist, so I like to know how things happen," Maquat says. "Once you understand the mechanism, you can think of how to solve problems." And so the quest to understand how RNA does its job became the focus of both women's careers.
"People can now appreciate why some of us studied RNA for such a long time."
Half a century later, in 2021, their RNA work earned two prestigious recognitions within months of each other. In February, they received the Wolf Prize in Medicine, followed in May by the Warren Alpert Foundation Prize, which is awarded to scientists whose achievements have led to the prevention, cure or treatment of human disease.
It was the development of the COVID-19 vaccines that made RNA a household name. Made by Moderna and Pfizer, the vaccines use the RNA molecule to deliver genetic instructions for making SARS-CoV-2's characteristic spike protein in our cells. The presence of this foreign-looking protein triggers the immune system to attack and remember the pathogen. As the vaccines reached the finish line, RNA took center stage, and it was Maquat's and Steitz's research that helped reveal how these molecular cogwheels drive many biological functions within cells.
If you think of a cell as a kingdom, the DNA plays the role of a queen. Like a monarch in a palace, DNA nestles inside the cell's nucleus issuing instructions needed for the cell to function. But no queen can successfully govern without her court, her messengers, and her soldiers, as well as other players that make her kingdom work. That's what RNAs do — they act as the DNA's vassals. They carry instructions for protein assembly, catalyze reactions and supervise many other processes to make sure the cellular kingdom performs as it should.
There are a myriad of these RNA vassals in our cells, and each type has its own specific task. There are messenger RNAs that deliver genetic instructions for protein synthesis from DNA to ribosomes, the cells' protein-making factories. There are ribosomal RNAs that help stitch together amino acids to make proteins. There are transfer RNAs that can bring amino acids to this protein synthesis machine, keeping it going. Then there are circular RNAs that act as sponges, absorbing proteins to help regulate the activity of genes. And that's only the tip of the iceberg when it comes to RNA diversity, researchers say.
"We know what the most abundant and important RNAs are doing," says Steitz. "But there are thousands of different ones, and we still don't have a full knowledge of them."
Critical to RNA's proper functioning is a process called splicing, in which a precursor mRNA is transformed into mature, fully-functional mRNA — a phenomenon that Steitz's work helped elucidate. The splicing process, which takes place in cellular assembly lines, involves removing extra RNA sequences and stringing the remaining RNA pieces together. Steitz found that tiny RNA particles called snRNPs are crucial to this process. They act as handy helpers, finding and removing errant genetic material from the mRNA molecules.
A dysfunctional RNA assembly line leads to diseases, including many cancers. For instance, Steitz found that people with lupus — an autoimmune disorder — have antibodies that mistakenly attack the little snRNP helpers. She also discovered that when snRNPs don't do their job properly, they can cause what scientists call mis-splicing, producing defective mRNAs.
Fortunately, cells have a built-in quality-control process that can spot and correct these mistakes, which is what Maquat studied in her work. In 1981, she discovered a molecular quality-control system that spots and destroys such incorrectly assembled mRNA. With the cryptic name "nonsense-mediated mRNA decay" or NMD, this process is vital to the health and wellbeing of a cellular kingdom in humans — because splicing mistakes happen far more often than one would imagine.
"We estimate that about a third of our mRNA are mistakes," Maquat says. "And nonsense-mediated mRNA decay cleans up these mistakes." When this quality-control system malfunctions, defective mRNA forge faulty proteins, which mess up the cellular machinery and cause disease, including various forms of cancer.
Scientists' newfound appreciation of RNA opens the door to many novel treatments.
Now that the first RNA-based shots have been approved, the same principle can be used to create vaccines for other diseases, the two RNA researchers say. Moreover, the molecule has even greater potential — it can serve as a therapeutic target for other disorders. For example, Spinraza, a groundbreaking drug approved in 2016 for spinal muscular atrophy, uses small snippets of synthetic genetic material that bind to the RNA, helping fix splicing errors. "People can now appreciate why some of us studied RNA for such a long time," says Maquat.
Steitz is thrilled that the entire field of RNA research is enjoying the limelight. "I'm delighted because the prize is more of a recognition of the field than just our work," she says. "This is a more general acknowledgment of how basic research can have a remarkable impact on human health."
Lina Zeldovich has written about science, medicine and technology for Popular Science, Smithsonian, National Geographic, Scientific American, Reader’s Digest, the New York Times and other major national and international publications. A Columbia J-School alumna, she has won several awards for her stories, including the ASJA Crisis Coverage Award for Covid reporting, and has been a contributing editor at Nautilus Magazine. In 2021, Zeldovich released her first book, The Other Dark Matter, published by the University of Chicago Press, about the science and business of turning waste into wealth and health. You can find her at http://linazeldovich.com/ and @linazeldovich.