AI and you: Is the promise of personalized nutrition apps worth the hype?
As a type 2 diabetic, Michael Snyder has long been interested in how blood sugar levels vary from one person to another in response to the same food, and whether a more personalized approach to nutrition could help tackle the rapidly rising rates of diabetes and obesity in much of the western world.
Eight years ago, Snyder, who directs the Center for Genomics and Personalized Medicine at Stanford University, decided to put his theories to the test. In the 2000s, continuous glucose monitoring, or CGM, had begun to revolutionize the lives of diabetics, both type 1 and type 2. Using small sensors that sit on the upper arm or abdomen – with tiny wires that pierce the skin – the technology allowed patients to get real-time updates on their blood sugar levels, transmitted directly to their phone.
It gave Snyder an idea for his research at Stanford. Applying the same technology to a group of apparently healthy people, and looking for ‘spikes’ or sudden surges in blood sugar known as hyperglycemia, could provide a means of observing how their bodies reacted to an array of foods.
“We discovered that different foods spike people differently,” he says. “Some people spike to pasta, others to bread, others to bananas, and so on. It’s very personalized and our feeling was that building programs around these devices could be extremely powerful for better managing people’s glucose.”
Unbeknown to Snyder at the time, thousands of miles away, a group of Israeli scientists at the Weizmann Institute of Science was running exactly the same kind of experiments. In 2015, they published a landmark paper which used CGM to track the blood sugar levels of 800 people over several days, showing that the biological response to identical foods can vary wildly. Like Snyder, they theorized that giving people a greater understanding of their own glucose responses, so that they spend more time in the normal range, might reduce the prevalence of type 2 diabetes.
“At the moment 33 percent of the U.S. population is pre-diabetic, and 70 percent of those pre-diabetics will become diabetic,” says Snyder. “Those numbers are going up, so it’s pretty clear we need to do something about it.”
Fast forward to 2022, and both teams have converted their ideas into subscription-based dietary apps which use artificial intelligence to offer data-informed nutritional and lifestyle recommendations. Snyder’s spinoff, January AI, combines CGM information with heart rate, sleep, and activity data to advise on foods to avoid and the best times to exercise. DayTwo – a start-up which utilizes the findings of the Weizmann Institute of Science – obtains microbiome information by sequencing stool samples, and combines this with blood glucose data to rate ‘good’ and ‘bad’ foods for a particular person.
“CGMs can be used to devise personalized diets,” says Eran Elinav, an immunology professor and microbiota researcher at the Weizmann Institute of Science who also serves as a scientific consultant for DayTwo. “However, this process can be cumbersome. Therefore, in our lab we created an algorithm, based on data acquired from a big cohort of people, which can accurately predict post-meal glucose responses on a personal basis.”
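Neither company has published its production models, but the general approach Elinav describes – train a model on a large cohort’s meal, wearable, and microbiome data, then predict an individual’s post-meal glucose response – can be sketched in a few lines. The following is a minimal illustration only; every feature name, value range, and coefficient is a hypothetical placeholder, not DayTwo’s or January AI’s actual code.

```python
# A minimal, illustrative sketch of a cohort-trained glucose-response model.
# All features and the training target here are synthetic placeholders.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(seed=0)
n_meals = 1_000  # meals logged across a hypothetical cohort

# Hypothetical per-meal features: what was eaten, recent lifestyle signals,
# and a crude summary of the eater's gut microbiome.
X = np.column_stack([
    rng.uniform(0, 100, n_meals),  # carbohydrates in the meal (g)
    rng.uniform(0, 40, n_meals),   # fibre in the meal (g)
    rng.uniform(0, 50, n_meals),   # fat in the meal (g)
    rng.uniform(0, 12, n_meals),   # hours since last exercise
    rng.uniform(4, 9, n_meals),    # hours slept the previous night
    rng.uniform(0, 1, n_meals),    # relative abundance of one microbial taxon
])

# Synthetic training target: the peak rise in blood glucose (mg/dL) measured
# by CGM after each meal (here simulated from the features plus noise).
y = 0.8 * X[:, 0] - 0.5 * X[:, 1] + 25 * X[:, 5] + rng.normal(0, 5, n_meals)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = GradientBoostingRegressor().fit(X_train, y_train)
print(f"Held-out R^2: {model.score(X_test, y_test):.2f}")

# For a new user, candidate meals can then be ranked by predicted spike,
# flagging the foods this particular person is likely to 'spike' to.
pasta = [[75, 3, 10, 2, 7, 0.4]]   # hypothetical meal and context features
print(f"Predicted glucose rise for pasta: {model.predict(pasta)[0]:.0f} mg/dL")
```

The real systems fold in far more inputs, but the overall structure reported in the 2015 Weizmann paper is broadly similar: a model trained on a cohort, fed per-person features, and used to rank meals for each individual.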
The commercial potential of such apps is clear. DayTwo, who market their product to corporate employers and health insurers rather than individual consumers, recently raised $37 million in funding. But the underlying science continues to generate intriguing findings.
Last year, Elinav and colleagues published a study on 225 individuals with pre-diabetes which found that they achieved better blood sugar control when they followed a personalized diet based on DayTwo’s recommendations, compared to a Mediterranean diet. The journal Cell just released a new paper from Snyder’s group which shows that different types of fibre benefit people in different ways.
“The idea is you hear different fibres are good for you,” says Snyder. “But if you look at fibres they’re all over the map—it’s like saying all animals are the same. The responses are very individual. For a lot of people [a type of fibre called] arabinoxylan clearly reduced cholesterol while the fibre inulin had no effect. But in some people, it was the complete opposite.”
Eight years ago, Stanford's Michael Snyder began studying how continuous glucose monitors could be used by patients to gain real-time updates on their blood sugar levels, transmitted directly to their phone.
The Snyder Lab, Stanford Medicine
Because of studies like these, interest in precision nutrition approaches has exploded in recent years. In January, the National Institutes of Health announced that it is spending $170 million on a five-year, multi-center initiative which aims to develop algorithms, based on data sources ranging from blood sugar to sleep, exercise, stress, the microbiome and even genomic information, that can help predict which diets are most suitable for a particular individual.
“There's so many different factors which influence what you put into your mouth but also what happens to different types of nutrients and how that ultimately affects your health, which means you can’t have a one-size-fits-all set of nutritional guidelines for everyone,” says Bruce Y. Lee, professor of health policy and management at the City University of New York Graduate School of Public Health.
With the falling costs of genomic sequencing, other precision nutrition clinical trials are choosing to look at whether our genomes alone can yield key information about what our diets should look like, an emerging field of research known as nutrigenomics.
The ASPIRE-DNA clinical trial at Imperial College London is aiming to see whether particular genetic variants can be used to classify individuals into two groups: those whose blood sugar is more sensitive to fat and those whose blood sugar is more sensitive to carbohydrates. The trial will then test whether a diet tailored to these sensitivities can prevent people with pre-diabetes from developing the disease.
But while much hope is riding on these trials, even precision nutrition advocates caution that the field remains in the very earliest of stages. Lars-Oliver Klotz, professor of nutrigenomics at Friedrich-Schiller-University in Jena, Germany, says that while the overall goal is to identify means of avoiding nutrition-related diseases, genomic data alone is unlikely to be sufficient to prevent obesity and type 2 diabetes.
“Genome data is rather simple to acquire these days as sequencing techniques have dramatically advanced in recent years,” he says. “However, the predictive value of just genome sequencing is too low in the case of obesity and prediabetes.”
Others say that while genomic data can yield useful information in terms of how different people metabolize different types of fat and specific nutrients such as B vitamins, there is a need for more research before it can be utilized in an algorithm for making dietary recommendations.
“I think it’s a little early,” says Eileen Gibney, a professor at University College Dublin. “We’ve identified a limited number of gene-nutrient interactions so far, but we need more randomized controlled trials of people with different genetic profiles on the same diet, to see whether they respond differently, and if that can be explained by their genetic differences.”
Some start-ups have already come unstuck for promising too much, or for pushing recommendations which are not based on scientifically rigorous trials. The world of precision nutrition apps was dubbed a ‘Wild West’ by some commentators after the founders of uBiome – a start-up which offered nutritional recommendations based on information obtained from sequencing stool samples – were charged with fraud last year. The weight-loss app Noom, which was valued at $3.7 billion in May 2021, has been criticized on Twitter by a number of users who claimed that its recommendations led to them developing eating disorders.
With precision nutrition apps marketing their technology at healthy individuals, questions have also been raised about how much non-diabetics stand to gain from monitoring their blood sugar with a CGM. While some small studies have found that wearing a CGM can make overweight or obese individuals more motivated to exercise, there is still a lack of conclusive evidence showing that this translates into improved health.
However, independent researchers remain intrigued by the technology, and say that the wealth of data generated through such apps could be used to further stratify the different types of people who are at risk of developing type 2 diabetes.
“CGM not only enables a longer sampling time for capturing glucose levels, but will also capture lifestyle factors,” says Robert Wagner, a diabetes researcher at University Hospital Düsseldorf. “It is probable that it can be used to identify many clusters of prediabetic metabolism and predict the risk of diabetes and its complications, but maybe also specific cardiometabolic risk constellations. However, we still don’t know which forms of diabetes can be prevented by such approaches and how feasible and long-lasting such self-feedback dietary modifications are.”
Snyder himself has now been wearing a CGM for eight years, and he credits the insights it provides with helping him to manage his own diabetes. “My CGM still gives me novel insights into what foods and behaviors affect my glucose levels,” he says.
He is now looking to run clinical trials with his group at Stanford to see whether following a precision nutrition approach based on CGM and microbiome data, combined with other health information, can be used to reverse signs of pre-diabetes. If it proves successful, January AI may look to incorporate microbiome data in future.
“Ultimately, what I want to do is be able to take people’s poop samples, maybe a blood draw, and say, ‘Alright, based on these parameters, this is what I think is going to spike you,’ and then have a CGM to test that out,” he says. “Getting very predictive about this, so right from the get-go, you can have people better manage their health and then use the glucose monitor to help follow that.”
The Nose Knows: Dogs Are Being Trained to Detect the Coronavirus
Asher is eccentric and inquisitive. He loves an audience, likes keeping busy, and howls to be let through doors. He is a six-year-old working Cocker Spaniel, who, with five other furry colleagues, has now been trained to sniff body odor samples from humans to detect COVID-19 infections.
As the Delta variant and other new versions of the SARS-CoV-2 virus emerge, public health agencies are once again recommending masking while employers contemplate mandatory vaccination. While PCR tests remain the "gold standard" of COVID-19 tests, they can take hours to flag infections. To accelerate the process, scientists are turning to a new testing tool: sniffer dogs.
At the London School of Hygiene and Tropical Medicine (LSHTM), researchers deployed Asher and five other trained dogs to test sock samples from 200 asymptomatic, infected individuals and 200 healthy individuals. In May, they published the findings of the yearlong study in a preprint, concluding that dogs could identify COVID-19 infections with a high degree of accuracy – they could correctly identify a COVID-positive sample up to 94% of the time and a negative sample up to 92% of the time. The paper has yet to be peer-reviewed.
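In screening terms, those two figures correspond to the study's sensitivity and specificity. For reference, the standard definitions are below; the only numbers taken from the preprint are the 94% and 92% quoted above.

\[
\text{sensitivity} = \frac{\text{true positives}}{\text{true positives} + \text{false negatives}} \approx 0.94,
\qquad
\text{specificity} = \frac{\text{true negatives}}{\text{true negatives} + \text{false positives}} \approx 0.92
\]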
"Dogs can screen lots of people very quickly – 300 people per dog per hour. This means they could be used in places like airports or public venues like stadiums and maybe even workplaces," says James Logan, who heads the Department of Disease Control at LSHTM, adding that canines can also detect variants of SARS-CoV-2. "We included samples from two variants and the dogs could still detect them."
Detection dogs have long been among the most reliable biosensors for identifying the odor of human disease. According to Gemma Butlin, a spokesperson for Medical Detection Dogs, the UK-based charity that trained canines for the LSHTM study, the olfactory capabilities of dogs have been deployed to detect malaria, Parkinson's disease, different types of cancers, as well as pseudomonas, a type of bacteria known to cause infections in blood, lungs, eyes, and other parts of the human body.
"It's estimated that the percentage of a dog's brain devoted to analyzing odors is 40 times larger than that of a human," says Butlin. "Humans have around 5 million scent receptors dedicated to smell. Dogs have 350 million and can detect odors at parts per trillion. To put this into context, a dog can detect a teaspoon of sugar in a million gallons of water: two Olympic-sized pools full."
According to LSHTM scientists, COVID-19 has a distinctive smell — a result of chemicals known as volatile organic compounds released by infected body cells, which give off an odor "fingerprint." Other studies, too, have revealed that the SARS-CoV-2 virus has a distinct olfactory signature, detectable in the urine, saliva, and sweat of infected individuals. Humans can't smell the disease in these fluids, but dogs can.
"Our research shows that the smell associated with COVID-19 is at least partly due to small and volatile chemicals that are produced by the virus growing in the body or the immune response to the virus or both," said Steve Lindsay, a public health entomologist at Durham University, whose team collaborated with LSHTM for the study. He added, "There is also a further possibility that dogs can actually smell the virus, which is incredible given how small viruses are."
In April this year, researchers from the University of Pennsylvania and collaborators published a similar study in the scientific journal PLOS One, revealing that detection dogs could successfully discriminate between urine samples of infected and uninfected individuals. The accuracy rate of canines in this study was 96%. Similarly, last December, French scientists found that dogs were 76-100% effective at identifying individuals with COVID-19 when presented with sweat samples.
Dominique Grandjean, a professor at France's National Veterinary School of Alfort, who led the French study, said that the researchers used two types of dogs — search and rescue dogs, as they can sniff sweat, and explosive detection dogs, because they're often used at airports to find bomb ingredients. Dogs may very well be as good as PCR tests, said Grandjean, but the goal, he added, is not to replace these tests with canines.
In France, the government gave the green light to train hundreds of disease detection dogs and deploy them in airports. "They will act as a mass pre-test, and only people who are positive will undergo a PCR test to check their level of infection and the kind of variant," says Grandjean. He thinks the dogs will be able to decrease the amount of PCR testing and potentially save money.
Since the accuracy rate for bio-detection dogs is fairly high, scientists think they could prove to be a quick diagnostic and mass screening tool, especially at ports, airports, train stations, stadiums, and public gatherings. Countries like Finland, Thailand, the UAE, Italy, Chile, India, Australia, Pakistan, Saudi Arabia, Switzerland, and Mexico are already training and deploying canines for COVID-19 detection. The dogs are trained to sniff the area around a person, and if they find the odor of COVID-19 they will sit or stand back from the individual as a signal that they've identified an infection.
While bio-detection dogs seem promising for cheap, large-volume screening, many of the studies performed to date have been small and conducted in controlled environments. The big question is whether this approach works on people in crowded airports, not just on samples of shirts and socks in a lab.
"The next step is 'real world' testing where they [canines] are placed in airports to screen people and see how they perform," says Anna Durbin, professor of international health at the John Hopkins Bloomberg School of Public Health. "Testing in real airports with lots of passengers and competing scents will need to be done."
According to Butlin of Medical Detection Dogs, scalability could be a challenge. However, scientists don't intend to have a dog in every waiting room, detecting COVID-19 or other diseases, she said.
"Dogs are the most reliable bio sensors on the planet and they have proven time and time again that they can detect diseases as accurately, if not more so, than current technological diagnostics," said Butlin. "We are learning from them all the time and what their noses know will one day enable the creation an 'E-nose' that does the same job – imagine a day when your mobile phone can tell you that you are unwell."
The Voice Behind Some of Your Favorite Cartoon Characters Helped Create the Artificial Heart
In June, a team of surgeons at Duke University Hospital implanted the latest model of an artificial heart in a 39-year-old man with severe heart failure, a condition in which the heart doesn't pump properly. The man's mechanical heart, made by French company Carmat, is a new generation artificial heart and the first of its kind to be transplanted in the United States. It connects to a portable external power supply and is designed to keep the patient alive until a replacement organ becomes available.
Many patients die while waiting for a heart transplant, but artificial hearts can bridge the gap. Though not a permanent solution for heart failure, artificial hearts have saved countless lives since their first implantation in 1982.
What might surprise you is that the origin of the artificial heart dates back decades before, when an inventive television actor teamed up with a famous doctor to design and patent the first such device.
A man of many talents
Paul Winchell was an entertainer in the 1950s and 60s, rising to fame as a ventriloquist and guest-starring as an actor on programs like "The Ed Sullivan Show" and "Perry Mason." When children's animation boomed in the 1960s, Winchell made a name for himself as a voice actor on shows like "The Smurfs," "Winnie the Pooh," and "The Jetsons." He eventually became famous for originating the voices of Tigger from "Winnie the Pooh" and Gargamel from "The Smurfs," among many others.
But Winchell wasn't just an entertainer: He also had a quiet passion for science and medicine. Between television gigs, Winchell busied himself working as a medical hypnotist and acupuncturist, treating the same Hollywood stars he performed alongside. When he wasn't doing that, Winchell threw himself into engineering and design, building not only the ventriloquism dummies he used on his television appearances but a host of products he'd dreamed up himself. Winchell spent hours tinkering with his own inventions, such as a set of battery-powered gloves and something called a "flameless lighter." Over the course of his life, Winchell designed and patented more than 30 of these products – mostly novelties, but also serious medical devices, such as a portable blood plasma defroster.
Ventriloquist Paul Winchell with Jerry Mahoney, his dummy, in 1951
A meeting of the minds
In the early 1950s, Winchell appeared on a variety show called the "Arthur Murray Dance Party" and faced off in a dance competition with the legendary Ricardo Montalban (Winchell won). At a cast party for the show later that same night, Winchell met Dr. Henry Heimlich – the doctor who would later become famous for inventing the Heimlich maneuver, and who was married to Murray's daughter. The two hit it off immediately, bonding over their shared interest in medicine. Before long, Heimlich invited Winchell to come observe him in the operating room at the hospital where he worked. Winchell jumped at the opportunity, and not long after he became a frequent guest in Heimlich's surgical theatre, fascinated by the mechanics of the human body.
One day while Winchell was observing at the hospital, he witnessed a patient die on the operating table after undergoing open-heart surgery. He was suddenly struck with an idea: If there was some way doctors could keep blood pumping temporarily throughout the body during surgery, patients who underwent risky operations like open-heart surgery might have a better chance of survival. Winchell rushed to Heimlich with the idea – and Heimlich agreed to advise Winchell and look over any design drafts he came up with. So Winchell went to work.
Winchell's heart
As it turned out, building ventriloquism dummies wasn't that different from building an artificial heart, Winchell noted later in his autobiography – the shifting valves and chambers of the mechanical heart were similar to the moving eyes and opening mouths of his puppets. After each design, Winchell would go back to Heimlich and the two would confer, making adjustments along the way.
By 1956, Winchell had perfected his design: The "heart" consisted of a bag that could be placed inside the human body, connected to a battery-powered motor outside of the body. The motor enabled the bag to pump blood throughout the body, similar to a real human heart. Winchell received a patent for the design in 1963.
At the time, Winchell never quite got the credit he deserved. Years later, researchers at the University of Utah, working on their own artificial heart, came across Winchell's patent and got in touch with him to compare notes. Winchell ended up donating his patent to the team, which included Dr. Robert Jarvik. Jarvik expanded on Winchell's design and created the Jarvik-7 – the world's first artificial heart to be successfully implanted in a human being, in 1982.
The Jarvik-7 has since been replaced with newer, more efficient models made up of different synthetic materials, allowing patients to live for longer stretches without the heart clogging or breaking down. With each new generation of hearts, heart failure patients have been able to live relatively normal lives for longer periods of time and with fewer complications than before – and it never would have been possible without the unsung genius of a puppeteer and his love of science.