AI and you: Is the promise of personalized nutrition apps worth the hype?
As a type 2 diabetic, Michael Snyder has long been interested in how blood sugar levels vary from one person to another in response to the same food, and whether a more personalized approach to nutrition could help tackle the rapidly rising rates of diabetes and obesity across much of the Western world.
Eight years ago, Snyder, who directs the Center for Genomics and Personalized Medicine at Stanford University, decided to put his theories to the test. In the 2000s, continuous glucose monitoring, or CGM, had begun to revolutionize the lives of diabetics, both type 1 and type 2. Using small, coin-sized sensors that sit on the upper arm or abdomen – with a tiny filament that pierces the skin – the technology allowed patients to get real-time updates on their blood sugar levels, transmitted directly to their phones.
It gave Snyder an idea for his research at Stanford. Applying the same technology to a group of apparently healthy people, and looking for ‘spikes’ or sudden surges in blood sugar known as hyperglycemia, could provide a means of observing how their bodies reacted to an array of foods.
“We discovered that different foods spike people differently,” he says. “Some people spike to pasta, others to bread, others to bananas, and so on. It’s very personalized and our feeling was that building programs around these devices could be extremely powerful for better managing people’s glucose.”
Unbeknown to Snyder at the time, thousands of miles away, a group of Israeli scientists at the Weizmann Institute of Science were doing exactly the same experiments. In 2015, they published a landmark paper which used CGM to track the blood sugar levels of 800 people over several days, showing that the biological response to identical foods can vary wildly. Like Snyder, they theorized that giving people a greater understanding of their own glucose responses, so they spend more time in the normal range, may reduce the prevalence of type 2 diabetes.
“At the moment 33 percent of the U.S. population is pre-diabetic, and 70 percent of those pre-diabetics will become diabetic,” says Snyder. “Those numbers are going up, so it’s pretty clear we need to do something about it.”
Fast forward to 2022, and both teams have converted their ideas into subscription-based dietary apps which use artificial intelligence to offer data-informed nutritional and lifestyle recommendations. Snyder’s spinoff, January AI, combines CGM information with heart rate, sleep, and activity data to advise on foods to avoid and the best times to exercise. DayTwo – a start-up which utilizes the findings of the Weizmann Institute of Science – obtains microbiome information by sequencing stool samples, and combines this with blood glucose data to rate ‘good’ and ‘bad’ foods for a particular person.
“CGMs can be used to devise personalized diets,” says Eran Elinav, an immunology professor and microbiota researcher at the Weizmann Institute of Science in addition to serving as a scientific consultant for DayTwo. “However, this process can be cumbersome. Therefore, in our lab we created an algorithm, based on data acquired from a big cohort of people, which can accurately predict post-meal glucose responses on a personal basis.”
The commercial potential of such apps is clear. DayTwo, who market their product to corporate employers and health insurers rather than individual consumers, recently raised $37 million in funding. But the underlying science continues to generate intriguing findings.
Last year, Elinav and colleagues published a study of 225 individuals with pre-diabetes, which found that they achieved better blood sugar control when they followed a personalized diet based on DayTwo’s recommendations than when they followed a Mediterranean diet. The journal Cell just released a new paper from Snyder’s group which shows that different types of fibre benefit people in different ways.
“The idea is you hear different fibres are good for you,” says Snyder. “But if you look at fibres they’re all over the map—it’s like saying all animals are the same. The responses are very individual. For a lot of people [a type of fibre called] arabinoxylan clearly reduced cholesterol while the fibre inulin had no effect. But in some people, it was the complete opposite.”
Eight years ago, Stanford's Michael Snyder began studying how continuous glucose monitors could be used by patients to gain real-time updates on their blood sugar levels, transmitted directly to their phone.
The Snyder Lab, Stanford Medicine
Because of studies like these, interest in precision nutrition approaches has exploded in recent years. In January, the National Institutes of Health announced that it is spending $170 million on a five-year, multi-center initiative which aims to develop algorithms based on a whole range of data sources – from blood sugar to sleep, exercise, stress, microbiome and even genomic information – which can help predict which diets are most suitable for a particular individual.
“There's so many different factors which influence what you put into your mouth but also what happens to different types of nutrients and how that ultimately affects your health, which means you can’t have a one-size-fits-all set of nutritional guidelines for everyone,” says Bruce Y. Lee, professor of health policy and management at the City University of New York Graduate School of Public Health.
With the falling costs of genomic sequencing, other precision nutrition clinical trials are choosing to look at whether our genomes alone can yield key information about what our diets should look like, an emerging field of research known as nutrigenomics.
The ASPIRE-DNA clinical trial at Imperial College London is aiming to see whether particular genetic variants can be used to classify individuals into two groups, those who are more glucose sensitive to fat and those who are more sensitive to carbohydrates. By following a tailored diet based on these sensitivities, the trial aims to see whether it can prevent people with pre-diabetes from developing the disease.
But while much hope is riding on these trials, even precision nutrition advocates caution that the field remains in the very earliest of stages. Lars-Oliver Klotz, professor of nutrigenomics at Friedrich-Schiller-University in Jena, Germany, says that while the overall goal is to identify means of avoiding nutrition-related diseases, genomic data alone is unlikely to be sufficient to prevent obesity and type 2 diabetes.
“Genome data is rather simple to acquire these days as sequencing techniques have dramatically advanced in recent years,” he says. “However, the predictive value of just genome sequencing is too low in the case of obesity and prediabetes.”
Others say that while genomic data can yield useful information in terms of how different people metabolize different types of fat and specific nutrients such as B vitamins, there is a need for more research before it can be utilized in an algorithm for making dietary recommendations.
“I think it’s a little early,” says Eileen Gibney, a professor at University College Dublin. “We’ve identified a limited number of gene-nutrient interactions so far, but we need more randomized controlled trials of people with different genetic profiles on the same diet, to see whether they respond differently, and if that can be explained by their genetic differences.”
Some start-ups have already come unstuck for promising too much, or pushing recommendations which are not based on scientifically rigorous trials. The world of precision nutrition apps was dubbed a ‘Wild West’ by some commentators after the founders of uBiome – a start-up which offered nutritional recommendations based on information obtained from sequencing stool samples – were charged with fraud last year. The weight-loss app Noom, which was valued at $3.7 billion in May 2021, has been criticized on Twitter by a number of users who claimed that its recommendations led to them developing eating disorders.
With precision nutrition apps marketing their technology to healthy individuals, question marks have also been raised about the value non-diabetics can gain from monitoring their blood sugar through CGM. While some small studies have found that wearing a CGM can make overweight or obese individuals more motivated to exercise, there is still a lack of conclusive evidence showing that this translates to improved health.
However, independent researchers remain intrigued by the technology, and say that the wealth of data generated through such apps could be used to further stratify the different types of people at risk of developing type 2 diabetes.
“CGM not only enables a longer sampling time for capturing glucose levels, but will also capture lifestyle factors,” says Robert Wagner, a diabetes researcher at University Hospital Düsseldorf. “It is probable that it can be used to identify many clusters of prediabetic metabolism and predict the risk of diabetes and its complications, but maybe also specific cardiometabolic risk constellations. However, we still don’t know which forms of diabetes can be prevented by such approaches and how feasible and long-lasting such self-feedback dietary modifications are.”
Snyder himself has now been wearing a CGM for eight years, and he credits the insights it provides with helping him to manage his own diabetes. “My CGM still gives me novel insights into what foods and behaviors affect my glucose levels,” he says.
He is now looking to run clinical trials with his group at Stanford to see whether following a precision nutrition approach based on CGM and microbiome data, combined with other health information, can be used to reverse signs of pre-diabetes. If it proves successful, January AI may look to incorporate microbiome data in future.
“Ultimately, what I want to do is be able to take people’s poop samples, maybe a blood draw, and say, ‘Alright, based on these parameters, this is what I think is going to spike you,’ and then have a CGM to test that out,” he says. “Getting very predictive about this, so right from the get-go, you can have people better manage their health and then use the glucose monitor to help follow that.”
Waste smothering our oceans is worth billions – here’s what we can do with all that sh$t
There’s hardly a person out there who hasn’t heard of the Great Pacific Garbage Patch. That type of pollution is impossible to miss. It stares you in the face from pictures and videos of sea turtles with drinking straws up their noses and acres of plastic swirling in the sea.
It demands you to solve the problem—and it works. The campaign to raise awareness about plastic pollution in the oceans has resulted in new policies, including bans on microplastics in personal care products, technology to clean up the plastic, and even new plastic-like materials that are better for the environment.
But there’s a different type of pollution smothering the ocean as you read this. Unfortunately, this one is almost invisible, but no less damaging. In fact, it’s even more serious than plastic and most people have no idea it even exists. It is literally under our noses, destroying our oceans, lakes, and rivers – and yet we are missing it completely while contributing to it daily. In fact, we exacerbate it multiple times a day—every time we use the bathroom.
It is the way we do our sewage.
Most of us don’t think much about what happens after we flush the toilet. Most of us probably assume that the substances we flush go “somewhere” and are dealt with safely. But we typically don’t think about it beyond that.
Most of us also probably don’t think about what’s in the ocean or lakes we swim in. Since others are swimming, jumping in is just fine. But our waterways are far from clean. In fact, at times they are incredibly filthy. In the US, we are dumping 1.2 trillion gallons of untreated sewage into the environment every year. New York City alone discharges 27 billion gallons into the Hudson River basin annually.
How does this happen? Part of it is the unfortunate side effect of a sewage system design that dates back more than a century, to when cities were smaller and fewer people lived so close together.
Back then, engineers designed the so-called “combined sewer overflow systems,” or CSOs, in which the stormwater pipes are connected to the sanitary sewer pipes. Under normal conditions, the sewage effluent from homes flows to the treatment plants, where it gets cleaned and released into the waterways. But when it rains, the pipe system becomes so overwhelmed with water that the treatment plant can’t process it fast enough. So the treatment plant has to release the excess water through its discharge pipes—directly, without treatment, into streams, rivers and the ocean.
The 1.2 trillion gallons of CSO releases aren’t even the full picture. There are also discharges from poorly maintained septic systems, cesspools and burst pipes of the aging wastewater infrastructure. The state of Hawaii alone has 88,000 cesspools that need replacing and are currently leaking 53 million gallons of raw sewage daily into its coastal waters. You may want to think twice about swimming on your next Hawaii vacation.
Overall, the US is facing a $271 billion backlog in wastewater infrastructure projects to update these aging systems. Across the Western world, countries are facing similar challenges with their aging sewage systems, especially the UK and European Union.
That’s not to say that other parts of the planet are in better shape. Out of the 7+ billion people populating our earth, 4.2 billion don’t have access to safe sanitation. Included in this insane number are roughly 2 billion people who have no toilet at all. Whether washed by rains or dumped directly into the waterways, a lot of this sludge pollutes the environment, the drinking water, and ultimately the ocean.
Pipes pour water onto a rocky shore in Jakarta, Indonesia.
Tom Fisk
What complicates this from an ocean health perspective is that it’s not just poop and pee that gets dumped into nearby waterways. It is all the things we put in and on our bodies and flush down our drains. That vicious mix of chemicals includes caffeine, antibiotics, antidepressants, painkillers, hormones, microplastics, cocaine, cooking oils, paint thinners, and PFAS—the forever chemicals present in everything from breathable clothing to fire retardant fabrics of our living room couches. Recent reports have found all of the above substances in fish—and then some.
Why do we allow so much untreated sewage to spill into the sea? Frankly speaking, for decades scientists and engineers thought that the ocean could handle it. The mantra back then was “dilution is the solution to pollution,” which might’ve worked when there were far fewer people living on earth—but not now. Today science is telling us that this old approach doesn’t hold: marine habitats are much more sensitive than we had expected and can’t handle the amount of wastewater we are discharging into them.
The excess nitrogen and phosphorus that the sewage (and agricultural runoff) dumps into the water causes harmful algal blooms, more commonly known as red or brown tides. The water column is overtaken by tiny algae that suck up all the oxygen, creating dead zones like those behind the big fish kills in the Gulf of Mexico. These algae also cause public health issues by releasing toxins that can lead to dementia, neurological damage, and respiratory illness in people and animals. Marshes and mangroves end up with weakened root systems and start dying off. In a wastewater modeling study I published last year, we found that 31 percent of salt marshes globally were heavily polluted with human sewage. Coral reefs get riddled with disease and overgrown by seaweed.
Moreover, by way of our sewage, we managed to transmit a human pathogen—Serratia marcescens, which causes urinary, respiratory and other infections in people—to corals! Recent reports from the Florida Keys are showing white pox disease popping up in elkhorn corals, caused by S. marcescens, which somehow managed to jump species. Many recent studies have documented just how common this type of pollution is across the globe.
Yet, there is some good news in that abysmal sewage flow. Just like with plastic pollution, realizing that there’s a problem is the first step, so awareness is key. That’s exactly why I co-founded Ocean Sewage Alliance last year—a nonprofit that aims to “re-potty train the world” by breaking taboos in talking about the poop and pee problem, as well as uniting experts from various key sectors to work together to end sewage pollution in coastal areas.
To end this pollution, we have to change the ways we handle our sewage. Even more exciting is that by solving the sewage problem we can create all sorts of economic benefits. In 2015, human poop was valued at $9.5 billion a year globally, which today would be $11.5 billion per year.
What would one do with that sh$t?
We could convert it into high-value goods. Sewage can be used to generate electricity, fertilizer, and drinking water. The technologies not only exist but are getting better and more efficient all the time. Some exciting examples include biodigesters and urine diversion (or “peecycling”) systems that can produce fertilizer and biogas, essentially natural gas. The United Nations estimates that the biogas produced from poop could provide electricity for 138 million homes. And the recovered and cleaned water can be used for irrigation, laundry and flushing toilets. It can even be refined to the point that it is safe to drink – just ask the folks in Orange County, CA, who have been doing so for the last few decades.
How do we deal with all the human-made pollutants in our sewage? There is technology for that too. Called pyrolysis, it heats up sludge to high temperatures in the absence of oxygen, which causes most of the substances to degrade and fall apart.
There are solutions to the problems—as long as we acknowledge that the problems exist. The fact that you are reading this means that you are part of the solution already. The next time you flush your toilet, think about where this output may flow. Does your septic system work properly? Does your local treatment plant discharge raw sewage on rainy days? Can that plant implement newer technologies that can upcycle waste? These questions are part of re-potty training the world, one household at a time. And together, these households are the force that can turn back the toxic sewage tide. And keep our oceans blue.
The U.S. must fund more biotech innovation – or other countries will catch up faster than you think
The U.S. has approximately 58 percent of the market share in the biotech sector, followed by China with 11 percent. However, this market share is the result of several years of previous research and development (R&D) – it is a snapshot of past performance. In the future, this market share will decline unless the federal government makes investments to improve the quality and quantity of U.S. research in biotech.
The effectiveness of current R&D can be evaluated in a variety of ways such as monies invested and the number of patents filed. According to the UNESCO Institute for Statistics, the U.S. spends approximately 2.7 percent of GDP on R&D (about $476.5 billion), whereas China spends 2 percent (about $346.3 billion). However, investment levels do not necessarily translate into goods that end up contributing to innovation.
Patents are a better indication of innovation. The biotech industry relies on patents to protect its investments, making patenting a key tool in the process of translating scientific discoveries into products that can ultimately benefit patients. In 2020, China filed 1,497,159 patents, a 6.9 percent increase over the previous year. In contrast, the U.S. filed 597,172, a 3.9 percent decline. When it comes to patents filed, China has approximately 45 percent of the world share compared to 18 percent for the U.S.
So how did we get here? The nature of science in academia allows scientists to specialize by dedicating several years to advance discovery research and develop new inventions that can then be licensed by biotech companies. This makes academic science critical to innovation in the U.S. and abroad.
Academic scientists rely on government and foundation grants to pay for R&D, which includes salaries for faculty, investigators and trainees, as well as monies for infrastructure, support personnel and research supplies. To cover these costs, academic scientists are particularly interested in government support such as Research Project Grants, also known as R01 grants, the oldest grant mechanism from the National Institutes of Health. Unfortunately, this funding mechanism is extremely competitive, as applications have a success rate of only about 20 percent. To maximize the chances of getting funded, investigators tend to limit the innovation of their applications, since a project that seems overambitious is discouraged by grant reviewers.
This approach affects the future success of the R&D enterprise in the U.S. Pursuing less innovative work tends to produce scientific results that are more obvious than groundbreaking, and when a discovery is obvious, it cannot be patented, resulting in fewer inventions that go on to benefit patients. Even though there are governmental funding options available for scientists in academia focused on more groundbreaking and translational projects, those options are less coveted by academic scientists who are trying to obtain tenure and long-term funding to cover salaries and other associated laboratory expenses. Therefore, since only a small percentage of projects gets funded, the number of scientists interested in pursuing academic science, or research in general, keeps declining over time.
Efforts to raise the number of individuals who pursue a scientific education are paying off. However, the number of job openings for those trainees to carry out independent scientific research once they graduate has proved harder to increase. These limitations are not just in the number of faculty openings to pursue academic science, which are in part related to grant funding, but also the low salary available to pay those scientists after they obtain their doctoral degree, which ranges from $53,000 to $65,000, depending on years of experience.
Thus, considering the difficulty in obtaining funding, the limited number of opportunities for scientists to become independent investigators capable of leading their own scientific projects, and the salaries available to pay for scientists with a doctoral degree, it is not surprising that the U.S. is progressively losing its workforce for innovation, which results in fewer patents filed.
Perhaps instead of encouraging scientists to propose less innovative projects in order to increase their chances of getting grants, the U.S. government should give serious consideration to funding investigators for their potential for success – or the success they have already achieved in contributing to the advancement of science. Such a funding approach should be tiered depending on career stage or years of experience, considering that 42 is the median age at which scientists obtain their first R01. This means that after finishing their training, scientists spend roughly 10 years establishing themselves as independent academic investigators with the funds to train the next generation of scientists who will help the U.S. maintain or even expand its market share in the biotech industry for years to come. Patenting should also be given more weight as part of the academic endeavor for promotion purposes, or governmental investment in research funding should be increased to support more than just 20 percent of projects.
Remaining at the forefront of biotech innovation will not just generate more jobs; it will also allow us to attract the brightest scientists from all over the world. This talented workforce will go on to train future U.S. scientists and will improve our standard of living by producing the next generation of therapies intended to improve human health.
This problem will not be solved by any single measure, but what is certain is that unless there are more creative changes in funding approaches for scientists in academia, we may eventually find ourselves saying, “Remember when the U.S. was at the forefront of biotech innovation?”