Genetically Sequencing Healthy Babies Yielded Surprising Results
Today in Melrose, Massachusetts, Cora Stetson is the picture of good health, a bubbly, precocious 2-year-old. But Cora has two separate mutations in the gene that produces a critical enzyme called biotinidase, and her body produces only 40 percent of the normal level of that enzyme.
That's enough to pass conventional newborn (heel-stick) screening, but it may not be enough for normal brain development, putting baby Cora at risk for seizures and cognitive impairment. Thanks to an experimental study in which Cora's DNA was sequenced after birth, however, the condition was discovered, and she is now being treated with a safe and inexpensive vitamin supplement.
Stories like these are beginning to emerge from the BabySeq Project, the first clinical trial in the world to systematically sequence healthy newborn infants. The trial was led by my research group with funding from the National Institutes of Health. While still controversial, it points the way to a future in which adults, or even newborns, can receive comprehensive genetic analysis to determine their risk of future diseases and create opportunities to prevent them.
Some believe that medicine is still not ready for genomic population screening, but others feel it is long overdue. After all, the sequencing of the Human Genome Project was completed in 2003, and with this milestone it became feasible to sequence and interpret the genome of any human being. Costs have come down dramatically since then; an entire human genome can now be sequenced for about $800, although bioinformatic and medical interpretation can add another $200 to $2,000, depending upon the number of genes interrogated and the sophistication of the interpretive effort.
Two-year-old Cora Stetson, whose DNA sequencing after birth identified a potentially dangerous genetic mutation in time for her to receive preventive treatment.
(Photo courtesy of Robert Green)
The ability to sequence the human genome has yielded extraordinary benefits in scientific discovery, disease diagnosis, and targeted cancer treatment. But the use of genomic testing to detect health risks in advance, to actually predict the medical future of an individual, has been mired in controversy and slow to materialize. In particular, the oft-cited vision that healthy infants could be genetically tested at birth in order to predict and prevent the diseases they would encounter has proven far tougher to implement than anyone anticipated.
But in the last few years, the dream of predicting and preventing diseases through genomics, starting in childhood, is finally within reach. Why did it take so long? And what remains to be done?
Great Expectations
Part of the problem was the unrealistic expectations that had been building for years in advance of the genomic science itself. For example, the 1997 film Gattaca portrayed a near future in which the lifetime risk of disease was readily predicted the moment an infant was born. In the fanfare that accompanied the completion of the Human Genome Project, the notion of predicting and preventing future disease in an individual became a powerful meme, one used to inspire investment and public support for genomic research long before the tools were in place to make it happen.
Another part of the problem was the success of state-mandated newborn screening programs, which began in the 1960s with biochemical "heel-stick" tests for metabolic disorders. These programs have worked beautifully, costing only a few dollars per baby and saving thousands of infants from death and severe cognitive impairment. It seemed only logical that a new technology like genome sequencing would add power and promise to such programs. But instead of embracing the notion of newborn sequencing, newborn screening laboratories have thus far rejected the entire idea as too expensive, too ambiguous, and too threatening to the comfortable constituency they had built within the public health framework.
"What can you find when you look as deeply as possible into the medical genomes of healthy individuals?"
Creating the Evidence Base for Preventive Genomics
Despite a number of obstacles, there are researchers who are exploring how to achieve the original vision of genomic testing as a tool for disease prediction and prevention. For example, in our NIH-funded MedSeq Project, we were the first to ask the question: "What can you find when you look as deeply as possible into the medical genomes of healthy individuals?"
Most people do not understand that genetic information comes in four separate categories: (1) dominant mutations putting the individual at risk for rare conditions like familial forms of heart disease or cancer; (2) recessive mutations putting the individual's children at risk for rare conditions like cystic fibrosis or PKU; (3) variants across the genome that can be tallied to construct polygenic risk scores for common conditions like heart disease or type 2 diabetes; and (4) variants that can influence drug metabolism or predict drug side effects, such as the muscle pain that occasionally occurs with statin use.
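To make category (3) concrete, here is a minimal sketch in Python of how a polygenic risk score is tallied. It is illustrative only: the variant names and effect weights are hypothetical placeholders, not values from MedSeq, BabySeq, or any published score. The idea is a weighted sum: multiply each variant's effect size by the number of risk alleles a person carries, then add up the products.

```python
# Illustrative polygenic risk score (PRS) tally.
# Variant names and effect weights are hypothetical placeholders,
# not values from any real study or published score.

# Effect size (e.g., a log odds ratio) for the risk allele at each variant.
EFFECT_WEIGHTS = {
    "variant_A": 0.12,
    "variant_B": 0.08,
    "variant_C": -0.05,  # a negative weight means this allele is protective
}

# Number of copies (0, 1, or 2) of each risk allele that one person carries.
GENOTYPE = {
    "variant_A": 2,
    "variant_B": 0,
    "variant_C": 1,
}

def polygenic_risk_score(weights, genotype):
    """Sum of (effect weight x allele count) across all scored variants."""
    return sum(w * genotype.get(variant, 0) for variant, w in weights.items())

print(f"PRS = {polygenic_risk_score(EFFECT_WEIGHTS, GENOTYPE):+.2f}")
# Prints: PRS = +0.19  (2 x 0.12 + 0 x 0.08 + 1 x -0.05)
```

Real scores tally thousands to millions of variants and are calibrated against population distributions, but the underlying arithmetic is the same.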
The technological and analytical challenges of our study were formidable because we decided to systematically interrogate over 5,000 disease-associated genes and report results in all four categories of genetic information directly to the primary care physicians of our volunteers. We enrolled 200 adults and found that everyone sequenced had medically relevant polygenic and pharmacogenomic results, over 90 percent carried recessive mutations that could be important for reproduction, and an extraordinary 14.5 percent carried dominant mutations for rare genetic conditions.
A few years later we launched the BabySeq Project. In this study, we restricted the panel to genes for conditions with childhood or adolescent onset that could benefit medically from early warning, and even so, we found that 9.4 percent of infants carried dominant mutations for rare conditions.
At first, our interpretation of the high proportion of apparently healthy individuals with dominant mutations for rare genetic conditions was simple: these conditions had lower "penetrance" than anticipated; in other words, only a small proportion of those who carried the dominant mutation would get the disease. If this interpretation were to hold, then genetic risk information might be far less useful than we had hoped.
But then we circled back to each adult or infant in order to examine and test them for any possible features of the rare disease in question. When we did this, we were surprised to see that over a quarter of those carrying such mutations already showed subtle signs of the disease that had not even been suspected! Our interpretation changed: we now believe that genetic risk may be responsible for subclinical disease in a much higher proportion of people than has ever been suspected.
Meanwhile, colleagues of ours have been demonstrating that detailed analysis of polygenic risk scores can identify individuals at high risk for common conditions like heart disease. Adding up the medically relevant results in any given genome, we start to see that you can learn your risk of a rare monogenic condition, of a common polygenic condition, of a bad reaction to a drug you might take in the future, or of having a child with a devastating recessive condition. Suddenly the information available in the genome of even an apparently healthy individual is looking more robust, and the prospect of preventive genomics is looking feasible.
Preventive Genomics Arrives in Clinical Medicine
There is still considerable evidence to gather before we can recommend genomic screening for the entire population. For example, it is important to make sure that families who learn about such risks do not suffer harms or waste resources from excessive medical attention. And many doctors don't yet have guidance on how to use such information with their patients. But our research is convincing many people that preventive genomics is coming and that it will save lives.
In fact, we recently launched a Preventive Genomics Clinic at Brigham and Women's Hospital, where information-seeking adults can obtain predictive genomic testing with the highest-quality interpretation and medical context, and can be coached over time, in light of their disease risks, toward healthier outcomes. Insurance doesn't yet cover such testing, so patients must pay out of pocket for now, but they can choose from a menu of genetic screening tests, all of which are more comprehensive than consumer-facing products. Genetic counseling is available but optional. So far, this service is for adults only, but sequencing for children will surely follow soon.
As the costs of sequencing and other omics technologies continue to decline, we will see both responsible and irresponsible marketing of genetic testing, and we will need to guard against unscientific claims. At the same time, mainstream medicine must be far more imaginative and fast-moving than it has been to date in order to claim the emerging benefits of preventive genomics, where it is now clear that suffering can be averted and lives can be saved. The future has arrived if we are bold enough to grasp it.
Funding and Disclosures:
Dr. Green's research is supported by the National Institutes of Health, the Department of Defense and through donations to The Franca Sozzani Fund for Preventive Genomics. Dr. Green receives compensation for advising the following companies: AIA, Applied Therapeutics, Helix, Ohana, OptraHealth, Prudential, Verily and Veritas; and is co-founder of and advisor to Genome Medical, Inc., a technology and services company providing genetics expertise to patients, providers, employers and care systems.
Out of Thin Air: A Fresh Solution to Farming’s Water Shortages
California has been plagued by perilous droughts for decades. Freshwater shortages have sparked raging wildfires and killed fruit and vegetable crops. And California is not alone in facing the danger of running out of water for farming; parts of the Southwest, including Texas, are battling severe drought conditions, according to the North American Drought Monitor. These two states alone account for 316,900 of the 2 million total U.S. farms.
But even as farming becomes more vulnerable due to water shortages, the world's demand for food is projected to increase 70 percent by 2050, according to Guihua Yu, an associate professor of materials science at The University of Texas at Austin.
"Water is the most limiting natural resource for agricultural production because of the freshwater shortage and enormous water consumption needed for irrigation," Yu said.
As scientists have searched for solutions, an alternative water supply has been hiding in plain sight: water vapor in the atmosphere. It is abundant, available, and endlessly renewable, waiting only for technological innovation and necessity to converge and make it fit for use. Now, new super-moisture-absorbent gels developed by Yu and a team of researchers can pull that moisture from the air and bring it into soil, potentially expanding the map of farmable land around the globe to dry and remote regions that suffer from water shortages.
"This opens up opportunities to turn those previously poor-quality or inhospitable lands to become useable and without need of centralized water and power supplies," Yu said.
A renewable source of freshwater
The hydrogels are a gelatin-like substance made from synthetic materials. The gels activate in cooler, humid overnight periods and draw water from the air. During a four-week experiment, Yu's team observed that soil with these gels provided enough water to support seed germination and plant growth without an additional liquid water supply. And the soil was able to maintain the moist environment for more than a month, according to Yu.
The super-absorbent gels developed at The University of Texas at Austin.
Xingyi Zhou, UT Austin
"It is promising to liberate underdeveloped and drought areas from the long-distance water and power supplies for agricultural production," Yu said.
Crops also rely on fertilizer to maintain soil fertility and increase yields, but fertilizer is easily lost through leaching. The resulting runoff increases agricultural costs and contributes to environmental pollution. The interaction between the gels and agrochemicals offers slow, controlled fertilizer release, helping to maintain the balance between the plant's roots and the soil.
The possibilities are endless
Harvesting atmospheric water is exciting on multiple fronts. The super-moisture-absorbent gel can also be used to passively cool solar panels. Solar radiation is the magic behind the process. Overnight, as temperatures cool, the gels absorb water hanging in the atmosphere. The moisture is stored inside the gels until the thermometer rises. Heat from the sun serves as the faucet that turns the gels on, so they release the stored water and cool down the panels. Effective cooling of the solar panels is important for sustainable long-term power generation.
In addition to agricultural uses and cooling for energy devices, atmospheric water harvesting technologies could even reach people's homes.
"They could be developed to enable easy access to drinking water through individual systems for household usage," Yu said.
Next steps
Yu and the team are now focused on affordability and on developing practical applications. The goal is to optimize the gel materials to achieve higher levels of water uptake from the atmosphere.
"We are exploring different kinds of polymers and solar absorbers while exploring low-cost raw materials for production," Yu said.
The ability to transform atmospheric water vapor into a cheap and plentiful water source would be a game-changer. One day in the not-too-distant future, if climate change intensifies and droughts worsen, this innovation may become vital to our very survival.
On the morning of April 12, 1955, newsrooms across the United States inked headlines onto newsprint: the Salk polio vaccine was "safe, effective, and potent." This was long-awaited news. Americans had limped through decades of fear, unaware of what caused polio or how to cure it, faced with the disease's terrifying, visible power to paralyze and kill, particularly children.
The announcement of the polio vaccine was celebrated with noisy jubilation: church bells rang, factory whistles sounded, people wept in the streets. Within weeks, mass inoculation began as the nation put its faith in a vaccine that would end polio.
Today, most of us are blissfully ignorant of child polio deaths, making it easier to believe that we have not personally benefited from the development of vaccines. According to Dr. Steven Pinker, cognitive psychologist and author of the bestselling book Enlightenment Now, we've become blasé about the gifts of science. "The default expectation is not that disease is part of life and science is a godsend, but that health is the default, and any disease is some outrage," he says.
We're now in the early stages of another vaccine rollout, one we hope will end the ravages of the COVID-19 pandemic. And yet, the Pfizer, Moderna, and AstraZeneca vaccines have been met with far greater hesitancy and skepticism than the polio vaccine was in the 1950s.
In 2021, concerns over the speed and safety of vaccine development and technology plague this heroic global effort, but the roots of vaccine hesitancy run far deeper. Vaccine hesitancy has always existed in the U.S., even in the polio era, motivated in part by fears around "living virus" in a bad batch of vaccines produced by Cutter Laboratories in 1955. But in the last half century, we've witnessed seismic cultural shifts—loss of public trust, a rise in misinformation, heightened racial and socioeconomic inequality, and political polarization have all intensified vaccine-related fears and resistance. Making sense of how we got here may help us understand how to move forward.
The Rise and Fall of Public Trust
When the polio vaccine was released in 1955, "we were nearing an all-time high point in public trust," says Matt Baum, Harvard Kennedy School professor and lead author of several reports measuring public trust and vaccine confidence. Baum explains that the U.S. was experiencing a post-war boom following the Allied triumph in WWII, a popular Roosevelt presidency, and the rapid innovation that elevated the country to an international superpower.
The 1950s witnessed the emergence of nuclear technology, a space program, and unprecedented medical breakthroughs, adds Emily Brunson, a Texas State University anthropologist and co-chair of the Working Group on Readying Populations for COVID-19 Vaccine. "Antibiotics were a game changer," she states. Where before people might be sick with pneumonia for a month, suddenly they had access to pills that accelerated recovery.
During this period, science seemed to hold all the answers; people embraced the idea that we could "come to know the world with an absolute truth," Brunson explains. Doctors were portrayed as unquestioned gods, so Americans were primed to trust experts who told them the polio vaccine was safe.
"The emotional tone of the news has gone downward since the 1940s, and journalists consider it a professional responsibility to cover the negative."
That blind acceptance eroded in the 1960s and 70s as people came to understand that science can be inherently political. "Getting to an absolute truth works out great for white men, but these things affect people socially in radically different ways," Brunson says. As the culture began questioning the white, patriarchal biases of science, doctors lost their god-like status and experts were pushed off their pedestals. This trend continues with greater intensity today, as President Trump has led a campaign against experts and waged a war on science that began long before the pandemic.
The Shift in How We Consume Information
In the 1950s, the media created an informational consensus. The fundamental ideas the public consumed about the state of the world were unified. "People argued about the best solutions, but didn't fundamentally disagree on the factual baseline," says Baum. Indeed, the messaging around the polio vaccine was centralized and consistent, led by the March of Dimes, the successful anti-polio crusade founded by President Roosevelt. People of lower socioeconomic status with limited access to this information were less likely to have confidence in the vaccine, but most people consumed media that assured them of the vaccine's safety and mobilized them to receive it.
Today, the information we consume is no longer centralized. In fact, it is just the opposite. "When you take that away, it's hard for people to know what to trust and what not to trust," Baum explains. We've witnessed a rise in polarization, along with technology that makes it easier to give people what they want to hear, feeding the human tendencies to vilify the other side and cling to our preexisting ideas. When information is engineered to further an agenda, each choice and risk calculation made while navigating the COVID-19 pandemic becomes deeply politicized.
This polarization maps onto a rise in socioeconomic inequality and economic uncertainty. These factors, associated with a sense of lost control, prime people to embrace misinformation, explains Baum, especially when the situation is difficult to comprehend. "The beauty of conspiratorial thinking is that it provides answers to all these questions," he says. Today's insidious fragmentation of news media accelerates the circulation of mis- and disinformation, reaching more people faster, regardless of veracity or motivation. In the case of vaccines, this intensifies skepticism about their origin, their safety, and the motivations behind them.
Alongside the rise in polarization, Pinker says "the emotional tone of the news has gone downward since the 1940s, and journalists consider it a professional responsibility to cover the negative." Relentless focus on everything that goes wrong further erodes public trust and paints a picture of the world getting worse. "Life saved is not a news story," says Pinker, but perhaps it should be, he continues. "If people were more aware of how much better life was generally, they might be more receptive to improvements that will continue to make life better. These improvements don't happen by themselves."
The Future Depends on Vaccine Confidence
So far, the U.S. has been unable to mitigate the catastrophic effects of the pandemic through social distancing, testing, and contact tracing. President Trump has downplayed the effects and threat of the virus, censored experts and scientists, given up on containing the spread, and mobilized his base to protest masks. The Trump Administration failed to devise a national plan, so our national plan has defaulted to hoping for the "miracle" of a vaccine. And they are "something of a miracle," Pinker says, describing vaccines as "the most benevolent invention in the history of our species." In record-breaking time, three vaccines have arrived. But their impact will be weakened unless we achieve mass vaccination. As Brunson notes, "The technology isn't the fix; it's people taking the technology."
Significant challenges remain, including facilitating widespread access and supporting on-the-ground efforts to allay concerns and build trust with populations that have historic reasons for distrust, says Brunson. Baum predicts continuing delays, as well as deaths from unrelated causes that will nonetheless be blamed on the vaccine.
Still, there's every reason for hope. The new administration "has its eyes wide open to these challenges. These are the kind of problems that are amenable to policy solutions if we have the will," Baum says. He forecasts widespread vaccination by late summer and a bounce back from the economic damage, a "Good News Story" that will bolster vaccine acceptance in the future. And Pinker reminds us that science, medicine, and public health have greatly extended our lives in the last few decades, a trend that can only continue if we're willing to roll up our sleeves.