Genetically Sequencing Healthy Babies Yielded Surprising Results
Today in Melrose, Massachusetts, Cora Stetson is the picture of good health, a bubbly, precocious 2-year-old. But Cora has two separate mutations in the gene that produces a critical enzyme called biotinidase, and her body makes only 40 percent of the normal level of that enzyme.
That's enough to pass conventional newborn (heel-stick) screening, but it may not be enough for normal brain development, putting baby Cora at risk for seizures and cognitive impairment. But thanks to an experimental study in which Cora's DNA was sequenced after birth, the condition was discovered, and she is being treated with a safe and inexpensive vitamin supplement.
Stories like these are beginning to emerge from the BabySeq Project, the first clinical trial in the world to systematically sequence healthy newborn infants. The trial was led by my research group with funding from the National Institutes of Health. While still controversial, it points the way to a future in which adults, or even newborns, can receive comprehensive genetic analysis to determine their risk of future diseases and create opportunities to prevent them.
Some believe that medicine is still not ready for genomic population screening, but others feel it is long overdue. After all, the sequencing of the Human Genome Project was completed in 2003, and with this milestone, it became feasible to sequence and interpret the genome of any human being. The costs have come down dramatically since then; an entire human genome can now be sequenced for about $800, although bioinformatic and medical interpretation can add another $200 to $2,000, depending on the number of genes interrogated and the sophistication of the interpretive effort.
Two-year-old Cora Stetson, whose DNA sequencing after birth identified a potentially dangerous genetic mutation in time for her to receive preventive treatment.
(Photo courtesy of Robert Green)
The ability to sequence the human genome has yielded extraordinary benefits in scientific discovery, disease diagnosis, and targeted cancer treatment. But the use of genomic information to detect health risks in advance, to actually predict the medical future of an individual, has been mired in controversy and slow to materialize. In particular, the oft-cited vision that healthy infants could be genetically tested at birth in order to predict and prevent the diseases they would encounter has proven far tougher to implement than anyone anticipated.
But in the last few years, the dream of predicting and preventing diseases through genomics, starting in childhood, is finally within reach. Why did it take so long? And what remains to be done?
Great Expectations
Part of the problem was the unrealistic expectations that had been building for years in advance of the genomic science itself. For example, the 1997 film Gattaca portrayed a near future in which the lifetime risk of disease was readily predicted the moment an infant was born. In the fanfare that accompanied the completion of the Human Genome Project, the notion of predicting and preventing future disease in an individual became a powerful meme, used to inspire investment and public support for genomic research long before the tools were in place to make it happen.
Another part of the problem was the success of state-mandated newborn screening programs, which began in the 1960s with biochemical "heel-stick" tests that screen babies for metabolic disorders. These programs have worked beautifully, costing only a few dollars per baby and saving thousands of infants from death and severe cognitive impairment. It seemed only logical that a new technology like genome sequencing would add power and promise to such programs. But instead of embracing the notion of newborn sequencing, newborn screening laboratories have thus far rejected the entire idea as too expensive, too ambiguous, and too threatening to the comfortable constituency they had built within the public health framework.
"What can you find when you look as deeply as possible into the medical genomes of healthy individuals?"
Creating the Evidence Base for Preventive Genomics
Despite a number of obstacles, there are researchers who are exploring how to achieve the original vision of genomic testing as a tool for disease prediction and prevention. For example, in our NIH-funded MedSeq Project, we were the first to ask the question: "What can you find when you look as deeply as possible into the medical genomes of healthy individuals?"
Most people do not understand that genetic information comes in four separate categories: (1) dominant mutations putting the individual at risk for rare conditions like familial forms of heart disease or cancer, (2) recessive mutations putting the individual's children at risk for rare conditions like cystic fibrosis or PKU, (3) variants across the genome that can be tallied to construct polygenic risk scores for common conditions like heart disease or type 2 diabetes, and (4) variants that can influence drug metabolism or predict drug side effects such as the muscle pain that occasionally occurs with statin use.
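To make the third category concrete, here is a minimal sketch of how a polygenic risk score is tallied. The variant names, effect sizes, and genotypes below are hypothetical placeholders; real scores combine thousands to millions of variants, with weights estimated from genome-wide association studies.

```python
# Minimal, hypothetical sketch of tallying a polygenic risk score (PRS).
# Variant IDs, effect sizes, and genotypes are invented for illustration only.

# Effect size (e.g., log-odds per risk allele) reported for each variant
effect_sizes = {"rs0001": 0.12, "rs0002": -0.05, "rs0003": 0.30}

# Number of risk alleles (0, 1, or 2) this individual carries at each variant
genotype = {"rs0001": 2, "rs0002": 1, "rs0003": 0}

# The score is simply the weighted sum of risk-allele counts across the genome
prs = sum(weight * genotype.get(variant, 0) for variant, weight in effect_sizes.items())
print(f"Polygenic risk score: {prs:.2f}")  # higher values suggest greater estimated risk
```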
The technological and analytical challenges of our study were formidable, because we decided to systematically interrogate over 5,000 disease-associated genes and report results in all four categories of genetic information directly to the primary care physician of each of our volunteers. We enrolled 200 adults and found that everyone who was sequenced had medically relevant polygenic and pharmacogenomic results, over 90 percent carried recessive mutations that could have been important to reproduction, and an extraordinary 14.5 percent carried dominant mutations for rare genetic conditions.
A few years later we launched the BabySeq Project. In this study, we restricted the gene list to conditions with childhood or adolescent onset for which early warning could provide medical benefit, and even so, we found that 9.4 percent carried dominant mutations for rare conditions.
At first, our interpretation of the high proportion of apparently healthy individuals with dominant mutations for rare genetic conditions was simple: these conditions had lower "penetrance" than anticipated; in other words, only a small proportion of those who carried the dominant mutation would ever get the disease. If this interpretation were to hold, then genetic risk information might be far less useful than we had hoped.
Suddenly the information available in the genome of even an apparently healthy individual is looking more robust, and the prospect of preventive genomics is looking feasible.
But then we circled back to each adult or infant to examine and test them for any possible features of the rare disease in question. When we did this, we were surprised to find that over a quarter of those carrying such mutations already showed subtle signs of the disease that had not even been suspected! Now our interpretation was different: we believe that genetic risk may be responsible for subclinical disease in a much higher proportion of people than has ever been suspected.
Meanwhile, colleagues of ours have been demonstrating that detailed analysis of polygenic risk scores can identify individuals at high risk for common conditions like heart disease. So, adding up the medically relevant results in any given genome, we start to see that you can learn your risk of a rare monogenic condition, of a common polygenic condition, of a bad reaction to a drug you might take in the future, or of having a child with a devastating recessive condition. Suddenly the information available in the genome of even an apparently healthy individual is looking more robust, and the prospect of preventive genomics is looking feasible.
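As a rough illustration of what identifying high-risk individuals can mean in practice, the sketch below flags people in the upper tail of a score distribution. The simulated scores and the 95th-percentile cutoff are arbitrary choices for demonstration, not values from any published study.

```python
# Hypothetical sketch: flagging the top of a polygenic risk score distribution
# as "high risk." Scores are simulated; the 95th-percentile cutoff is arbitrary.
import numpy as np

rng = np.random.default_rng(0)
cohort_prs = rng.normal(loc=0.0, scale=1.0, size=10_000)  # simulated scores for a cohort

cutoff = np.percentile(cohort_prs, 95)   # threshold at the 95th percentile
high_risk = cohort_prs >= cutoff         # individuals flagged for closer follow-up
print(f"Cutoff {cutoff:.2f}; flagged {int(high_risk.sum())} of {cohort_prs.size} people")
```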
Preventive Genomics Arrives in Clinical Medicine
There is still considerable evidence to gather before we can recommend genomic screening for the entire population. For example, it is important to make sure that families who learn about such risks do not suffer harms or waste resources from excessive medical attention. And many doctors don't yet have guidance on how to use such information with their patients. But our research is convincing many people that preventive genomics is coming and that it will save lives.
In fact, we recently launched a Preventive Genomics Clinic at Brigham and Women's Hospital, where information-seeking adults can obtain predictive genomic testing with the highest quality interpretation and medical context, and be coached over time, in light of their disease risks, toward healthier outcomes. Insurance doesn't yet cover such testing, so for now patients must pay out of pocket, but they can choose from a menu of genetic screening tests, all of which are more comprehensive than consumer-facing products. Genetic counseling is available but optional. So far, this service is for adults only, but sequencing for children will surely follow soon.
As the costs of sequencing and other omics technologies continue to decline, we will see both responsible and irresponsible marketing of genetic testing, and we will need to guard against unscientific claims. But at the same time, mainstream medicine must be far more imaginative and fast-moving than it has been to date in order to claim the emerging benefits of preventive genomics, where it is now clear that suffering can be averted and lives can be saved. The future has arrived if we are bold enough to grasp it.
Funding and Disclosures:
Dr. Green's research is supported by the National Institutes of Health, the Department of Defense and through donations to The Franca Sozzani Fund for Preventive Genomics. Dr. Green receives compensation for advising the following companies: AIA, Applied Therapeutics, Helix, Ohana, OptraHealth, Prudential, Verily and Veritas; and is co-founder and advisor to Genome Medical, Inc, a technology and services company providing genetics expertise to patients, providers, employers and care systems.
Scientists and Religious Leaders Need to Be More Transparent
[Editor's Note: This essay is in response to our current Big Question series: "How can the religious and scientific communities work together to foster a culture that is equipped to face humanity's biggest challenges?"]
As a Jesuit Catholic priest and a molecular geneticist, I have lived with this question throughout my adult life. But first, let me address an issue that our American culture continues to struggle with: how do science and religion actually relate to each other? Is science about the "real" world, and religion just about individual or group beliefs about how the world should be?
Or are science and religion in direct competition with both trying to construct explanations of reality that are "better" or more real than the other's approach? These questions have generated much discussion among scientists, philosophers, and theologians.
The recent advances in our understanding of genetics show how combining the insights of science and religion can be beneficial.
First, we need to be clear that science and religion are two different ways in which human beings understand reality. Science focuses on the observable, quantifiable, physical aspects of our universe, whereas religion, while taking physical reality into consideration, also includes the immaterial, non-quantifiable human experiences and concepts that relate to the meaning and purpose of existence. While scientific discoveries often stimulate such profound reflections, these reflections are not technically part of the scientific method.
Second, though different in both method and focus, neither way of understanding reality produces a more "real" or accurate comprehension of our human existence. In fact, most often both science and religion add valuable insights into any particular situation, providing a more complete understanding of it as well as how it might be improved.
The recent advances in our understanding of genetics show how combining the insights of science and religion can be beneficial. For instance, the study of genetic differences among people around the world has shown us that the idea that we could accurately classify people as belonging to different races—African, Caucasian, Asian, and so on—is actually quite incorrect on a biological level. In fact, in many ways two people who appear to be of different races, perhaps African and Caucasian, could be more similar genetically than two people who appear to be of the same African race.
This scientific finding, then, challenges us to critically review the social categories some use to classify people as different from us and, therefore, somehow of less worth to society. From this perspective, one could argue that this scientific insight synergizes well with a common religious belief: the fundamental equality of all people in their relationship to the Divine.
However, this synergy between science and religion is not what we encounter most often in the mass media or public policy debates. In part, this is due to the fact that science and religion working well together is not normally considered newsworthy. What does get attention is when science appears to conflict with religion, or, perhaps more accurately, when the scientific community conflicts with religious communities regarding how a particular scientific advance should be applied. These disagreements usually are not due to a conflict between scientific findings and religious beliefs, but rather between differing moral, social or political agendas.
One way that the two sides can work together is to prioritize honesty and accuracy in public debates instead of crafting informational campaigns to promote political advantage.
For example, genetically modified foods have been a source of controversy for the past several decades. While the various techniques used to create targeted genetic changes in plants—for example, to confer drought or pest resistance—are scientifically intricate, explaining these techniques to the public is similar to explaining complex medical treatments to patients. Hence, the science alone is not the issue.
The controversy arises from the differing goals various stakeholders have for this technology. Obviously, companies employing this technology want it used around the world, both for the significant improvements in food production it offers and for the revenue it generates. Opponents, who have included religious communities, focus more on the social and cultural disruption this technology can create. Since a public debate between a complex technology on one side and a complex social situation on the other is difficult to conduct well, the controversy has too often been reduced to sound bites such as "Frankenfoods." While such phrases may be an effective way to influence public opinion, ultimately they work against sensible decision-making.
One way that the two sides can work together is to prioritize honesty and accuracy in public debates instead of crafting informational campaigns to promote political advantage. I recognize that presenting a thorough and honest explanation of an organization's position does not fit easily into our 24-hour-a-day sound-bite system, but it is necessary if we want to make the best decisions we can and foster a healthier and happier world.
Climate change and human genome editing are good examples of this problem. These are both complex issues with impacts that extend well beyond just science and religious beliefs—including economics, societal disruption, and an exacerbation of social inequalities. To achieve solutions that result in significant benefits for the vast majority of people, we must work to create a knowledgeable public that is encouraged to consider the good of both one's own community as well as that of all others. This goal is actually one that both scientific and religious organizations claim to value and pursue.
The experts often fail to understand sufficiently what the public hopes, wants, and fears.
Unfortunately, both types of organizations often fall short because they focus only on informing and instructing instead of truly engaging the public in deliberation. Often both scientists and religious leaders believe that the public is not capable of sufficiently understanding the complexities of the issues, so they resort to assuming that the public should just do what the experts tell them.
However, there is significant research that demonstrates the ability of the general public to grasp complex issues in order to make sound decisions. Hence, it is the experts who often fail to understand how their messages are being received and what the public hopes, wants, and fears.
Overall, I remain sanguine about the likelihood of both religious and scientific organizations learning how to work better with each other, and together with the public. Working together for the good of all, we can integrate the insights and the desires of all stakeholders in order to face our challenges with well-informed reason and compassion for all, particularly those most in need.
[Ed. Note: Don't miss the other perspectives in this Big Question series, from a science scholar and a Rabbi/M.D.]
Scientists: Don’t Leave Religious Communities Out in the Cold
[Editor's Note: This essay is in response to our current Big Question series: "How can the religious and scientific communities work together to foster a culture that is equipped to face humanity's biggest challenges?"]
I humbly submit that the question should be rephrased: How can the religious and scientific communities NOT work together to face humanity's biggest challenges? The stakes are higher than ever before, and we simply cannot afford to go it alone.
I believe in evolution -- the evolution of the relationship of science and religion.
The future of the world depends on our collaboration. I believe in evolution -- the evolution of the relationship of science and religion. Science and religion have lived in varying relationships, ranging from peaceful coexistence to outright warfare. Today we have evolved and have begun to embrace the biological relationship of mutualism, due in part to the advances in medicine and science.
Previous scientific discoveries and paradigm shifts precipitated varying theological responses. With Copernicus, we grappled with the relationship of the earth to the universe. With Darwin, we re-evaluated the relationship of man to the other creatures on earth. However, as theologically complex as these debates were, they had no practical relevance to the common man. Indeed, it was possible for people to live their entire lives happily without pondering these issues.
In the 21st century, the microscope is homing in further, with discoveries relating to the very nature and composition of the human being, both body and mind/soul. Thus, as opposed to the past, the implications of the latest scientific advances directly affect the common man. The religious implications are not left to ivory tower theologians. Regular people are now confronted with practical religious questions previously unimagined.
For example, in the field of infertility, if a married woman undergoes donor insemination, is she considered an adulteress? If a woman of one faith gestates the child of another faith, to whose faith does the child belong? If your heart is failing, can you avail yourself of stem cells derived from human embryos, or would you be considered an accomplice to murder? Would it be preferable to use artificially derived stem cells if they are available?
The implications of our current debates are profound, and profoundly personal. Science is the great equalizer. Every living being can potentially benefit from medical advances. We are all consumers of the scientific advances, irrespective of race or religion. As such, we all deserve a say in their development.
If the development of the science is collaborative, surely the contemplation of its ethical/religious applications should likewise be.
With gene editing, uterus transplants, head transplants, artificial reproductive seed, and animal-human genetic combinations as daily headlines, we have myriad ethical dilemmas to ponder. What limits should we set for the uses of different technologies? How should they be financed? We must even confront the very definition of what it means to be human. A human could receive multiple artificial transplants, 3D printed organs, genetic derivatives, or organs grown in animals. When does a person become another person or lose his identity? Will a being produced entirely from synthetic DNA be human?
In the Middle Ages, it was possible for one person to master all of the known science, and sometimes religion as well, as the great Maimonides did. Until relatively recently, discoveries were almost always attributed to one individual: Jenner, Lister, Koch, Pasteur, and so on. Today, it is impossible for any one human being to master medicine, let alone ethics, religion, and other fields. Advances are usually made not by one person but through collaboration, often involving hundreds, if not thousands, of people across the globe. We cite journal articles, not individuals. Furthermore, the magnitude and speed of development is staggering. Add artificial intelligence, and it will continue to expand exponentially.
If the development of the science is collaborative, surely the contemplation of its ethical/religious applications should likewise be. The issues are so profound that we need all genes on deck. The religious community should have a prominent seat at the table. There is great wisdom in the religious traditions that can inform contemporary discussions. In addition, the religious communities are significant consumers of, not to mention contributors to, the medical technology.
An ongoing dialogue between the scientific and religious communities should be an institutionalized endeavor, not a sporadic event reactive to a particular discovery. The National Institutes of Health or other national organizations could provide an online newsletter designed for the clergy with a summary of the latest developments and their potential applications. An annual meeting of scientists and religious leaders could provide a forum for the scientists to appreciate the religious ramifications of their research (of which there may be none) and for the clergy to appreciate the rapidly developing fields of science and their implications for congregants. Theological seminaries must include basic scientific literacy as part of their curricula.
We need the proper medium of mutual respect and admiration, despite healthy disagreement.
How do we create a "culture"? Microbiological cultures take time and require the proper medium for maximal growth. If one of the variables is altered, the culture can be affected. To foster a culture of continued successful collaboration between scientists and religious communities, we likewise need the proper medium of mutual respect and admiration, despite healthy disagreement.
The only way we can navigate these uncharted waters is through constant, deep, and meaningful collaboration every single step of the way. By cultivating a mutualistic relationship, we can inform, caution, and safeguard each other to maximize the benefits of emerging technologies.
[Ed. Note: Don't miss the other perspectives in this Big Question series, from a science scholar and a Reverend/molecular geneticist.]