Genetically Sequencing Healthy Babies Yielded Surprising Results
Today in Melrose, Massachusetts, Cora Stetson is the picture of good health, a bubbly, precocious 2-year-old. But Cora has two separate mutations in the gene that produces a critical enzyme called biotinidase, and her body makes only 40 percent of the normal level of that enzyme.
That's enough to pass conventional newborn (heel-stick) screening, but it may not be enough for normal brain development, putting baby Cora at risk for seizures and cognitive impairment. Thanks to an experimental study in which Cora's DNA was sequenced after birth, the condition was discovered, and she is now being treated with a safe and inexpensive vitamin supplement.
Stories like these are beginning to emerge from the BabySeq Project, the first clinical trial in the world to systematically sequence healthy newborn infants. This trial was led by my research group with funding from the National Institutes of Health. While still controversial, it is pointing the way to a future in which adults, or even newborns, can receive comprehensive genetic analysis to determine their risk of future diseases and open opportunities to prevent them.
Some believe that medicine is still not ready for genomic population screening, but others feel it is long overdue. After all, the sequencing of the Human Genome Project was completed in 2003, and with this milestone it became feasible to sequence and interpret the genome of any human being. The costs have come down dramatically since then; an entire human genome can now be sequenced for about $800, although bioinformatic and medical interpretation can add another $200 to $2,000, depending on the number of genes interrogated and the sophistication of the interpretive effort.
Two-year-old Cora Stetson, whose DNA sequencing after birth identified a potentially dangerous genetic mutation in time for her to receive preventive treatment.
(Photo courtesy of Robert Green)
The ability to sequence the human genome yielded extraordinary benefits in scientific discovery, disease diagnosis, and targeted cancer treatment. But the ability of genomes to detect health risks in advance, to actually predict the medical future of an individual, has been mired in controversy and slow to manifest. In particular, the oft-cited vision that healthy infants could be genetically tested at birth in order to predict and prevent the diseases they would encounter has proven far tougher to implement than anyone anticipated.
But in the last few years, the dream of predicting and preventing diseases through genomics, starting in childhood, is finally within reach. Why did it take so long? And what remains to be done?
Great Expectations
Part of the problem was the unrealistic expectations that had been building for years in advance of the genomic science itself. For example, the 1997 film Gattaca portrayed a near future in which the lifetime risk of disease was readily predicted the moment an infant was born. In the fanfare that accompanied the completion of the Human Genome Project, the notion of predicting and preventing future disease in an individual became a powerful meme that was used to inspire investment and public support for genomic research long before the tools were in place to make it happen.
Another part of the problem was the success of state-mandated newborn screening programs, which began in the 1960s with biochemical "heel-stick" tests for babies with metabolic disorders. These programs have worked beautifully, costing only a few dollars per baby and saving thousands of infants from death and severe cognitive impairment. It seemed only logical that a new technology like genome sequencing would add power and promise to such programs. But instead of embracing the notion of newborn sequencing, newborn screening laboratories have thus far rejected the entire idea as too expensive, too ambiguous, and too threatening to the comfortable constituency they had built within the public health framework.
Creating the Evidence Base for Preventive Genomics
Despite a number of obstacles, there are researchers who are exploring how to achieve the original vision of genomic testing as a tool for disease prediction and prevention. For example, in our NIH-funded MedSeq Project, we were the first to ask the question: "What can you find when you look as deeply as possible into the medical genomes of healthy individuals?"
Most people do not understand that genetic information comes in four separate categories: (1) dominant mutations putting the individual at risk for rare conditions like familial forms of heart disease or cancer, (2) recessive mutations putting the individual's children at risk for rare conditions like cystic fibrosis or PKU, (3) variants across the genome that can be tallied to construct polygenic risk scores for common conditions like heart disease or type 2 diabetes, and (4) variants that can influence drug metabolism or predict drug side effects such as the muscle pain that occasionally occurs with statin use.
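To make category (3) concrete: a polygenic risk score is, at its core, a weighted sum over many variants, where each weight is an effect size estimated in a large association study. The sketch below is purely illustrative and is not drawn from the MedSeq or BabySeq analysis pipelines; the variant IDs and effect weights are invented for demonstration, and real scores combine thousands to millions of variants.

```python
# Illustrative sketch of a polygenic risk score: a weighted sum of
# risk-allele counts. Variant IDs and weights are hypothetical.

def polygenic_risk_score(genotypes, weights):
    """genotypes: variant ID -> number of risk alleles carried (0, 1, or 2).
    weights: variant ID -> effect size from an association study.
    Variants missing from the genotype data contribute zero."""
    return sum(weights[v] * genotypes.get(v, 0) for v in weights)

# Hypothetical example with three variants.
weights = {"rs0001": 0.12, "rs0002": -0.05, "rs0003": 0.30}
genotypes = {"rs0001": 2, "rs0002": 1, "rs0003": 0}

score = polygenic_risk_score(genotypes, weights)
print(round(score, 2))  # 0.19
```

In practice, such a raw score is then compared against a reference population's distribution, so that an individual can be told, for example, that they fall in the top few percent of genetic risk for a condition like heart disease.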
The technological and analytical challenges of our study were formidable, because we decided to systematically interrogate over 5,000 disease-associated genes and report results in all four categories of genetic information directly to the primary care physicians for each of our volunteers. We enrolled 200 adults and found that everyone who was sequenced had medically relevant polygenic and pharmacogenomic results, over 90 percent carried recessive mutations that could have been important to reproduction, and an extraordinary 14.5 percent carried dominant mutations for rare genetic conditions.
A few years later we launched the BabySeq Project. In this study, we restricted the number of genes to include only those with child/adolescent onset that could benefit medically from early warning, and even so, we found 9.4 percent carried dominant mutations for rare conditions.
At first, our interpretation of the high proportion of apparently healthy individuals with dominant mutations for rare genetic conditions was simple: these conditions had lower "penetrance" than anticipated; in other words, only a small proportion of those who carried the dominant mutation would get the disease. If this interpretation were to hold, then genetic risk information might be far less useful than we had hoped.
But then we circled back to each adult or infant to examine and test them for any possible features of the rare disease in question. When we did, we were surprised to find that over a quarter of those carrying such mutations already showed subtle, previously unsuspected signs of the disease! Our interpretation changed: we now believe that genetic risk may be responsible for subclinical disease in a much higher proportion of people than anyone has suspected.
Meanwhile, colleagues of ours have been demonstrating that detailed analysis of polygenic risk scores can identify individuals at high risk for common conditions like heart disease. So adding up the medically relevant results in any given genome, we start to see that you can learn your risks for a rare monogenic condition, a common polygenic condition, a bad effect from a drug you might take in the future, or for having a child with a devastating recessive condition. Suddenly the information available in the genome of even an apparently healthy individual is looking more robust, and the prospect of preventive genomics is looking feasible.
Preventive Genomics Arrives in Clinical Medicine
There is still considerable evidence to gather before we can recommend genomic screening for the entire population. For example, it is important to make sure that families who learn about such risks do not suffer harms or waste resources from excessive medical attention. And many doctors don't yet have guidance on how to use such information with their patients. But our research is convincing many people that preventive genomics is coming and that it will save lives.
In fact, we recently launched a Preventive Genomics Clinic at Brigham and Women's Hospital, where information-seeking adults can obtain predictive genomic testing with the highest-quality interpretation and medical context, along with ongoing coaching toward healthier outcomes in light of their disease risks. Insurance doesn't yet cover such testing, so patients must pay out of pocket for now, but they can choose from a menu of genetic screening tests, all of which are more comprehensive than consumer-facing products. Genetic counseling is available but optional. So far, this service is for adults only, but sequencing for children will surely follow soon.
As the costs of sequencing and other omics technologies continue to decline, we will see both responsible and irresponsible marketing of genetic testing, and we will need to guard against unscientific claims. But at the same time, mainstream medicine must be far more imaginative and fast-moving than it has been to date in order to claim the emerging benefits of preventive genomics, where it is now clear that suffering can be averted and lives can be saved. The future has arrived if we are bold enough to grasp it.
Funding and Disclosures:
Dr. Green's research is supported by the National Institutes of Health, the Department of Defense and through donations to The Franca Sozzani Fund for Preventive Genomics. Dr. Green receives compensation for advising the following companies: AIA, Applied Therapeutics, Helix, Ohana, OptraHealth, Prudential, Verily and Veritas; and is co-founder and advisor to Genome Medical, Inc, a technology and services company providing genetics expertise to patients, providers, employers and care systems.
Top Fertility Doctor: Artificially Created Sperm and Eggs "Will Become Normal" One Day
Imagine two men making a baby. Or two women. Or an infertile couple. Or an older woman whose eggs are no longer viable. None of these people could have a baby today without the help of an egg or sperm donor.
But in the future, it may be possible for them to reproduce using only their own genetic material, thanks to an emerging technology called IVG, or in vitro gametogenesis.
Researchers are learning how to reprogram adult human cells like skin cells to become lab-created egg and sperm cells, which could then be joined to form an embryo. In other words, cells scraped from the inside of your cheek could one day be manipulated to become either eggs or sperm, no matter your gender or your reproductive fitness.
In 2016, Japanese scientists proved that the concept could be successfully carried out in mice. Now some experts, like Dr. John Zhang, the founder and CEO of New Hope Fertility Center in Manhattan, say it's just "a matter of time" before the method is also made to work in humans.
Such a technological tour de force would upend our most basic assumptions about human reproduction and biology. Combined with techniques like gene editing, these tools could eventually enable prospective parents to have an unprecedented level of choice and control over their children's origins. It's a wildly controversial notion, and an especially timely one now that a Chinese scientist has announced the birth of the first allegedly CRISPR-edited babies. (The claims remain unverified.)
Zhang himself is no stranger to controversy. In 2016, he stunned the world when he announced the birth of a baby conceived using the DNA of three people, a landmark procedure intended to prevent the baby from inheriting a devastating neurological disease. (Zhang went to a clinic in Mexico to carry out the procedure because it is prohibited in the U.S.) Zhang's other achievements to date include helping a 49-year-old woman have a baby using her own eggs and restoring a young woman's fertility through an ovarian tissue transplant surgery.
Zhang recently sat down with our Editor-in-Chief in his New York office overlooking Columbus Circle to discuss the fertility world's latest provocative developments. Here are his top ten insights:
1) On a Chinese scientist's claim of creating the first CRISPR-edited babies:
I'm glad that we made a first move toward a clinical application of this technology for mankind. Somebody has to do this. Whether this was a good case or not, there is still time to find out.
Clearly it will be beneficial to mankind, but it's a matter of how and when the work is done. Like any scientific advance, it has to be done in a very responsible way.
Today's response is identical to when the world's first IVF baby was announced in 1978. The major news media didn't take it seriously, thought it was evil, and wanted to keep their distance from IVF. Many countries even abandoned IVF, but today you see it is a normal practice. And it took almost 40 years [for the researchers] to win a Nobel Prize.
I think we need more time to understand how this work was done medically, ethically, and let the scientist have the opportunity to present how it was done and let a scientific journal publish the paper. Before these become available, I don't think we should start being upset, scared, or giving harsh criticism.
2) On the international outcry in response to the news:
I feel we are in scientific shock, with many thinking it came too fast, too soon. We all embrace modern technology, but when something really comes along, we fear it. In an old Chinese saying, one of the masters always dreamed of seeing the dragon, and when the dragon really came, he got scared.
Dr. John Zhang, the founder and CEO of New Hope Fertility Center in Manhattan, pictured in his office.
3) On the Western world's perception that Chinese scientists sometimes appear to discount ethics in favor of speedy breakthroughs:
I think this perception is not fair. I don't think China is very casual. It's absolutely not what people think. I don't want people to feel that this case [of CRISPR-edited babies] will mean China has less standards over how human reproduction should be performed. Just because this happened, it doesn't mean in China you can do anything you want.
As far as the regulation of IVF clinics, China is probably the most strictly regulated of any country I know in this world.
4) On China's first public opinion poll gauging attitudes toward gene-edited babies, indicating that more than 60 percent of survey respondents supported using the technology to prevent inherited diseases, but not to enhance traits:
There is a sharp contrast between the general public and the professional world. Being a working health professional and an advocate of scientists working in this field, it is very important to be ethically responsible for what we are doing, but my own feeling is that from time to time we may not take into consideration what the patient needs.
5) On how the three-parent baby is doing today, several years after his birth:
No news is good news.
6) On the potentially game-changing research to develop artificial sperm and eggs:
First of all, I think that anything that's technically possible, as long as you are not harmful to other people or other societies, as long as you do it responsibly, and as long as this is a legitimate desire, will eventually become reality.
My research for now is really to try to overcome the very next obstacle in our field, which is how to let a lady age 44 or older have a baby with her own genetic material.
Practically 99 percent of women over age 43 will never make a baby on their own. And after age 47, we usually don't offer donor egg IVF anymore.
But with improved longevity and quality of life, the lifespan of females continues to increase. In Japan, the average for females is about 89 years old. So for more than half of your life, you will not be able to produce a baby, which is quite unusual in the animal kingdom. In most of the animal kingdom, reproductive life is very much the same as lifespan, but then you can argue that in the animal kingdom, unlike for a human being, it doesn't take such a long time to contribute to society, because once you know how to hunt and look for food, you're done.
But humans are different. You need to go to college, get certain skills. It takes 20 years to really bring a human being up to become useful to society. That's why the mom and dad are not supposed to have the same reproductive life equal to their real life.
I think this will become a major ethical debate: whether we should let an older lady have a baby at a very late state of her life and leave the future generation in a very vulnerable situation in which they may lack warm caring, proper guidance, and proper education.
7) On using artificial gametes to grant more reproductive choices to gays and lesbians:
I think it is totally possible to have two sperm make a baby, and two eggs make babies.
If we have two guys, one would have to produce eggs; if we have two girls, one would have to produce sperm. Basically you are creating artificial gametes, or converting gametes from sperm to egg or from egg to sperm. Which may not necessarily be very difficult. The key is to be able to do nuclear reprogramming.
So why can two sperm not make offspring now? You get exactly half of your genes from each parent. The genes have their own imprinting that say "made in mom," "made in dad." The two sperm would say "made in dad," "made in dad." If I can erase the "made in dad," and say "made in mom," then these sperm can make offspring.
8) On how close science is to creating artificial gametes for clinical use in pregnancies:
It's very hard to say until we accomplish it. It could be very quick. It could be it takes a long time. I don't want to speculate.
9) On whether there should be ethical red lines drawn by authorities or whether the decisions should be left to patients and scientists:
I think we cannot put a hundred percent of our trust in the scientist and the patient, but it should not be 100 percent the authorities, either. It should come from the whole of society.
10) On his expectations for the future:
We are living in a very exciting world. I think that all these technologies can really change the way of mankind and also are not just for baby-making. The research, the experience, the mechanism we learn from these technologies, they will shine some great lights into our long-held dream of being a healthy population that is cancer-free and lives a long life, let's say 120 years.
I think these technologies are the solid foundation just like when we designed the computer -- we never thought a computer would become the iPhone. Imagine making a computer 30 years ago, that this little chip will change your life.
Kira Peikoff was the editor-in-chief of Leaps.org from 2017 to 2021. As a journalist, her work has appeared in The New York Times, Newsweek, Nautilus, Popular Mechanics, The New York Academy of Sciences, and other outlets. She is also the author of four suspense novels that explore controversial issues arising from scientific innovation: Living Proof, No Time to Die, Die Again Tomorrow, and Mother Knows Best. Peikoff holds a B.A. in Journalism from New York University and an M.S. in Bioethics from Columbia University. She lives in New Jersey with her husband and two young sons. Follow her on Twitter @KiraPeikoff.
Black Participants Are Sorely Absent from Medical Research
After years of suffering from mysterious symptoms, my mother Janice Thomas finally found a doctor who correctly diagnosed her with two autoimmune diseases, lupus and Sjogren's syndrome. Both diseases are more prevalent in the black population than in other groups and are often misdiagnosed.
Like many chronic health conditions, a lack of understanding persists about their causes, individual manifestations, and best treatment strategies.
On the search for relief from chronic pain, my mother started researching options and decided to participate in clinical trials as a way to gain much-needed insights. In return, she received discounted medical testing and has played an active role in finding answers for all.
"When my doctor told me I could get financial or medical benefits from participating in clinical trials for the same test I was already doing, I figured it would be an easy way to get some answers at little to no cost," she says.
As a person of color, her presence in clinical studies is rare. The National Institutes of Health has found that minorities make up less than 10 percent of trial participants.
Without trial participation that is reflective of the general population, pharmaceutical companies and medical professionals are left guessing how various drugs work across racial lines. For example, albuterol, a widely used asthma treatment, was found to have decreased effectiveness for black American and Puerto Rican children. Many high mortality conditions, like cancer, also show different outcomes based on race.
Over the last decade, the pervasive lack of representation has left communities of color demanding higher levels of involvement in the research process. However, no consensus yet exists on how best to achieve this.
But experts suggest that before we can improve black participation in medical studies, key misconceptions must be addressed, such as the false assumption that such patients are unwilling to participate because they distrust scientists.
Jill A. Fisher, a professor in the Center for Bioethics at the University of North Carolina at Chapel Hill, learned in one study that mistrust wasn't the main barrier for black Americans. "There is a lot of evidence that researchers' recruitment of black Americans is generally poorly done, with many black patients simply not asked," Fisher says. "Moreover, the underrepresentation of black Americans is primarily true for efficacy trials, those testing whether an investigational drug might therapeutically benefit patients with specific illnesses."
Dr. Joyce Balls-Berry, a psychiatric epidemiologist and health educator, agrees that black Americans are often overlooked in the process. One study she conducted found that successful "enrollment of minorities in clinical trials meant using a variety of culturally appropriate strategies to engage participants," she explained.
To overcome this hurdle, The National Black Church Initiative (NBCI), a faith-based organization made up of 34,000 churches and over 15.7 million African Americans, last year urged the Food and Drug Administration to mandate diversity in all clinical trials before approving a drug or device. However, the FDA declined to implement the mandate, declaring that it lacks the authority to regulate diversity in clinical trials.
"African Americans have not been successfully incorporated into the advancement of medicine and research technologies as legitimate and natural born citizens of this country," admonishes NBCI's president Rev. Anthony Evans.
His words conjure a reminder of the medical system's insidious history for people of color. The most infamous incident is the Tuskegee syphilis scandal, in which white government doctors perpetrated harmful experiments on hundreds of unsuspecting black men for forty years, until the research was shut down in the early 1970s.
Today, in the second decade of the twenty-first century, the pernicious narrative that blacks are outsiders in science and medicine must be challenged, says Dr. Danielle N. Lee, assistant professor of biological sciences at Southern Illinois University. And having majority-white participants in clinical trials only furthers the notion that "whiteness" is the default.
According to Lee, black individuals often see themselves disconnected from scientific and medical processes. "One of the critiques with science and medical research is that communities of color, and black communities in particular, regard ourselves as outsiders of science," Lee says. "We are othered."
Without increased minority participation, research will not accurately reflect the diversity of the general population.
"We are all human, but we are different, and yes, even different populations of people require modified medical responses," she points out.
Another obstacle is that many trials have health requirements that exclude black Americans, such as barring individuals who have high blood pressure or a history of stroke. Considering that this group faces such health disparities at a higher rate than whites, these criteria eliminate millions of potential participants.
One way to increase the diversity in sample participation without an FDA mandate is to include more black Americans in both volunteer and clinical roles during the research process to increase accountability in treatment, education, and advocacy.
"When more of us participate in clinical trials, we help build out the basic data sets that account for health disparities from the start, not after the fact," Lee says. She also suggests that researchers involve black patient representatives throughout the clinical trial process, from the study design to the interpretation of results.
"This allows for the black community to give insight on how to increase trial enrollment and help reduce stigma," she explains.
Thankfully, partnerships are popping up, like the one between Howard University's Cancer Center and Driver, a platform that connects cancer patients to treatment and trials. These sorts of targeted and culturally tailored efforts allow black patients to receive assistance from well-respected organizations.
However, some experts say that the black community should not be held solely responsible for solving a problem it did not cause. Instead, some observers suggest that the federal government and pharmaceutical industries must step up to address the gap.
According to Balls-Berry, socioeconomic barriers like transportation, time off work, and childcare related to trial participation must be removed. "These are real-world issues and yet many times researchers have not included these things in their budgets."
When asked to comment, a spokesperson for BIO, the world's largest biotech trade association, emailed the following statement:
"BIO believes that our members' products and services should address the needs of a diverse population, and enhancing participation in clinical trials by a diverse patient population is a priority for BIO and our member companies. By investing in patient education to improve awareness of clinical trial opportunities, we can reduce disparities in clinical research to better reflect the country's changing demographics."
For my mother, the patient suffering from autoimmune disease, the need for broad participation in medical research is clear. "Without clinical trials, we would have fewer diagnoses and solutions to diseases," she says. "I think it's an underutilized resource."