Genetically Sequencing Healthy Babies Yielded Surprising Results
Today in Melrose, Massachusetts, Cora Stetson is the picture of good health, a bubbly, precocious 2-year-old. But Cora has two separate mutations in the gene that produces a critical enzyme called biotinidase, and her body makes only 40 percent of normal levels of that enzyme.
That's enough to pass conventional newborn (heelstick) screening, but may not be enough for normal brain development, putting baby Cora at risk for seizures and cognitive impairment. But thanks to an experimental study in which Cora's DNA was sequenced after birth, this condition was discovered and she is being treated with a safe and inexpensive vitamin supplement.
Stories like these are beginning to emerge from the BabySeq Project, the first clinical trial in the world to systematically sequence healthy newborn infants. This trial was led by my research group with funding from the National Institutes of Health. While still controversial, it is pointing the way to a future in which adults, or even newborns, can receive comprehensive genetic analysis in order to determine their risk of future disease and enable opportunities to prevent them.
Some believe that medicine is still not ready for genomic population screening, but others feel it is long overdue. After all, the sequencing of the Human Genome Project was completed in 2003, and with this milestone, it became feasible to sequence and interpret the genome of any human being. The costs have come down dramatically since then; an entire human genome can now be sequenced for about $800, although bioinformatic and medical interpretation can add another $200 to $2,000, depending upon the number of genes interrogated and the sophistication of the interpretive effort.
Two-year-old Cora Stetson, whose DNA sequencing after birth identified a potentially dangerous genetic mutation in time for her to receive preventive treatment.
(Photo courtesy of Robert Green)
The ability to sequence the human genome yielded extraordinary benefits in scientific discovery, disease diagnosis, and targeted cancer treatment. But the ability of genomes to detect health risks in advance, to actually predict the medical future of an individual, has been mired in controversy and slow to manifest. In particular, the oft-cited vision that healthy infants could be genetically tested at birth in order to predict and prevent the diseases they would encounter has proven far tougher to implement than anyone anticipated.
But in the last few years, the dream of predicting and preventing diseases through genomics, starting in childhood, is finally within reach. Why did it take so long? And what remains to be done?
Great Expectations
Part of the problem was the unrealistic expectations that had been building for years in advance of the genomic science itself. For example, the 1997 film Gattaca portrayed a near future in which the lifetime risk of disease could be readily predicted the moment an infant was born. In the fanfare that accompanied the completion of the Human Genome Project, the notion of predicting and preventing future disease in an individual became a powerful meme that was used to inspire investment and public support for genomic research long before the tools were in place to make it happen.
Another part of the problem was the success of state-mandated newborn screening programs that began in the 1960s with biochemical "heel-stick" tests for babies with metabolic disorders. These programs have worked beautifully, costing only a few dollars per baby and saving thousands of infants from death and severe cognitive impairment. It seemed only logical that a new technology like genome sequencing would add power and promise to such programs. But instead of embracing the notion of newborn sequencing, newborn screening laboratories have thus far rejected the entire idea as too expensive, too ambiguous, and too threatening to the comfortable constituency that they had built within the public health framework.
Creating the Evidence Base for Preventive Genomics
Despite a number of obstacles, there are researchers who are exploring how to achieve the original vision of genomic testing as a tool for disease prediction and prevention. For example, in our NIH-funded MedSeq Project, we were the first to ask the question: "What can you find when you look as deeply as possible into the medical genomes of healthy individuals?"
Most people do not understand that genetic information comes in four separate categories: (1) dominant mutations putting the individual at risk for rare conditions like familial forms of heart disease or cancer, (2) recessive mutations putting the individual's children at risk for rare conditions like cystic fibrosis or PKU, (3) variants across the genome that can be tallied to construct polygenic risk scores for common conditions like heart disease or type 2 diabetes, and (4) variants that can influence drug metabolism or predict drug side effects such as the muscle pain that occasionally occurs with statin use.
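The third category is the most computational of the four: a polygenic risk score is, in essence, a weighted tally across many common variants. A minimal sketch of that arithmetic, using invented variant IDs, effect weights, and allele counts purely for illustration (real scores sum over thousands to millions of variants):

```python
# Hypothetical illustration of how a polygenic risk score is tallied.
# Variant IDs, weights, and dosages are invented for demonstration.

# For each variant: an effect weight (estimated in association studies)
# and the number of risk alleles this person carries (0, 1, or 2).
variants = {
    "rs0000001": {"weight": 0.12, "dosage": 2},
    "rs0000002": {"weight": -0.05, "dosage": 1},
    "rs0000003": {"weight": 0.30, "dosage": 0},
}

def polygenic_risk_score(variants: dict) -> float:
    """Sum each variant's effect weight times its risk-allele count."""
    return sum(v["weight"] * v["dosage"] for v in variants.values())

print(round(polygenic_risk_score(variants), 2))  # 0.24 - 0.05 + 0.0 = 0.19
```

The resulting number is meaningful only relative to a population distribution: a person whose tally lands in the top few percent may carry several-fold higher risk for the condition in question.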
The technological and analytical challenges of our study were formidable, because we decided to systematically interrogate over 5000 disease-associated genes and report results in all four categories of genetic information directly to the primary care physicians for each of our volunteers. We enrolled 200 adults and found that everyone who was sequenced had medically relevant polygenic and pharmacogenomic results, over 90 percent carried recessive mutations that could have been important to reproduction, and an extraordinary 14.5 percent carried dominant mutations for rare genetic conditions.
A few years later we launched the BabySeq Project. In this study, we restricted the number of genes to include only those with child/adolescent onset that could benefit medically from early warning, and even so, we found 9.4 percent carried dominant mutations for rare conditions.
At first, our interpretation of the high proportion of apparently healthy individuals with dominant mutations for rare genetic conditions was simple: these conditions had lower "penetrance" than anticipated; in other words, only a small proportion of those who carried a dominant mutation would ever get the disease. If this interpretation were to hold, then genetic risk information might be far less useful than we had hoped.
But then we circled back to each adult or infant to examine and test them for any possible features of the rare disease in question. To our surprise, more than a quarter of those carrying such mutations already showed subtle, previously unsuspected signs of the disease. That changed our interpretation: we now believe that genetic risk may be responsible for subclinical disease in a far higher proportion of people than anyone had suspected.
Meanwhile, colleagues of ours have been demonstrating that detailed analysis of polygenic risk scores can identify individuals at high risk for common conditions like heart disease. So adding up the medically relevant results in any given genome, we start to see that you can learn your risks for a rare monogenic condition, a common polygenic condition, a bad effect from a drug you might take in the future, or for having a child with a devastating recessive condition. Suddenly the information available in the genome of even an apparently healthy individual is looking more robust, and the prospect of preventive genomics is looking feasible.
Preventive Genomics Arrives in Clinical Medicine
There is still considerable evidence to gather before we can recommend genomic screening for the entire population. For example, it is important to make sure that families who learn about such risks do not suffer harms or waste resources from excessive medical attention. And many doctors don't yet have guidance on how to use such information with their patients. But our research is convincing many people that preventive genomics is coming and that it will save lives.
In fact, we recently launched a Preventive Genomics Clinic at Brigham and Women's Hospital where information-seeking adults can obtain predictive genomic testing with the highest quality interpretation and medical context, and receive ongoing coaching toward healthier outcomes in light of their disease risks. Insurance doesn't yet cover such testing, so patients must pay out of pocket for now, but they can choose from a menu of genetic screening tests, all of which are more comprehensive than consumer-facing products. Genetic counseling is available but optional. So far, this service is for adults only, but sequencing for children will surely follow soon.
As the costs of sequencing and other omics technologies continue to decline, we will see both responsible and irresponsible marketing of genetic testing, and we will need to guard against unscientific claims. But at the same time, mainstream medicine must be far more imaginative and move faster than it has to date in order to claim the emerging benefits of preventive genomics, where it is now clear that suffering can be averted and lives can be saved. The future has arrived if we are bold enough to grasp it.
Funding and Disclosures:
Dr. Green's research is supported by the National Institutes of Health, the Department of Defense and through donations to The Franca Sozzani Fund for Preventive Genomics. Dr. Green receives compensation for advising the following companies: AIA, Applied Therapeutics, Helix, Ohana, OptraHealth, Prudential, Verily and Veritas; and is co-founder and advisor to Genome Medical, Inc, a technology and services company providing genetics expertise to patients, providers, employers and care systems.
By now you have probably heard something about CRISPR, the simple and relatively inexpensive method of precisely editing the genomes of plants, animals, and humans.
Through CRISPR and other methods of gene editing, scientists have produced crops to be more nutritious, better able to resist pests, and tolerate droughts; engineered animals ranging from fruit flies to monkeys to make them better suited for scientific study; and experimentally treated the HIV virus, Hepatitis B, and leukemia in human patients.
There are also currently FDA-approved trials to treat blindness, cancer, and sickle cell disease in humans using gene editing, and there is consensus that CRISPR's therapeutic applications will grow significantly in the coming years.
While the treatment of human disease through use of gene editing is not without its medical and ethical concerns, the avoidance of disease in embryos is far more fraught. Nonetheless, Nature reported in November that He Jiankui, a scientist in China, had edited twin embryos to disable a gene called CCR5 in hopes of avoiding transmission of HIV from their HIV-positive father.
Though there are questions about the effectiveness and necessity of this therapy, He reported that sequencing showed his embryonic gene edits were successful and that the twins were "born normal and healthy," although his claims have not been independently verified.
More recently, Denis Rebrikov, a Russian scientist, announced his plans to disable the same gene in embryos to be implanted in HIV-positive women later this year. Futuristic as it may seem, prenatal gene editing is already here.
The treatment of disease in fetuses, the liminal category of life between embryos and humans, poses the next frontier. Numerous conditions—some minor, some resulting in a lifetime of medical treatment, some incompatible with life outside of the womb—can be diagnosed through use of prenatal diagnostic testing. There is promising research suggesting doctors will soon be able to treat or mitigate at least some of them through use of fetal gene editing.
This research could soon present women carrying genetically anomalous fetuses with a third option beyond termination or birthing a child who will likely face a challenging and uncertain medical future: a fetal genetic intervention.
However, genetic intervention will open the door to a host of ethical considerations, particularly in the relationship between pregnant women and prenatal genetic counselors. Today's counselors are, in theory, non-directive: they provide objective information and answer questions rather than advise their pregnant clients whether to continue a pregnancy despite the risks or to have an abortion.
In practice, though, prenatal genetic counseling is most often directive, and the nature of the counseling pregnant women receive can depend on numerous factors, including their religious and cultural beliefs, their perceived ability to handle a complicated pregnancy and subsequent birth, and their financial status. Introducing the possibility of a fetal genetic intervention will exacerbate counselor reliance upon these considerations and in some cases lead to counseling that is even more directive.
Future counselors will have to figure out under what circumstances it is even appropriate to broach the subject. Should they only discuss therapies that are FDA-approved, or should they mention experimental treatments? What about interventions that are available in Europe or Asia, but banned in the United States? Or even in the best-case scenario of an FDA-approved treatment, should a counselor make reference to it if she knows for a fact that her client cannot possibly afford it?
Beyond the basic question of what information to share, counselors will have to confront the fact that the very notion of fixing or "editing" offspring will be repugnant to many women, and inherent in the suggestion is the stigmatization of individuals with disabilities. Prenatal genetic counselors will be on the forefront of debates surrounding which fetuses should remain as they are and which ones should be altered.
Despite these concerns, some women in the near future will face the choice of whether to abort, keep, or treat a genetically anomalous fetus in utero. Take, for example, a woman who learns during prenatal testing that her fetus has Angelman syndrome, a genetic disorder characterized by intellectual disability, speech impairment, loss of muscle control, epilepsy, and a small head. There is currently no human treatment for Angelman syndrome, which is caused by a loss of function in a single gene, UBE3A.
But scientists at the University of North Carolina have been able to treat Angelman syndrome in fetal mice by reactivating UBE3A through use of a single injection. The therapy has also proven effective in cultured human brain cells. This suggests that a woman might soon have to consider injecting her fetus's brain with a CRISPR concoction custom-designed to target UBE3A, rather than terminate her pregnancy or bring her fetus to term unaltered.
Assuming she receives adequate information to make an informed choice, she too will face an ethical conundrum. There will be the inherent risks of injecting anything into a developing fetus's brain, including the possibility of infection, brain damage, and miscarriage. But there are also risks specific to gene editing, such as so-called off-target effects, the possibility of impacting genes other than the intended one. Such effects are highly unpredictable and can be difficult to detect. Nor is it possible to predict how altering UBE3A might lead to other genetic and epigenetic changes once the baby is born.
A woman deciding how to act in this scenario must balance these risks against the potential benefits of the therapy, layered on top of her belief system, resources, and personal ethics. The calculus will be different for every woman, and even the same woman might change her mind from one pregnancy to the next based on the severity of the condition diagnosed and other available medical options.
Her genetic counselor, meanwhile, must be sensitive to all of these concerns in helping her make her decision, keeping up to date on possible new treatments, and carefully choosing which information to disclose while striving to be neutral. There are no easy answers to the many questions that will arise in this space, but better to start thinking about them now, before it is too late.
Agriculture in the 21st century is not as simple as it once was. With a population seven billion strong, a climate in crisis, and sustainability in farming practices on everyone's radar, figuring out how to feed the masses without destroying the Earth is a pressing concern.
In addition to low-emission cows and drone pollinators, there's a promising new solution on the table. How does "lab-grown insect meat" grab you?
Writing in Frontiers in Sustainable Food Systems, researchers at Tufts University say insects that are fed plants and genetically modified for maximum growth, nutrition, and flavor could be the best, greenest alternative to our current livestock farming practices. This lab-grown protein source could produce high volume, nutritious food without the massive resources required for traditional animal agriculture.
"Due to the environmental, public health, and animal welfare concerns associated with our current livestock system, it is vital to develop more sustainable food production methods," says lead author Natalie Rubio. Could insect meat be the key?
Next Up
One new approach to sustainable food production is "cellular agriculture," an emerging industry and field of study in which meat and dairy are produced from cells in a lab instead of whole animals. So far, scientists have primarily focused on bovine, porcine, and avian cells to create this "cultured meat."
But the Tufts scientists argue that insect cells may be better suited to lab-created meat protein than traditional farm animal cells.
"Compared to cultured mammalian, avian, and other vertebrate cells, insect cell cultures require fewer resources and less energy-intensive environmental control, as they have lower glucose requirements and can thrive in a wider range of temperature, pH, oxygen, and osmolarity conditions," reports Rubio.
"Alterations necessary for large-scale production are also simpler to achieve with insect cells, which are currently used for biomanufacturing of insecticides, drugs, and vaccines," she adds.
They still have some details to hash out, however, including how to make cultured insect meat more like the steak and chicken we're all familiar with.
"Despite this immense potential, cultured insect meat isn't ready for consumption," says Rubio. "Research is ongoing to master two key processes: controlling development of insect cells into muscle and fat, and combining these in 3D cultures with a meat-like texture." They are currently experimenting with mushroom-derived fiber to tackle the latter.
Open Questions
As the report points out, one thing that makes cellular agriculture an attractive alternative to high-density animal farming is that it doesn't require consumers to change their behaviors. People would still be able to eat meat—it would just come from a different source.
But the big question remains: How will lab-grown insect meat taste? Will the buggers really taste as good as burgers?
And, of course, there's the "ew" factor. Meat alternatives have proven to work for some people—Tofurky is still in business, after all—but it may be a hard sell to get the masses to jump on board with eating bugs. Consuming creepy crawlies sounds simply unpalatable to many, and the term "lab-grown, cellular insect meat" doesn't help much. Perhaps an entirely new nomenclature is in order.
Another question is whether people will trust such scientifically created food. Some already use the term "frankenfood" to refer to genetic modification, even though the vast majority of the corn and soybeans planted in the U.S. today are genetically engineered, and other major crops with GM varieties include potatoes, apples, squash, and papayas. Still, combining GM technology with eating insects may be a hard sell.
However, we're all going to have to get used to trying new things if we want to leave a habitable home for our children. If a lab-grown bug burger can save the planet, maybe it's worth a shot.