Is Finding Out Your Baby’s Genetics A New Responsibility of Parenting?
Hours after a baby is born, its heel is pricked with a lancet. Drops of the infant's blood are collected on a porous card, which is then mailed to a state laboratory. The dried blood spots are screened for around thirty conditions, including phenylketonuria (PKU), the metabolic disorder that kick-started this kind of newborn screening over 60 years ago. In the U.S., parents are not asked for permission to screen their child. Newborn screening programs are public health programs, and the assumption is that no good parent would refuse a screening test that could identify a serious yet treatable condition in their baby.
Today, with the introduction of genome sequencing into clinical medicine, some are asking whether newborn screening goes far enough. As the cost of sequencing falls, should parents take a more expansive look at their children's health, learning not just whether they have a rare but treatable childhood condition, but also whether they are at risk for untreatable conditions or for diseases that, if they occur at all, will strike only in adulthood? Should genome sequencing be a part of every newborn's care?
It's an idea that appeals to Anne Wojcicki, the founder and CEO of the direct-to-consumer genetic testing company 23andMe, who in a 2016 interview with The Guardian newspaper predicted that having newborns tested would soon be considered standard practice—"as critical as testing your cholesterol"—and a new responsibility of parenting. Wojcicki isn't the only one excited to see everyone's genes examined at birth. Francis Collins, director of the National Institutes of Health and perhaps the most prominent advocate of genomics in the United States, has written that he is "almost certain … that whole-genome sequencing will become part of new-born screening in the next few years." Whether that would happen through state-mandated screening programs, or as part of routine pediatric care—or perhaps as a direct-to-consumer service that parents purchase at birth or receive as a baby-shower gift—is not clear.
Learning as much as you can about your child's health might seem like a natural obligation of parenting. But it's an assumption that I think needs to be much more closely examined, both because the results that genome sequencing can return are more complex and more uncertain than one might expect, and because parents are not actually responsible for their child's lifelong health and well-being.
Existing newborn screening tests look for the presence of rare conditions that, if identified early in life, before the child shows any symptoms, can be effectively treated. Sequencing could identify many of these same kinds of conditions (and it might be a good tool if it could be targeted to those conditions alone), but it would also identify gene variants that confer an increased risk rather than a certainty of disease. Occasionally that increased risk will be significant. About 12 percent of women in the general population will develop breast cancer during their lives, while those who have a harmful BRCA1 or BRCA2 gene variant have around a 70 percent chance of developing the disease. But for many—perhaps most—conditions, the increased risk associated with a particular gene variant will be very small. Researchers have identified over 600 genes that appear to be associated with schizophrenia, for example, but any one of those confers only a tiny increase in risk for the disorder. What is a parent supposed to do about such a risk except worry?
Sequencing results are uncertain in other important ways as well. While we now have the ability to map the genome—to create a read-out of the pairs of genetic letters that make up a person's DNA—we are still learning what most of it means for a person's health and well-being. Researchers even have a name for gene variants they think might be associated with a disease or disorder, but for which they don't have enough evidence to be sure. They are called "variants of unknown (or uncertain) significance" (VUS), and they pop up in most people's sequencing results. In cancer genetics, where much research has been done, about 1 in 5 gene variants are reclassified over time. Most are downgraded, which means that a good number of VUS are eventually designated benign.
Then there's the puzzle of what to do about results that show increased risk or even certainty for a condition that we have no idea how to prevent. Some genomics advocates argue that even if a result is not "medically actionable," it might have "personal utility" because it allows parents to plan for their child's future needs, to enroll them in research, or to connect with other families whose children carry the same genetic marker.
Finding a certain gene variant in one child might inform parents' decisions about whether to have another—and if they do, about whether to use reproductive technologies or prenatal testing to select against that variant in a future child. I have no doubt that for some parents these personal utility arguments are persuasive, but notice how far we've now strayed from the serious yet treatable conditions that motivated governments to set up newborn screening programs, and to mandate such testing for all.
Which brings me to the other problem with the call for sequencing newborn babies: the idea that even if it's not what the law requires, it's what good parents should do. That idea is very compelling when we're talking about sequencing results that show a serious threat to the child's health, especially when interventions are available to prevent or treat that condition. But as I have shown, many sequencing results are not of this type.
While one parent might reasonably decide to learn about their child's risk for a condition about which nothing can be done medically, a different, yet still thoroughly reasonable, parent might prefer to remain ignorant so that they can enjoy the time before their child is afflicted. This parent might decide that the worry—and the hypervigilance it could inspire in them—is not in their child's best interest, or indeed in their own. This parent might also think that it should be up to the child, when he or she is older, to decide whether to learn about his or her risk for adult-onset conditions, especially given that many adults at high familial risk for conditions like Alzheimer's or Huntington's disease choose never to be tested. This parent will value the child's future autonomy and right not to know more than they value the chance to prepare for a health risk that won't strike the child until 40 or 50 years in the future.
Contemporary understandings of parenting are famously demanding. We are asked to do everything within our power to advance our children's health and well-being—to act always in our children's best interests. Against that backdrop, the need to sequence every newborn baby's genome might seem obvious. But we should be skeptical. Many sequencing results are complex and uncertain. Parents are not obligated to learn about their children's risk for a condition that cannot be prevented, has a small risk of occurring, or that would appear only in adulthood. To suggest otherwise is to stretch parental responsibilities beyond the realm of childhood and beyond factors that parents can control.
By now you have probably heard something about CRISPR, the simple and relatively inexpensive method of precisely editing the genomes of plants, animals, and humans.
Through CRISPR and other methods of gene editing, scientists have engineered crops to be more nutritious, resist pests, and tolerate drought; engineered animals ranging from fruit flies to monkeys to make them better suited for scientific study; and experimentally treated HIV, hepatitis B, and leukemia in human patients.
There are also currently FDA-approved trials to treat blindness, cancer, and sickle cell disease in humans using gene editing, and there is consensus that CRISPR's therapeutic applications will grow significantly in the coming years.
While the treatment of human disease through gene editing is not without its medical and ethical concerns, the avoidance of disease in embryos is far more fraught. Nonetheless, Nature reported in November that He Jiankui, a scientist in China, had edited twin embryos to disable a gene called CCR5, in hopes of making the resulting children resistant to HIV, which their father carries.
Though there are questions about the effectiveness and necessity of this intervention, He reported that sequencing proved his embryonic gene edits were successful and that the twins were "born normal and healthy," although his claims have not been independently verified.
More recently, Denis Rebrikov, a Russian scientist, announced his plans to disable the same gene in embryos to be implanted in HIV-positive women later this year. Futuristic as it may seem, prenatal gene editing is already here.
The treatment of disease in fetuses, the liminal category of life between embryos and humans, poses the next frontier. Numerous conditions—some minor, some resulting in a lifetime of medical treatment, some incompatible with life outside of the womb—can be diagnosed through use of prenatal diagnostic testing. There is promising research suggesting doctors will soon be able to treat or mitigate at least some of them through use of fetal gene editing.
This research could soon present women carrying genetically anomalous fetuses a third option aside from termination or birthing a child who will likely face a challenging and uncertain medical future: Whether to undergo a fetal genetic intervention.
However, genetic intervention will open the door to a host of ethical considerations, particularly with respect to the relationship between pregnant women and prenatal genetic counselors. In theory, today's counselors provide objective information and answer questions rather than advising their pregnant clients on whether to continue a pregnancy, despite the risks, or to have an abortion.
In practice, though, prenatal genetic counseling is most often directive, and the nature of the counseling pregnant women receive can depend on numerous factors, including their religious and cultural beliefs, their perceived ability to handle a complicated pregnancy and subsequent birth, and their financial status. Introducing the possibility of a fetal genetic intervention will exacerbate counselor reliance upon these considerations and in some cases lead to counseling that is even more directive.
Future counselors will have to figure out under what circumstances it is even appropriate to broach the subject. Should they only discuss therapies that are FDA-approved, or should they mention experimental treatments? What about interventions that are available in Europe or Asia, but banned in the United States? Or even in the best-case scenario of an FDA-approved treatment, should a counselor mention it if she knows for a fact that her client cannot possibly afford it?
Beyond the basic question of what information to share, counselors will have to confront the fact that the very notion of fixing or "editing" offspring will be repugnant to many women, and inherent in the suggestion is the stigmatization of individuals with disabilities. Prenatal genetic counselors will be on the forefront of debates surrounding which fetuses should remain as they are and which ones should be altered.
Despite these concerns, some women in the near future will face the choice of whether to abort, keep, or treat a genetically anomalous fetus in utero. Take, for example, a woman who learns during prenatal testing that her fetus has Angelman syndrome, a genetic disorder characterized by intellectual disability, speech impairment, loss of muscle control, epilepsy, and a small head. There is currently no human treatment for Angelman syndrome, which is caused by a loss of function in a single gene, UBE3A.
But scientists at the University of North Carolina have been able to treat Angelman syndrome in fetal mice by reactivating UBE3A through use of a single injection. The therapy has also proven effective in cultured human brain cells. This suggests that a woman might soon have to consider injecting her fetus's brain with a CRISPR concoction custom-designed to target UBE3A, rather than terminate her pregnancy or bring her fetus to term unaltered.
Assuming she receives adequate information to make an informed choice, she too will face an ethical conundrum. There will be the inherent risks of injecting anything into a developing fetus's brain, including the possibility of infection, brain damage, and miscarriage. But there are also risks specific to gene editing, such as so-called off-target effects, the possibility of altering genes other than the intended one. Such effects are highly unpredictable and can be difficult to detect. It is likewise impossible to predict how altering UBE3A might lead to other genetic and epigenetic changes once the baby is born.
A woman deciding how to act in this scenario must balance these risks against the potential benefits of the therapy, layered on top of her belief system, resources, and personal ethics. The calculus will be different for every woman, and even the same woman might change her mind from one pregnancy to the next based on the severity of the condition diagnosed and other available medical options.
Her genetic counselor, meanwhile, must be sensitive to all of these concerns in helping her make her decision, keeping up to date on the possible new treatments, and carefully choosing which information to disclose in striving to be neutral. There are no easy answers to the many questions that will arise in this space, but better to start thinking about them now, before it is too late.
Agriculture in the 21st century is not as simple as it once was. With a population seven billion strong, a climate in crisis, and sustainability in farming practices on everyone's radar, figuring out how to feed the masses without destroying the Earth is a pressing concern.
In addition to low-emission cows and drone pollinators, there's a promising new solution on the table. How does "lab-grown insect meat" grab you?
Writing in Frontiers in Sustainable Food Systems, researchers at Tufts University say insects that are fed plants and genetically modified for maximum growth, nutrition, and flavor could be the best, greenest alternative to our current livestock farming practices. This lab-grown protein source could yield high volumes of nutritious food without the massive resources that traditional animal agriculture requires.
"Due to the environmental, public health, and animal welfare concerns associated with our current livestock system, it is vital to develop more sustainable food production methods," says lead author Natalie Rubio. Could insect meat be the key?
Next Up
New sustainable food production includes what's called "cellular agriculture," an emerging industry and field of study in which meat and dairy are produced via cells in a lab instead of whole animals. So far, scientists have primarily focused on bovine, porcine, and avian cells to create this "cultured meat."
But the Tufts scientists argue that insect cells may be better suited to lab-created meat protein than traditional farm animal cells.
"Compared to cultured mammalian, avian, and other vertebrate cells, insect cell cultures require fewer resources and less energy-intensive environmental control, as they have lower glucose requirements and can thrive in a wider range of temperature, pH, oxygen, and osmolarity conditions," reports Rubio.
"Alterations necessary for large-scale production are also simpler to achieve with insect cells, which are currently used for biomanufacturing of insecticides, drugs, and vaccines," she adds.
They still have some details to hash out, however, including how to make cultured insect meat more like the steak and chicken we're all familiar with.
"Despite this immense potential, cultured insect meat isn't ready for consumption," says Rubio. "Research is ongoing to master two key processes: controlling development of insect cells into muscle and fat, and combining these in 3D cultures with a meat-like texture." They are currently experimenting with mushroom-derived fiber to tackle the latter.
Open Questions
As the report points out, one thing that makes cellular agriculture an attractive alternative to high-density animal farming is that it doesn't require consumers to change their behaviors. People would still be able to eat meat—it would just come from a different source.
But the big question remains: How will lab-grown insect meat taste? Will the buggers really taste as good as burgers?
And, of course, there's the "ew" factor. Meat alternatives have proven to work for some people—Tofurky is still in business, after all—but it may be a hard sell to get the masses to jump on board with eating bugs. Consuming creepy crawlies sounds simply unpalatable to many, and the term "lab-grown, cellular insect meat" doesn't help much. Perhaps an entirely new nomenclature is in order.
Another question is whether folks will trust such scientifically created food. People already use the term "frankenfood" to refer to genetic modification, even though the vast majority of the corn and soybeans planted in the U.S. today are genetically engineered, and other major crops with GM varieties include potatoes, apples, squash, and papayas. Still, combining GM technology with eating insects may be a hard sell.
However, we're all going to have to get used to trying new things if we want to leave a habitable home for our children. If a lab-grown bug burger can save the planet, maybe it's worth a shot.