Abortions Before Fetal Viability Are Legal: Might Science and the Change on the Supreme Court Undermine That?
This article is part of the magazine, "The Future of Science In America: The Election Issue," co-published by LeapsMag, the Aspen Institute Science & Society Program, and GOOD.
Viability—the potential for a fetus to survive outside the womb—is a core dividing line in American law. For almost 50 years, the Supreme Court of the United States has struck down laws that ban all or most abortions, ruling that women's constitutional rights include choosing to end pregnancies before the point of viability. Once viability is reached, however, states have a "compelling interest" in protecting fetal life. At that point, states can choose to ban or significantly restrict later-term abortions provided states allow an exception to preserve the life or health of the mother.
This distinction between a fetus that could survive outside its mother's body, albeit with significant medical intervention, and one that could not, is at the heart of the court's landmark 1973 decision in Roe v. Wade. The framework of viability remains central to the country's abortion law today, even as some states have passed laws in the name of protecting women's health that significantly undermine Roe. Over the last 30 years, the Supreme Court has upheld these laws, which have the effect of restricting pre-viability abortion access, imposing mandatory waiting periods, requiring parental consent for minors, and placing restrictions on abortion providers.
Today, the Guttmacher Institute reports that more than half of American women live in states whose laws are considered hostile to abortion, largely as a result of these intrusions on pre-viability abortion access. Nevertheless, the viability framework stands: while states can pass pre-viability abortion restrictions that (ostensibly) protect the health of the woman or that strike some kind of balance between women's rights and fetal life, it is only after viability that they can completely favor fetal life over the rights of the woman (with limited exceptions when the woman's life is threatened). As a result, judges have struck down certain states' so-called heartbeat laws, which tried to prohibit abortions after detection of a fetal heartbeat (as early as six weeks of pregnancy). Bans on abortion after 12 or 15 weeks' gestation have also been struck down.
Now, with a new Supreme Court Justice expected to be hostile to abortion rights, advances in the care of preterm babies and ongoing research on artificial wombs suggest that the point of viability is already sooner than many assume and could soon be moved radically earlier in gestation, potentially providing a legal basis for earlier and earlier abortion bans.
Viability has always been a slippery notion on which to pin legal rights. It represents an inherently variable and medically shifting moment in the pregnancy timeline that the Roe majority opinion declined to firmly define, noting instead that "[v]iability is usually placed at about seven months (28 weeks) but may occur earlier, even at 24 weeks." Even in 1973, this definition was an optimistic generalization. Every baby is different, and while some 28-week infants born the year Roe was decided did indeed live into adulthood, most died at or shortly after birth. The prognosis for infants born at 24 weeks was much worse.
Today, a baby born at 28 weeks' gestation can be expected to do much better, largely due to the development of surfactant therapy in the early 1990s, which helps premature lungs expand and take in air. Now, the majority of babies born at 24 weeks survive, and several extremely premature babies, born just shy of 22 weeks' gestation, have lived into childhood. All this variability raises the question: Should the law take a very optimistic, if largely unrealistic, approach to defining viability and place it at 22 weeks, even though the overall survival rate for those preemies remains less than 10% today? Or should the law recognize that keeping a premature infant alive requires specialist care, meaning that actual viability differs not just pregnancy-to-pregnancy but also by healthcare facility and from country to country? A premature infant born at 24 weeks in a rural area or in a developing nation may not be viable as a practical matter, while one born in a major U.S. city with access to state-of-the-art care has a greater than 70% chance of survival. Just as some extremely premature newborns survive, some full-term babies die before, during, or soon after birth, regardless of whether they have access to advanced medical care.
To be accurate, viability should be understood as pregnancy-specific and should take into account the healthcare resources available to the woman. But state laws that set gestational limits on abortion can't capture this degree of variability. Instead, many draw a somewhat arbitrary line at 22, 24, or 28 weeks' gestation, regardless of the particulars of the pregnancy or the medical resources available in that state.
As variable and resource-dependent as viability is today, science may soon move that point even earlier. Ectogenesis, a term coined in 1923 for the growth of an organism outside the body, was long considered science fiction. But the technology has made several key advances in the past few years, with scientists announcing in 2017 that they had successfully gestated premature lamb fetuses in an artificial womb for four weeks. Now in development for use in human fetuses between 22 and 23 weeks' gestation, the technology will almost certainly push viability earlier in pregnancy.
Ectogenesis and other improvements in managing preterm birth deserve to be celebrated, offering new hope to the parents of very premature infants. But in the U.S., and in other nations whose abortion laws are fixed to viability, these same advances also pose a threat to abortion access. Abortion opponents have long sought to move the cutoff for legal abortions earlier, and it is not hard to imagine a state prohibiting all abortions after 18 or 20 weeks by arguing that medical advances render this stage "the new viability," regardless of whether that level of advanced care is actually available to women in that state. If ectogenesis advances further, the limit could be moved earlier still to keep pace.
The Centers for Disease Control and Prevention reports that over 90% of abortions in America are performed at or before 13 weeks, meaning that in the short term, only a small number of women would be affected by shifting viability standards. Yet these women are in difficult situations and deserve care and consideration. Research has shown that women seeking later terminations often did not recognize that they were pregnant or had their dates quite wrong, while others report that they had trouble accessing a termination earlier in pregnancy, were afraid to tell their partner or parents, or only recently received a diagnosis of health problems with the fetus.
Shifts in viability over the past few decades have already affected these women, many of whom report struggling to find a provider willing to perform a termination at 18 or 20 weeks out of concern that the woman may have her dates wrong. Ever-earlier gestational limits would continue this chilling effect, making doctors leery of terminating a pregnancy that might be within 2–4 weeks of each new ban. Some states' existing gestational limits on abortion are also inconsistent with prenatal care, which includes genetic testing between 12 and 20 weeks' gestation, as well as an anatomy scan at approximately 20 weeks to check the fetus's organ development. If viability moves earlier, prenatal care will be further undermined.
Perhaps most importantly, earlier and earlier abortion bans are inconsistent with the rights and freedoms on which abortion access is based, including recognition of each woman's individual right to bodily integrity and decision-making authority over her own medical care. Those rights and freedoms become meaningless if abortion bans encroach into the weeks that women need to recognize they are pregnant, assess their options, seek medical advice, and access appropriate care. Fetal viability, with its shifting goalposts, isn't the best framework for abortion protection in light of advancing medical science.
Ideally, whether to have an abortion would be a decision that women make in consultation with their doctors, free of state interference. The vast majority of women already make this decision early in pregnancy; the few who come to the decision later do so because something has gone seriously wrong in their lives or with their pregnancies. If states insist on drawing lines based on historical measures of viability, at 24 or 26 or 28 weeks, they should stick with those gestational limits and admit that they no longer represent actual viability but correspond instead to some form of common morality about when the fetus has a protected, if not absolute, right to life. Women need a reasonable amount of time to make careful and informed decisions about whether to continue their pregnancies precisely because these decisions have a lasting impact on their bodies and their lives. To preserve that time, legislators and the courts should decouple abortion rights from ectogenesis and other advances in the care of extremely premature infants that move the point of viability ever earlier.
[Editor's Note: This article was updated after publication to reflect Amy Coney Barrett's confirmation. To read other articles in this special magazine issue, visit the e-reader version.]
The coronavirus pandemic exposed significant weaknesses in the country's food supply chain. Grocery store meat counters were bare. Transportation interruptions disrupted supply. Finding beef, poultry, and pork at the store has been, in some places, as challenging as finding toilet paper.
It wasn't a lack of supply -- millions of animals were in the pipeline.
"There's certainly enough food out there, but it can't get anywhere because of the way our system is set up," said Amy Rowat, an associate professor of integrative biology and physiology at UCLA. "Having a more self-contained, self-sufficient way to produce meat could make the supply chain more robust."
Cultured meat could be one way of making the meat supply chain more resilient despite disruptions due to pandemics such as COVID-19. But is the country ready to embrace lab-grown food?
According to a Good Food Institute study, Gen Z is almost twice as likely to embrace meat alternatives for reasons related to social and environmental awareness, even prior to the pandemic. That's because this group wants food choices that reflect their values around food justice, equity, and animal welfare.
So far, the interest in protein alternatives has largely centered on plant-based foods. However, factors directly related to COVID-19 may accelerate consumer interest in the scaling up of cell-grown products, according to Liz Specht, the associate director of science and technology at The Good Food Institute, a nonprofit organization that supports scientists, investors, and entrepreneurs working to develop food alternatives to conventional animal products.
While lab-grown food isn't ready yet to definitively crisis-proof the food supply chain, experts say it offers promise.
Matching Supply and Demand
Companies developing cell-grown meat claim it can take as few as two months to develop a cell into an edible product, according to Anthony Chow, CFA at Agronomics Limited, an investment company focused on meat alternatives. Tissue is taken from an animal and placed in a culture medium that contains the nutrients and proteins the cells need to grow and multiply. He cites a Good Food Institute report claiming that a 2.5-millimeter sample can yield three and a half tons of meat in 40 days, allowing for exponential growth when needed.
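Those headline figures can be sanity-checked with simple arithmetic. The short Python sketch below assumes the 2.5-millimeter sample is a cube of tissue at roughly 1 g/cm³ density and that the cells double at a constant rate; neither assumption comes from the report, so treat the output as illustrative only.

```python
import math

# Back-of-the-envelope check of the figure cited above: a 2.5 mm tissue
# sample growing into 3.5 metric tons of meat in 40 days.
# Assumptions (ours, not the report's): the sample is a 2.5 mm cube of
# tissue at ~1 g/cm^3, and growth is pure exponential doubling.

sample_mass_g = 0.25 ** 3 * 1.0  # (0.25 cm)^3 at ~1 g/cm^3, about 0.016 g
final_mass_g = 3.5e6             # 3.5 metric tons, in grams
days = 40

doublings = math.log2(final_mass_g / sample_mass_g)
print(f"required doublings:    {doublings:.1f}")              # about 27.7
print(f"implied doubling time: {days / doublings:.2f} days")  # about 1.44
```

A doubling time of roughly a day and a half is broadly consistent with how fast animal cells can divide in culture, which is what makes the exponential-scaling claim at least plausible on paper.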
In traditional agriculture models, it takes at least three months to raise chicken, six to nine months for pigs, and 18 months for cattle. To keep enough maturing animals in the pipeline, farms must plan the number of animals to raise months -- even years -- in advance. Lab-grown meat advocates say that because cultured meat production is flexible, it could theoretically scale up or down in significantly less time.
"Supply and demand has drastically changed in some way around the world and cultivated meat processing would be able to adapt much quicker than conventional farming," Chow said.
Scaling Up
Lab-grown meat may provide an eventual solution, but not in the immediate future, said Paul Mozdziak, a professor of physiology at North Carolina State University who researches animal cell culture techniques, transgenic animal production, and muscle biology.
"The challenge is in culture media," he said. "It's going to take some innovation to get the cells to grow at quantities that are going to be similar to what you can get from an animal. These are questions that everybody in the space is working on."
Chow says some of the most advanced cultured meat companies, such as BlueNalu, anticipate introducing products to the market midway through next year, though he thinks COVID-19 has slowed the process. Once introduced, the products will sell at a premium price, most likely in restaurants before they hit grocery store shelves.
"I think in five years' time it will be in a different place," he said. "I don't think that this will have relevance for this pandemic, but certainly beyond that."
"Plant-based meats may be perceived as 'alternatives' to meat, whereas lab-grown meat is producing the same meat, just in a much more efficient manner, without the environmental implications."
Of course, all the technological solutions in the world won't solve the problem unless people are open-minded about embracing them. At least for now, a lab-grown burger or bluefin tuna might still be too strange for many people, especially in the U.S.
For instance, a 2019 study published in Frontiers in Sustainable Food Systems surveyed 3,030 consumers and found that 29 percent of U.S. consumers, 59 percent of Chinese consumers, and 56 percent of Indian consumers were 'very likely' or 'extremely likely' to try cultivated meat.
"Lab-grown meat is genuine meat, at the cellular level, and therefore will match conventional meat with regard to its nutritional content and overall sensory experience. It could be argued that plant-based meat will never be able to achieve this," says Laura Turner, who works with Chow at Agronomics Limited. "Plant-based meats may be perceived as 'alternatives' to meat, whereas lab-grown meat is producing the same meat, just in a much more efficient manner, without the environmental implications."
A Solution Beyond This Pandemic
The coronavirus has done more than raise awareness of the fragility of food supply chains. It has also been a wakeup call for consumers and policymakers that it is time to radically rethink meat production, Specht says. Those factors have elevated the profile of lab-grown meat.
"I think the economy is getting a little bit more steam and if I was an investor, I would be getting excited about it," adds Mozdziak.
Beyond crises, Mozdziak explains that as affluence continues to increase globally, meat consumption increases exponentially. Yet farm animals can only grow so quickly and traditional farming won't be able to keep up.
"Even Tyson is saying that by 2050, there's not going to be enough capacity in the animal meat space to meet demand," he notes. "If we don't look at some innovative technologies, how are we going to overcome that?"
By mid-March, Alpha Lee was growing restless. A pioneer of AI-driven drug discovery, Lee leads a team of researchers at the University of Cambridge, but his lab had been closed amidst the government-initiated lockdowns spreading inexorably across Europe.
Having spoken to his collaborators across the globe – many of whom were seeing their own experiments and research projects postponed indefinitely due to the pandemic – he noticed a similar sense of frustration and helplessness in the face of COVID-19.
While there was talk of finding a novel treatment for the virus, Lee was well aware the process was likely to be long and laborious. Traditional methods of drug discovery risked suffering the same fate as the efforts to find a cure for SARS in the early 2000s, which took years and were ultimately abandoned long before a drug ever reached the market.
To avoid such an outcome, Lee was convinced that global collaboration was required. Together with a collection of scientists in the UK, US and Israel, he launched the 'COVID Moonshot' – a project which encouraged chemists worldwide to share their ideas for potential drug designs. If the Moonshot proves successful, they hope it could serve as a future benchmark for finding new medicines for chronic diseases.
Solving a Complex Jigsaw
In February, researchers at ShanghaiTech University published the first detailed snapshots of the SARS-CoV-2 coronavirus's proteins, using a technique called X-ray crystallography. In particular, they revealed a high-resolution profile of the virus's main protease – the part of its structure that enables it to replicate inside a host, and the main drug target. The images were tantalizing.
"We could see all the tiny pieces sitting in the structure like pieces of a jigsaw," said Lee. "All we needed was for someone to come up with the best idea of joining these pieces together with a drug. Then you'd be left with a strong molecule which sits in the protease, and stops it from working, killing the virus in the process."
Normally, ideas for how best to design such a drug would be kept as carefully guarded secrets within individual labs and companies due to their potential value. As a result of that secrecy, the steady process of trial and error needed to reach an optimum design can take years to come to fruition.
However, given the scale of the global emergency, Lee felt that the scientific community would be open to collective brainstorming on a mass scale. "Big Pharma usually wouldn't necessarily do this, but time is of the essence here," he said. "It was a case of, 'Let's just rethink every drug discovery stage to see -- ok, how can we go as fast as we can?'"
On March 13, he launched the COVID Moonshot, calling for chemists around the globe to come up with the most creative ideas they could think of, on their laptops at home. No design was too weird or wacky to be considered, and crucially nothing would be patented. The entire project would be done on a not-for-profit basis, meaning that any drug that makes it to market will have been created simply for the good of humanity.
It caught fire: Within just two weeks, more than 2,300 potential drug designs had been submitted. By the middle of July, over 10,000 had been received from scientists around the globe.
The Road Toward Clinical Trials
With so many designs to choose from, the team has been attempting to whittle them down to a shortlist of the most promising. Computational drug discovery experts at Diamond Light Source and the Weizmann Institute of Science in Rehovot, Israel, have enabled the Moonshot team to develop algorithms that predict how quickly and easily each design could be synthesized, and how well each proposed drug might bind to the virus in real life.
The latter is an approach known as computational covalent docking and has previously been used in cancer research. "This was becoming more popular even before COVID-19, with several covalent drugs approved by the FDA in recent years," said Nir London, professor of organic chemistry at the Weizmann Institute, and one of the Moonshot team members. "However, all of these were for oncology. A covalent drug against SARS-CoV-2 will certainly highlight covalent drug-discovery as a viable option."
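For readers curious what automated triage of thousands of crowdsourced designs can look like, here is a minimal sketch in Python using the open-source RDKit toolkit. It ranks a few hypothetical submissions by QED, a generic drug-likeness score; this is an illustration under our own assumptions, not the Moonshot's actual pipeline, whose synthetic-route prediction and covalent docking are far more sophisticated.

```python
from rdkit import Chem      # open-source cheminformatics toolkit
from rdkit.Chem import QED  # quantitative estimate of drug-likeness

# Hypothetical stand-ins for crowdsourced submissions, written as SMILES
# strings (a text encoding of molecular structure). Real Moonshot designs
# targeted the SARS-CoV-2 main protease; these are just familiar molecules
# chosen to keep the example self-contained.
submissions = {
    "acetaminophen": "CC(=O)Nc1ccc(O)cc1",
    "ibuprofen": "CC(C)Cc1ccc(cc1)C(C)C(=O)O",
    "aspirin": "CC(=O)Oc1ccccc1C(=O)O",
}

scored = []
for name, smiles in submissions.items():
    mol = Chem.MolFromSmiles(smiles)
    if mol is None:
        continue  # skip submissions that fail to parse
    scored.append((QED.qed(mol), name))  # QED runs from 0 (poor) to 1 (drug-like)

# Highest-scoring candidates first, mimicking a crude triage step.
for score, name in sorted(scored, reverse=True):
    print(f"{score:.2f}  {name}")
```

A real pipeline would swap the QED score for models of synthetic accessibility and predicted binding to the viral protease, but the shape of the computation (parse, score, rank) is the same.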
Through this approach, the team has selected 850 compounds to date, which have already been synthesized and tested in preclinical experiments. Fifty of these compounds – which appear to be especially promising when it comes to killing the virus in a test tube – are now being optimized further.
Lee is hoping that at least one of these potential drugs will be shown to be effective in curing animals of COVID-19 within the next six months, a step that would allow the Moonshot team to reach out to potential pharmaceutical partners to test their compounds in humans.
Future Implications
If the project does succeed, some believe it could open the door to scientific crowdsourcing as a future means of generating novel medicine ideas for other diseases. Frank von Delft, professor of protein science and structural biology at the University of Oxford's Nuffield Department of Medicine, described it as a new form of 'citizen science.'
"There's a vast resource of expertise and imagination that is simply dying to be tapped into," he said.
Others are slightly more skeptical, pointing out that the uniqueness of the current crisis has meant that many scientists were willing to contribute ideas without expecting any future compensation in return. This meant that it was easy to circumvent the traditional hurdles that prevent large-scale global collaborations from happening – namely how to decide who will profit from the final product and who will hold the intellectual property (IP) rights.
"I think it is too early to judge if this is a viable model for future drug discovery," says London. "I am not sure that without the existential threat we would have seen so many contributions, and so many people and institutions willing to waive compensation and future royalties. Many scientists found themselves at home, frustrated that they don't have a way to contribute to the fight against COVID-19, and this project gave them an opportunity. Plus many can get behind the fact that this project has no associated IP and no one will get rich off of this effort. This breaks down a lot of the typical barriers and red-tape for wider collaboration."
"If a drug would sprout from one of these crowdsourced ideas, it would serve as a very powerful argument to consider this mode of drug discovery further in the future."
However, the Moonshot team believes that if they can succeed, it will at the very least send a strong statement to policymakers and the scientific community that greater efforts should be made to make such large-scale collaborations more feasible.
"All across the scientific world, we've seen unprecedented adoption of open-science, collaboration and collegiality during this crisis, perhaps recognizing that only a coordinated global effort could address this global challenge," says London. "If a drug would sprout from one of these crowdsourced ideas, it would serve as a very powerful argument to consider this mode of drug discovery further in the future."
[An earlier version of this article was published on June 8th, 2020 as part of a standalone magazine called GOOD10: The Pandemic Issue. Produced as a partnership among LeapsMag, The Aspen Institute, and GOOD, the magazine is available for free online.]