Abortions Before Fetal Viability Are Legal: Might Science and the Change on the Supreme Court Undermine That?
This article is part of the magazine, "The Future of Science In America: The Election Issue," co-published by LeapsMag, the Aspen Institute Science & Society Program, and GOOD.
Viability—the potential for a fetus to survive outside the womb—is a core dividing line in American law. For almost 50 years, the Supreme Court of the United States has struck down laws that ban all or most abortions, ruling that women's constitutional rights include choosing to end pregnancies before the point of viability. Once viability is reached, however, states have a "compelling interest" in protecting fetal life. At that point, a state can choose to ban or significantly restrict later-term abortions, provided it allows an exception to preserve the life or health of the mother.
This distinction between a fetus that could survive outside its mother's body, albeit with significant medical intervention, and one that could not, is at the heart of the court's landmark 1973 decision in Roe v. Wade. The framework of viability remains central to the country's abortion law today, even as some states have passed laws in the name of protecting women's health that significantly undermine Roe. Over the last 30 years, the Supreme Court has upheld these laws, which have the effect of restricting pre-viability abortion access, imposing mandatory waiting periods, requiring parental consent for minors, and placing restrictions on abortion providers.
Today, the Guttmacher Institute reports that more than half of American women live in states whose laws are considered hostile to abortion, largely as a result of these intrusions on pre-viability abortion access. Nevertheless, the viability framework stands: while states can pass pre-viability abortion restrictions that (ostensibly) protect the health of the woman or that strike some kind of balance between women's rights and fetal life, it is only after viability that they can completely favor fetal life over the rights of the woman (with limited exceptions when the woman's life is threatened). As a result, judges have struck down certain states' so-called heartbeat laws, which tried to prohibit abortions after detection of a fetal heartbeat (as early as six weeks of pregnancy). Bans on abortion after 12 or 15 weeks' gestation have likewise been struck down.
Now, with a new Supreme Court Justice expected to be hostile to abortion rights, advances in the care of preterm babies and ongoing research on artificial wombs suggest that the point of viability is already earlier than many assume and could soon be pushed radically earlier in gestation, potentially providing a legal basis for ever-earlier abortion bans.
Viability has always been a slippery notion on which to pin legal rights. It represents an inherently variable and medically shifting moment in the pregnancy timeline that the Roe majority opinion declined to firmly define, noting instead that "[v]iability is usually placed at about seven months (28 weeks) but may occur earlier, even at 24 weeks." Even in 1973, this definition was an optimistic generalization. Every baby is different, and while some 28-week infants born the year Roe was decided did indeed live into adulthood, most died at or shortly after birth. The prognosis for infants born at 24 weeks was much worse.
Today, a baby born at 28 weeks' gestation can be expected to do much better, largely due to the development of surfactant treatment in the early 1990s, which helps premature babies' underdeveloped lungs inflate and take in air. Now, the majority of babies born at 24 weeks' gestation can survive, and several very premature babies, born just shy of 22 weeks' gestation, have lived into childhood. All this variability raises the question: Should the law take a very optimistic, if largely unrealistic, approach to defining viability and place it at 22 weeks, even though the overall survival rate for those preemies remains less than 10% today? Or should the law recognize that keeping a premature infant alive requires specialist care, meaning that actual viability differs not just pregnancy-to-pregnancy but also by healthcare facility and from country to country? An infant born at 24 weeks in a rural area or in a developing nation may not be viable as a practical matter, while one born in a major U.S. city with access to state-of-the-art care has a greater than 70% chance of survival. Just as some extremely premature newborns survive, some full-term babies die before, during, or soon after birth, regardless of whether they have access to advanced medical care.
To be accurate, viability should be understood as pregnancy-specific and should take into account the healthcare resources available to that woman. But the gestational limits written into state abortion laws can't capture this degree of variability. Instead, many states draw a somewhat arbitrary line at 22, 24, or 28 weeks' gestation, regardless of the particulars of the pregnancy or the medical resources available in that state.
As variable and resource-dependent as viability is today, science may soon move that point even earlier. Ectogenesis is a term coined in 1923 for the growth of an organism outside the body. Long considered science fiction, this technology has made several key advances in the past few years, with scientists announcing in 2017 that they had successfully gestated premature lamb fetuses in an artificial womb for four weeks. The technology is now being developed for human fetuses between 22 and 23 weeks' gestation, and its developers will almost certainly seek to push viability earlier in pregnancy.
Ectogenesis and other improvements in managing preterm birth deserve to be celebrated, offering new hope to the parents of very premature infants. But in the U.S., and in other nations whose abortion laws are fixed to viability, these same advances also pose a threat to abortion access. Abortion opponents have long sought to move the cutoff for legal abortion earlier, and it is not hard to imagine a state prohibiting all abortions after 18 or 20 weeks by arguing that medical advances render this stage "the new viability," regardless of whether that level of advanced care is available to women in that state. If ectogenesis advances further, the limit could be moved again to keep pace.
The Centers for Disease Control and Prevention reports that over 90% of abortions in America are performed at or before 13 weeks, meaning that in the short term, only a small number of women would be affected by shifting viability standards. Yet these women are in difficult situations and deserve care and consideration. Research has shown that women seeking later terminations often did not recognize that they were pregnant or had their dates quite wrong, while others report that they had trouble accessing a termination earlier in pregnancy, were afraid to tell their partner or parents, or only recently received a diagnosis of health problems with the fetus.
Shifts in viability over the past few decades have already affected these women, many of whom report struggling to find a provider willing to perform a termination at 18 or 20 weeks out of concern that the woman may have her dates wrong. Ever-earlier gestational limits would continue this chilling effect, making doctors leery of terminating a pregnancy that might be within 2–4 weeks of each new ban. Some states' existing gestational limits on abortion are also inconsistent with prenatal care, which includes genetic testing between 12 and 20 weeks' gestation, as well as an anatomy scan, performed at approximately 20 weeks, to check the fetus's organ development. If viability moves earlier, prenatal care will be further undermined.
Perhaps most importantly, earlier and earlier abortion bans are inconsistent with the rights and freedoms on which abortion access is based, including recognition of each woman's individual right to bodily integrity and decision-making authority over her own medical care. Those rights and freedoms become meaningless if abortion bans encroach into the weeks that women need to recognize they are pregnant, assess their options, seek medical advice, and access appropriate care. Fetal viability, with its shifting goalposts, isn't the best framework for abortion protection in light of advancing medical science.
Ideally, whether to have an abortion would be a decision that women make in consultation with their doctors, free of state interference. The vast majority of women already make this decision early in pregnancy; the few who come to the decision later do so because something has gone seriously wrong in their lives or with their pregnancies. If states insist on drawing lines based on historical measures of viability, at 24 or 26 or 28 weeks, they should stick with those gestational limits and admit that they no longer represent actual viability but correspond instead to some form of common morality about when the fetus has a protected, if not absolute, right to life. Women need a reasonable amount of time to make careful and informed decisions about whether to continue their pregnancies precisely because these decisions have a lasting impact on their bodies and their lives. To preserve that time, legislators and the courts should decouple abortion rights from ectogenesis and other advances in the care of extremely premature infants that move the point of viability ever earlier.
[Editor's Note: This article was updated after publication to reflect Amy Coney Barrett's confirmation. To read other articles in this special magazine issue, visit the e-reader version.]
Thousands of Vaccine Volunteers Got a Dummy Shot. Should They Get the Real Deal Now?
The highly anticipated rollout of a COVID-19 vaccine poses ethical considerations: When will trial volunteers who got a placebo be vaccinated? And how will this affect the data in those trials?
It's an issue that vaccine manufacturers and study investigators are wrestling with as the Food and Drug Administration is expected to grant emergency use authorization this weekend to a vaccine developed by Pfizer and the German company BioNTech. Another vaccine, produced by Moderna, is nearing authorization in the United States.
The most vulnerable—health care workers and nursing home residents—are deemed eligible to receive the initial limited supply in accordance with priority recommendations from the Centers for Disease Control and Prevention (CDC).
With health care workers constituting an estimated 20 percent of trial participants, this question also comes to the fore: "Is it now ethically imperative that we offer them the vaccine, those who have had placebo?" says William Schaffner, an infectious diseases physician at Vanderbilt University and an adviser to the CDC's immunization practices committee.
When a "gold-standard" measure becomes available, participants in the placebo group "would ordinarily be notified" of the strong public health recommendation to opt for immunization, says Johan Bester, interim assistant dean for biomedical science education and director of bioethics at the University of Nevada, Las Vegas School of Medicine.
"If a treatment or prevention exists that we know works, it is unethical to withhold it from people who would benefit from it just to answer a research question." This moral principle poses a quandary for ethicists and physicians alike, as they ponder possible paths to proceed with vaccination amid ongoing trials. Rigorous trials are double-blinded—neither the participants nor the investigators know who received the actual vaccine and who got a dummy injection.
"The intent of these trials is to follow these folks for up to two years," says Marci Drees, infection prevention officer and hospital epidemiologist for ChristianaCare in Wilmington, Delaware. At a minimum, she adds, researchers would prefer to monitor participants for six months.
"You can still follow safety over a long-term period of time without actually continuing to have a placebo group for comparison."
But in the midst of a pandemic, that may not be feasible. Prolonged exposure to the highly contagious and lethal virus could have dire consequences.
To avoid compromising the integrity of the blinded data, "there are some potentially creative solutions," Drees says. For instance, trial participants could receive the opposite of what they initially got, whether it was the vaccine or the placebo.
One factor in this decision-making process depends on when a particular trial is slated to conclude. If that time is approaching, the risk of waiting would be lower than if the trial is only halfway in progress, says Eric Lofgren, an epidemiologist at Washington State University who has studied the impact of COVID-19 in jails and at in-person sporting events.
Sometimes a study concludes earlier than the projected completion date. "All clinical trials have a data and safety monitoring board that reviews the interim results," Lofgren says. The board may halt a trial after finding evidence of harm, or when a treatment or vaccine has proven to be "sufficiently good," rendering it unethical to deprive the placebo group of its benefits.
The initial months of a trial are most crucial for assessing a vaccine's safety. Differences between the trial groups are most illuminating during that window: the vaccine proves itself if fewer individuals who received it contract the virus and develop symptoms than placebo recipients. After that point, among participants who received the vaccine, "you can still follow safety over a long-term period of time without actually continuing to have a placebo group for comparison," says Dial Hewlett Jr., medical director for disease control at the Westchester County Department of Health in New York.
Even outside of a trial, safety is paramount, and any severe side effects that occur will be closely monitored and investigated through national reporting networks. For example, regulators in the U.K. are investigating several rare but serious allergic reactions to the Pfizer vaccine, whose rollout there began on Tuesday. The FDA has asked Pfizer to track allergic reactions in its safety monitoring plan, and some experts are proposing that Pfizer conduct a separate study of the vaccine in people with a history of severe allergies.
As the FDA grants authorization to more vaccines, more participants are likely to leave trials and opt to be vaccinated. It is important that enough participants choose to stay in ongoing trials, says Nicole Hassoun, professor of philosophy at the State University of New York at Binghamton, where she directs the Global Health Impact program to extend medical access to the poor.
She's hopeful that younger participants and individuals without underlying medical conditions will make that determination. But the departure of too many participants at high risk for the virus would make it more difficult to evaluate the vaccine's safety and efficacy in those populations, Hassoun says, while acknowledging, "We can't have the best of both worlds."
One solution would entail allowing health care workers to exit a trial after a vaccine is approved, even though this would result in "a conundrum when the next group of people are brought forward to get the vaccine—whether they're people age 65 and older or they're essential workers, or whoever they are," says Vanderbilt physician Schaffner, who is a former board member of the Infectious Diseases Society of America. "All of a sudden, you'll have an erosion of the volunteers who are in the trial."
For now, one way or another, experts agree that current and subsequent trials should proceed. There are compelling reasons to identify additional vaccines: some may prove more effective, cause fewer side effects, or offer simpler delivery methods that don't require storage at extremely low temperatures.
"Continuing with existing vaccine trials and starting others remains important," says Nir Eyal, professor and director of Rutgers University's Center for Population-Level Bioethics in New Brunswick, New Jersey. "We still need to tell how much proven vaccines block infections and how long their duration lasts. And populations around the world need vaccines that are easier to store and deliver, or simply cheaper."
But once a safe and effective vaccine is approved in the United States, "it would not be ethically appropriate to do placebo trials to test new vaccines," says bioethicist Bester at the University of Nevada, Las Vegas School of Medicine. "One possibility if a new vaccine emerges, is to test it against existing vaccines."
In a letter sent to trial volunteers in November, Pfizer and BioNTech committed to establishing "a process that would allow interested participants in the placebo group who meet the eligibility criteria for early access in their country to 'cross-over' to the vaccine group." The trial plans to continue monitoring all subjects regardless of whether people in the placebo group cross over, Pfizer said in a presentation to the FDA today. After Pfizer has collected six months of safety data, in April 2021, it plans to ask the FDA for full approval of the vaccine.
In the meantime, the company pledged to update volunteers as they obtain more input from regulatory authorities. "Thank you again for making a difference by being a part of this study," they wrote. "It is only through the efforts of volunteers like you that reaching this important milestone and developing a potential vaccine against COVID-19 is possible."
CORRECTION: An earlier version of this article mistakenly stated that the FDA would be granting emergency "approval" to the Pfizer/BioNTech vaccine, rather than "emergency use authorization." We regret the error.
Since March, 35 patients in the care of Dr. Gregory Jicha, a neurologist at the University of Kentucky, have died of Alzheimer's disease or related dementia.
Meanwhile, with 233 active clinical trials underway to find treatments, Jicha wonders why mainstream media outlets don't do more to highlight potential solutions to the physical, emotional and economic devastation of these diseases. "Unfortunately, it's not until we're right at the cusp of a major discovery that anybody pays attention to these very promising agents," he says.
Heightened awareness would bring more resources for faster progress, according to Jicha. Otherwise, he's concerned that current research pipelines will take more than a decade to yield treatments.
In recent years, newspapers with national readerships have devoted more technology reporting to key developments in social media, artificial intelligence, wired gadgets and telecom. Less prominent has been news about biotech—innovations based on biology research—and new medicines emerging from this technology. That's the impression of Jicha as well as Craig Lipset, former head of clinical innovation at Pfizer. "Scientists and clinicians are entirely invested [in biotech], yet no one talks about their discoveries," he says.
With the popular press rightly focusing on progress with a vaccine for COVID-19 this year, notable developments in biomarkers, Alzheimer's and cancer research, gene therapies for cystic fibrosis, and therapeutics related to biological aging may be going unreported. Jennifer Goldsack, Executive Director of the nonprofit Digital Medicine Society, is puzzled by the media's light touch with biotech. "I'm genuinely interested in understanding what editors of technology sections think the public wants to be reading."
The Numbers on Media Coverage
A newspaper's health section is a sensible fit for biotech reporting. In 2020, these departments have concentrated largely on COVID-19—as they should—while sections on technology and science don't necessarily pick up on other biotech news. Emily Mullin, staff writer for the tech magazine OneZero, has observed a gap in newspaper coverage. "You have a lot of [niche outlets] reporting biotech on the business side for industry experts, and you have a lot of reporting heavily from the science side focused on [readers who are] scientists. But there aren't a lot of outlets doing more humanizing coverage of biotech."
Indeed, the volume of coverage by top-tier media outlets in the U.S. for non-COVID biotech has dropped 32 percent since the pandemic spiked in March, according to an analysis run for this article by Commetric, a company that looks at media reputation for clients in many sectors including biotech and artificial intelligence. Meanwhile, the volume of coverage for AI has held steady, up one percent.
Commetric's CEO, Magnus Hakansson, thinks important biotech stories were omitted from mainstream coverage even before the world fell into the grips of the virus. "Apart from COVID, it's been extremely difficult for biotech companies to push out their discoveries," he says. "People in biotech have to be quite creative when they want to communicate [progress in] different therapeutic areas, and that is a problem."
In mid-February, just before the pandemic dominated the news cycle, researchers used machine learning to find a powerful new antibiotic capable of killing strains of disease-causing bacteria that had previously resisted all known antibiotics. Science-focused outlets hailed the work as a breakthrough, but some nationally read newspapers didn't mention it. "There is this very silent crisis around antibiotic resistance that no one is aware of," says Goldsack. "We could be 50 years away from not being able to give elective surgeries because we are at such a high risk of being unable to control infection."
What's to Gain from More Mainstream Biotech
A brighter public spotlight on biotech could result in greater support and faster progress with research, says Lipset. "One of the biggest delays in drug development is patient recruitment. Patients don't know about the opportunities," he said, because "clinical research pipelines aren't talked about in the mainstream news." Only about eight percent of oncology patients participate in clinical trials.
The current focus on COVID-19, while warranted, could also be excluding lines of research that seem separate from the virus, but are actually relevant. In September, Nir Barzilai, director of the Institute of Aging Research at Albert Einstein College of Medicine, told me about eight different observational studies finding decreased COVID-19 severity among people taking a drug called metformin, which is believed to slow down the major hallmarks of biological aging, such as inflammation. Once a vaccine is approved and distributed, biologically older people could supplement it with metformin.
"Shining the spotlight on this research now could really be critical because COVID has shown what happens in older adults and how they're more at risk," says Jenna Bartley, a researcher of aging and immunology at the University of Connecticut, but she believes mainstream media sometimes miss stories on anti-aging therapies or portray them inaccurately.
The question remains why.
The Theranos Effect and Other Image Problems
Before the pandemic, Mullin, the biotech writer at OneZero, looked into a story for her editor about a company with a new test for infectious diseases. The company said its test, based on technology for editing genes, was fast, easy to use, and could be tailored to any pathogen. Mullin told her editor the evidence for the test's validity was impressive.
He wondered if readers would agree. "This is starting to sound like Theranos," he said.
The brainchild of entrepreneur Elizabeth Holmes, Theranos was valued at $9 billion in 2014. Time Magazine named Holmes one of its most influential people, and the blood-testing company was heavily covered by the media as a game changer for health outcomes—until Holmes was exposed by the Wall Street Journal as a fraud and criminally charged.
In the OneZero article, Mullin and her editor were careful to explain that the gene-editing tech was legit, explicitly distinguishing it from Theranos. "I was like, yes—but this actually works! And they can show it works."
While the Holmes scandal explains some of the mistrust, it's part of a bigger pattern. The public's hopes for biotech have been frustrated repeatedly in recent decades, fostering a media mantra of fool me twice, shame on me. A recent report by Commetric noted that after the bursting of the biotech market bubble in the early 2000s, commentators grew deeply skeptical of the field. An additional source of caution may be the number of researchers in biotech with conflicts of interest such as patents or their own startups. "It's a landmine," Mullin said. "We're conditioned to think that scientists are out for the common good, but they have their own biases."
Yet another source of uncertainty: the long regulatory road and cost for new therapies to be approved by the FDA. The process can take 15 years and over a billion dollars; the percentage of drugs actually crossing the final strand of red tape is notoriously low.
"The only time stories have reached the news is when there's a sensational headline about the cure for cancer," said Lipset, "when, in fact it's about mice, and then things drop off." Meanwhile, consumer protection hurdles for some technologies, such as computer chips, are less onerous than the FDA gauntlet for new medicines. The media may view research breakthroughs in digital tech as more impactful because they're likelier to find their way into commercially available products.
And whereas a handful of digital innovations have been democratized for widespread consumption—96 percent of Americans now own a cell phone, and 72 percent use social media—journalists at nationally read newspapers may see biotech as less attainable for the average reader. Sure, we're all aging, but will the healthcare system grant everyone fair access to treatments for slowing the aging process? Current disparities in healthcare give reason for doubt.
And yet. Recall Lipset's point that more press coverage would drive greater participation in clinical trials, which could accelerate them and diversify participants. Could mainstream media strike a better balance between cynicism toward biotech and hyping animal studies that probably won't ever benefit the humans reading about them?
Biotech in a Post-COVID World
Imagine it's early 2022. Hopefully, much of the population is protected from the virus through some combination of vaccines, therapeutics, and herd immunity. We're starting to bounce back from the social and economic shocks of 2020. COVID-19 headlines recede from the front pages, then disappear altogether. Gradually, certain aspects of life pick up where they left off in 2019, while a few changes forced by the pandemic prove to be more lasting, some for the better.
Among its possible legacies, the virus could usher in a new era of biotech development and press coverage, with these two trends reinforcing each other. While the government has mismanaged its response to the virus, the level of innovation, collaboration and investment in pandemic-related biotech has been compared to the Manhattan Project. "There's no question that vaccine acceleration is a success story," said Kevin Schulman, a professor of medicine and economics at Stanford. "We could use this experience to build new economic models to correct market failures. It could carry over to oncology or Alzheimer's."
As Winston Churchill reportedly said, never let a good crisis go to waste.
Lipset thinks the virus has primed us to pay attention, bringing biotech into the public's consciousness like never before. He's amazed at how many neighbors and old friends from high school are coming out of the woodwork to ask him how clinical trials work. "What happens next is interesting. Does this open a window of opportunity to get more content out? People's appetites have been whetted."
High-profile wins could help to sustain interest, such as the deployment of rapid at-home tests for COVID-19, a version of which the FDA authorized on November 18th. The idea bears some resemblance to the Theranos concept, which was also designed around portable analysis, except this test met the FDA's requirements and has a legitimate chance of changing people's lives. Meanwhile, at least two vaccines are on track to gain government authorization in record time. The unprecedented speed could be a catalyst for streamlining inefficiencies in the FDA's approval process in non-emergency situations.
Tests for COVID-19 represent what some view as the future of managing diseases: early detection. This paradigm may be more feasible—and deserving of journalistic ink—than research on diseases in advanced stages, says Azra Raza, professor of medicine at Columbia University. "Journalists have to challenge this conceit of thinking we can cure end-stage cancer," says Raza, author of The First Cell. Beyond animal studies and "exercise helps" articles, she thinks writers should focus on biotech for catching the earliest footprints of cancer when it's more treatable. "Not enough people appreciate the extent of this tragedy, but journalists can help us do it. COVID-19 is a great moment of truth telling."
Another pressing truth is the need for vaccination, as half of Americans have said they'll skip the shots due to concerns about safety and effectiveness. It's not the kind of stumbling block faced by iPhones or social media algorithms. AI stirs plenty of its own controversy, but the public's interest in learning about AI and engaging with it seems to grow regardless. "Who are the publicists doing such a good job for AI that biotechnology is lacking?" Lipset wonders.
The job description of those publicists, whoever they are, could be expanding. Scientists are increasingly using AI to measure the effects of new medicines that target diseases—including COVID-19—and the pathways of aging. Mullin noted the challenge of reporting breakthroughs in the life sciences in ways the public understands. With many newsrooms tightening budgets, fewer writers have science backgrounds, and "biotech is daunting for journalists," she says. "It's daunting for me and I work in this area." Now factor in the additional expertise required to understand biotech and AI. "I learned the ropes for how to read a biotech paper, but I have no idea how to read an AI paper."
Nevertheless, Mullin believes reporters have a duty to scrutinize whether this convergence of AI and biotech will foster better outcomes. "Is it just the shiny new tool we're employing because we can? Will algorithms help eliminate health disparities or contribute to them even more? We need to pay attention."