Abortions Before Fetal Viability Are Legal: Might Science and the Change on the Supreme Court Undermine That?
This article is part of the magazine, "The Future of Science In America: The Election Issue," co-published by LeapsMag, the Aspen Institute Science & Society Program, and GOOD.
Viability—the potential for a fetus to survive outside the womb—is a core dividing line in American law. For almost 50 years, the Supreme Court of the United States has struck down laws that ban all or most abortions, ruling that women's constitutional rights include choosing to end pregnancies before the point of viability. Once viability is reached, however, states have a "compelling interest" in protecting fetal life. At that point, states can choose to ban or significantly restrict later-term abortions provided states allow an exception to preserve the life or health of the mother.
This distinction between a fetus that could survive outside its mother's body, albeit with significant medical intervention, and one that could not, is at the heart of the court's landmark 1973 decision in Roe v. Wade. The framework of viability remains central to the country's abortion law today, even as some states have passed laws in the name of protecting women's health that significantly undermine Roe. Over the last 30 years, the Supreme Court has upheld these laws, which have the effect of restricting pre-viability abortion access, imposing mandatory waiting periods, requiring parental consent for minors, and placing restrictions on abortion providers.
Today, the Guttmacher Institute reports that more than half of American women live in states whose laws are considered hostile to abortion, largely as a result of these intrusions on pre-viability abortion access. Nevertheless, the viability framework stands: while states can pass pre-viability abortion restrictions that (ostensibly) protect the health of the woman or that strike some kind of balance between women's rights and fetal life, it is only after viability that they can completely favor fetal life over the rights of the woman (with limited exceptions when the woman's life is threatened). As a result, judges have struck down certain states' so-called heartbeat laws, which tried to prohibit abortions after detection of a fetal heartbeat (as early as six weeks of pregnancy). Bans on abortion after 12 or 15 weeks' gestation have also been reversed.
Now, with a new Supreme Court Justice expected to be hostile to abortion rights, advances in the care of preterm babies and ongoing research on artificial wombs suggest that the point of viability is already sooner than many assume and could soon be moved radically earlier in gestation, potentially providing a legal basis for earlier and earlier abortion bans.
Viability has always been a slippery notion on which to pin legal rights. It represents an inherently variable and medically shifting moment in the pregnancy timeline that the Roe majority opinion declined to firmly define, noting instead that "[v]iability is usually placed at about seven months (28 weeks) but may occur earlier, even at 24 weeks." Even in 1973, this definition was an optimistic generalization. Every baby is different, and while some 28-week infants born the year Roe was decided did indeed live into adulthood, most died at or shortly after birth. The prognosis for infants born at 24 weeks was much worse.
Today, a baby born at 28 weeks' gestation can be expected to do much better, largely due to the development of surfactant treatment in the early 1990s, which helps premature babies' lungs inflate and take in air. Now, the majority of babies born at 24 weeks' gestation can survive, and several very premature babies, born just shy of 22 weeks' gestation, have lived into childhood. All this variability raises the question: Should the law take a very optimistic, if largely unrealistic, approach to defining viability and place it at 22 weeks, even though the overall survival rate for those preemies remains less than 10% today? Or should the law recognize that keeping a premature infant alive requires specialist care, meaning that actual viability differs not just pregnancy-to-pregnancy but also by healthcare facility and from country to country? A 24-week premature infant born in a rural area or in a developing nation may not be viable as a practical matter, while one born in a major U.S. city with access to state-of-the-art care has a greater than 70% chance of survival. Just as some extremely premature newborns survive, some full-term babies die before, during, or soon after birth, regardless of whether they have access to advanced medical care.
To be accurate, viability should be understood as pregnancy-specific and should take into account the healthcare resources available to that woman. But state laws that set gestational limits can't capture this degree of variability. Instead, many draw a somewhat arbitrary line at 22, 24, or 28 weeks' gestation, regardless of the particulars of the pregnancy or the medical resources available in that state.
As variable and resource-dependent as viability is today, science may soon move that point even earlier. Ectogenesis is a term coined in 1923 for the growth of an organism outside the body. Long considered science fiction, this technology has made several key advances in the past few years, with scientists announcing in 2017 that they had successfully gestated premature lamb fetuses in an artificial womb for four weeks. Currently in development for use in human fetuses between 22 and 23 weeks' gestation, this technology will almost certainly seek to push viability earlier in pregnancy.
Ectogenesis and other improvements in managing preterm birth deserve to be celebrated, offering new hope to the parents of very premature infants. But in the U.S., and in other nations whose abortion laws are fixed to viability, these same advances also pose a threat to abortion access. Abortion opponents have long sought to move the cutoff for legal abortions, and it is not hard to imagine a state prohibiting all abortions after 18 or 20 weeks by arguing that medical advances render this stage "the new viability," regardless of whether that level of advanced care is available to women in that state. If ectogenesis advances further, the limit could be moved to keep pace.
The Centers for Disease Control and Prevention reports that over 90% of abortions in America are performed at or before 13 weeks, meaning that in the short term, only a small number of women would be affected by shifting viability standards. Yet these women are in difficult situations and deserve care and consideration. Research has shown that women seeking later terminations often did not recognize that they were pregnant or had their dates quite wrong, while others report that they had trouble accessing a termination earlier in pregnancy, were afraid to tell their partner or parents, or only recently received a diagnosis of health problems with the fetus.
Shifts in viability over the past few decades have already affected these women, many of whom report struggling to find a provider willing to perform a termination at 18 or 20 weeks out of concern that the woman may have her dates wrong. Ever-earlier gestational limits would continue this chilling effect, making doctors leery of terminating a pregnancy that might be within 2–4 weeks of each new ban. Some states' existing gestational limits on abortion are also inconsistent with prenatal care, which includes genetic testing between 12 and 20 weeks' gestation, as well as an anatomy scan to check the fetus's organ development performed at approximately 20 weeks. If viability moves earlier, prenatal care will be further undermined.
Perhaps most importantly, earlier and earlier abortion bans are inconsistent with the rights and freedoms on which abortion access is based, including recognition of each woman's individual right to bodily integrity and decision-making authority over her own medical care. Those rights and freedoms become meaningless if abortion bans encroach into the weeks that women need to recognize they are pregnant, assess their options, seek medical advice, and access appropriate care. Fetal viability, with its shifting goalposts, isn't the best framework for abortion protection in light of advancing medical science.
Ideally, whether to have an abortion would be a decision that women make in consultation with their doctors, free of state interference. The vast majority of women already make this decision early in pregnancy; the few who come to the decision later do so because something has gone seriously wrong in their lives or with their pregnancies. If states insist on drawing lines based on historical measures of viability, at 24 or 26 or 28 weeks, they should stick with those gestational limits and admit that they no longer represent actual viability but correspond instead to some form of common morality about when the fetus has a protected, if not absolute, right to life. Women need a reasonable amount of time to make careful and informed decisions about whether to continue their pregnancies precisely because these decisions have a lasting impact on their bodies and their lives. To preserve that time, legislators and the courts should decouple abortion rights from ectogenesis and other advances in the care of extremely premature infants that move the point of viability ever earlier.
[Editor's Note: This article was updated after publication to reflect Amy Coney Barrett's confirmation. To read other articles in this special magazine issue, visit the e-reader version.]
Agriculture in the 21st century is not as simple as it once was. With a population seven billion strong, a climate in crisis, and sustainability in farming practices on everyone's radar, figuring out how to feed the masses without destroying the Earth is a pressing concern.
In addition to low-emission cows and drone pollinators, there's a promising new solution on the table. How does "lab-grown insect meat" grab you?
Writing in Frontiers in Sustainable Food Systems, researchers at Tufts University say insects that are fed plants and genetically modified for maximum growth, nutrition, and flavor could be the best, greenest alternative to our current livestock farming practices. This lab-grown protein source could produce high volume, nutritious food without the massive resources required for traditional animal agriculture.
"Due to the environmental, public health, and animal welfare concerns associated with our current livestock system, it is vital to develop more sustainable food production methods," says lead author Natalie Rubio. Could insect meat be the key?
Next Up
New sustainable food production includes what's called "cellular agriculture," an emerging industry and field of study in which meat and dairy are produced via cells in a lab instead of whole animals. So far, scientists have primarily focused on bovine, porcine, and avian cells to create this "cultured meat."
But the Tufts scientists argue that insect cells may be better suited to lab-created meat protein than traditional farm animal cells.
"Compared to cultured mammalian, avian, and other vertebrate cells, insect cell cultures require fewer resources and less energy-intensive environmental control, as they have lower glucose requirements and can thrive in a wider range of temperature, pH, oxygen, and osmolarity conditions," reports Rubio.
"Alterations necessary for large-scale production are also simpler to achieve with insect cells, which are currently used for biomanufacturing of insecticides, drugs, and vaccines," she adds.
They still have some details to hash out, however, including how to make cultured insect meat more like the steak and chicken we're all familiar with.
"Despite this immense potential, cultured insect meat isn't ready for consumption," says Rubio. "Research is ongoing to master two key processes: controlling development of insect cells into muscle and fat, and combining these in 3D cultures with a meat-like texture." They are currently experimenting with mushroom-derived fiber to tackle the latter.
Open Questions
As the report points out, one thing that makes cellular agriculture an attractive alternative to high-density animal farming is that it doesn't require consumers to change their behaviors. People would still be able to eat meat—it would just come from a different source.
But the big question remains: How will lab-grown insect meat taste? Will the buggers really taste as good as burgers?
And, of course, there's the "ew" factor. Meat alternatives have proven to work for some people—Tofurky is still in business, after all—but it may be a hard sell to get the masses to jump on board with eating bugs. Consuming creepy crawlies sounds simply unpalatable to many, and the term "lab-grown, cellular insect meat" doesn't help much. Perhaps an entirely new nomenclature is in order.
Another question is whether folks will trust such scientifically created food. People already use the term "frankenfood" to refer to genetic modification, even though the vast majority of the corn and soybeans planted in the U.S. today are genetically engineered, and other major crops with GM varieties include potatoes, apples, squash, and papayas. Still, combining GM technology with eating insects may be a hard sell.
However, we're all going to have to get used to trying new things if we want to leave a habitable home for our children. If a lab-grown bug burger can save the planet, maybe it's worth a shot.
Six Reasons Why Humans Should Return to the Moon
"That's one small step for man; one giant leap for mankind."
This July 20th marks fifty years since Neil Armstrong, mission commander of NASA's Apollo 11, uttered those famous words. Much less discussed is how Project Apollo shifted lunar science into high gear, ultimately teaching scientists just how valuable the Moon could become.
During the six missions that landed humans on the lunar surface from 1969 to 1972, Apollo astronauts collected some 842 pounds of lunar rocks and dirt. Analysis of these materials has not only provided major clues about the origin of Earth's celestial companion 4.51 billion years ago, but also revealed that the Moon is a treasure trove: lunar rock contains a plethora of minerals with high industrial value. So let's take a look at some prime examples of how humanity's expected return to the lunar surface in the years to come could help life here on Earth.
24/7 solar energy for Earth
During the 1970s, scientists began examining the Apollo lunar samples to study how the lunar surface could be used as a resource. One such scientist was physicist David Criswell, who has since shown that a lunar-based solar power system would actually be cheaper than Earth-based solar power implemented on a global scale. Whoa! How is that possible, given the high cost of launching people and machines into space?
The key is that it would be enormously expensive to scale up enough Earth-based solar power to supply all of humanity's electrical needs, since solar power on such a scale would require a lot of metal, glass, and cement.
But the Moon's lack of atmosphere and weather means that photovoltaic cells built by robots from lunar materials can be paper thin, in contrast with the heavy structures needed in Earth-based solar arrays. Ringing the Moon, such a system would be in perpetual sunlight, making it cheaper to collect solar power there and beam it down to Earth in the form of microwaves.
A source of helium-3 for clean, safe nuclear fusion power and other uses
The gas helium-3 is extremely rare on Earth, but plentiful on the Moon, and could be used in advanced nuclear fusion reactors. Helium-3 also has anti-terrorism and medical uses, especially in the diagnosis of various pulmonary diseases.
A place to offload industrial pollution
Since there are minerals and oxygen in lunar rocks and dust, and frozen water in certain locations, the Moon is an ideal home for factories. Thus, billionaire Jeff Bezos has proposed relocating large segments of heavy industry there, reducing the amount of pollution that is produced on Earth.
Radio Astronomy without interference from Earth
Radio telescopes constructed on the Moon's far side (the side that always faces away from Earth) would be shielded from terrestrial radio interference, allowing them to advance human knowledge of the cosmos, and search for signals from extraterrestrial civilizations, with increased sensitivity and efficiency.
Lunar Tourism
Using the Moon as a destination for tourists may not sound helpful initially, given that only the very wealthy would be able to afford such journeys in the foreseeable future. However, the economic payoff could be substantial in terms of the jobs that lunar tourism could provide on Earth. Furthermore, short of actual tourism, companies are gearing up to provide lunar entertainment to fun-seekers here on Earth in the form of mini lunar rovers that people could control from their living rooms.
Lunar Colonies
Similar to lunar tourism, lunar colonization sounds initially like a development that would help only those people who go. But, located just three days' travel from Earth, the Moon would be an excellent place for humanity to begin becoming a multi-planet species. The Moon could be a place for colonists to get their space legs before humans put down roots in more distant locations like Mars. With hundreds or thousands of humans thriving on the Moon, Earthlings might find some peace of mind knowing that humanity is in a position to outlive a planetary catastrophe.