Abortions Before Fetal Viability Are Legal: Might Science and the Change on the Supreme Court Undermine That?
This article is part of the magazine, "The Future of Science In America: The Election Issue," co-published by LeapsMag, the Aspen Institute Science & Society Program, and GOOD.
Viability—the potential for a fetus to survive outside the womb—is a core dividing line in American law. For almost 50 years, the Supreme Court of the United States has struck down laws that ban all or most abortions, ruling that women's constitutional rights include choosing to end pregnancies before the point of viability. Once viability is reached, however, states have a "compelling interest" in protecting fetal life. At that point, states can choose to ban or significantly restrict later-term abortions, provided they allow an exception to preserve the life or health of the mother.
This distinction between a fetus that could survive outside its mother's body, albeit with significant medical intervention, and one that could not, is at the heart of the court's landmark 1973 decision in Roe v. Wade. The framework of viability remains central to the country's abortion law today, even as some states have passed laws in the name of protecting women's health that significantly undermine Roe. Over the last 30 years, the Supreme Court has upheld these laws, which have the effect of restricting pre-viability abortion access, imposing mandatory waiting periods, requiring parental consent for minors, and placing restrictions on abortion providers.
Today, the Guttmacher Institute reports that more than half of American women live in states whose laws are considered hostile to abortion, largely as a result of these intrusions on pre-viability abortion access. Nevertheless, the viability framework stands: while states can pass pre-viability abortion restrictions that (ostensibly) protect the health of the woman or that strike some kind of balance between women's rights and fetal life, it is only after viability that they can completely favor fetal life over the rights of the woman (with limited exceptions when the woman's life is threatened). As a result, judges have struck down certain states' so-called heartbeat laws, which tried to prohibit abortions after detection of a fetal heartbeat (as early as six weeks of pregnancy). Bans on abortion after 12 or 15 weeks' gestation have likewise been struck down.
Now, with a new Supreme Court Justice expected to be hostile to abortion rights, advances in the care of preterm babies and ongoing research on artificial wombs suggest that the point of viability is already earlier than many assume and could soon be moved radically earlier in gestation, potentially providing a legal basis for earlier and earlier abortion bans.
Viability has always been a slippery notion on which to pin legal rights. It represents an inherently variable and medically shifting moment in the pregnancy timeline that the Roe majority opinion declined to firmly define, noting instead that "[v]iability is usually placed at about seven months (28 weeks) but may occur earlier, even at 24 weeks." Even in 1973, this definition was an optimistic generalization. Every baby is different, and while some 28-week infants born the year Roe was decided did indeed live into adulthood, most died at or shortly after birth. The prognosis for infants born at 24 weeks was much worse.
Today, a baby born at 28 weeks' gestation can be expected to do much better, largely due to the development of surfactant treatment in the early 1990s to help ease air into babies' lungs. Now, the majority of babies born at 24 weeks survive, and several extremely premature babies, born just shy of 22 weeks' gestation, have lived into childhood. All this variability raises the question: Should the law take a very optimistic, if largely unrealistic, approach to defining viability and place it at 22 weeks, even though the overall survival rate for those preemies remains less than 10% today? Or should the law recognize that keeping a premature infant alive requires specialist care, meaning that actual viability differs not just pregnancy-to-pregnancy but also by healthcare facility and from country to country? A 24-week premature infant born in a rural area or in a developing nation may not be viable as a practical matter, while one born in a major U.S. city with access to state-of-the-art care has a greater than 70% chance of survival. Just as some extremely premature newborns survive, some full-term babies die before, during, or soon after birth, regardless of whether they have access to advanced medical care.
To be accurate, viability should be understood as pregnancy-specific and should take into account the healthcare resources available to the woman. But state abortion laws can't capture this degree of variability with fixed gestational limits. Instead, many draw a somewhat arbitrary line at 22, 24, or 28 weeks' gestation, regardless of the particulars of the pregnancy or the medical resources available in that state.
As variable and resource-dependent as viability is today, science may soon move that point even earlier. Ectogenesis is a term coined in 1923 for the growth of an organism outside the body. Long considered science fiction, the technology has made several key advances in the past few years, with scientists announcing in 2017 that they had successfully gestated premature lamb fetuses in an artificial womb for four weeks. Now in development for human fetuses of 22 to 23 weeks' gestation, the technology will almost certainly push the point of viability earlier in pregnancy.
Ectogenesis and other improvements in managing preterm birth deserve to be celebrated, offering new hope to the parents of very premature infants. But in the U.S., and in other nations whose abortion laws are fixed to viability, these same advances also pose a threat to abortion access. Abortion opponents have long sought to move the cutoff for legal abortions, and it is not hard to imagine a state prohibiting all abortions after 18 or 20 weeks by arguing that medical advances render this stage "the new viability," regardless of whether that level of advanced care is available to women in that state. If ectogenesis advances further, the limit could be moved to keep pace.
The Centers for Disease Control and Prevention reports that over 90% of abortions in America are performed at or before 13 weeks, meaning that in the short term, only a small number of women would be affected by shifting viability standards. Yet these women are in difficult situations and deserve care and consideration. Research has shown that women seeking later terminations often did not recognize that they were pregnant or had their dates quite wrong, while others report that they had trouble accessing a termination earlier in pregnancy, were afraid to tell their partner or parents, or only recently received a diagnosis of health problems with the fetus.
Shifts in viability over the past few decades have already affected these women, many of whom report struggling to find a provider willing to perform a termination at 18 or 20 weeks out of concern that the woman may have her dates wrong. Ever-earlier gestational limits would continue this chilling effect, making doctors leery of terminating a pregnancy that might be within 2–4 weeks of each new ban. Some states' existing gestational limits on abortion are also inconsistent with standard prenatal care, which includes genetic testing between 12 and 20 weeks' gestation, as well as an anatomy scan at approximately 20 weeks to check the fetus's organ development. If viability moves earlier, prenatal care will be further undermined.
Perhaps most importantly, earlier and earlier abortion bans are inconsistent with the rights and freedoms on which abortion access is based, including recognition of each woman's individual right to bodily integrity and decision-making authority over her own medical care. Those rights and freedoms become meaningless if abortion bans encroach into the weeks that women need to recognize they are pregnant, assess their options, seek medical advice, and access appropriate care. Fetal viability, with its shifting goalposts, isn't the best framework for abortion protection in light of advancing medical science.
Ideally, whether to have an abortion would be a decision that women make in consultation with their doctors, free of state interference. The vast majority of women already make this decision early in pregnancy; the few who come to the decision later do so because something has gone seriously wrong in their lives or with their pregnancies. If states insist on drawing lines based on historical measures of viability, at 24 or 26 or 28 weeks, they should stick with those gestational limits and admit that they no longer represent actual viability but correspond instead to some form of common morality about when the fetus has a protected, if not absolute, right to life. Women need a reasonable amount of time to make careful and informed decisions about whether to continue their pregnancies precisely because these decisions have a lasting impact on their bodies and their lives. To preserve that time, legislators and the courts should decouple abortion rights from ectogenesis and other advances in the care of extremely premature infants that move the point of viability ever earlier.
[Editor's Note: This article was updated after publication to reflect Amy Coney Barrett's confirmation.]
Your Future Smartphone May Detect Problems in Your Water
In 2014, the city of Flint, Michigan switched residents' water supply to the Flint River to cut costs. The water was not properly treated, however, and lead contaminated it; according to the Associated Press, many of the city's residents soon reported health issues like hair loss and rashes. In 2015, a report found that children there had high levels of lead in their blood. The Natural Resources Defense Council recently estimated there could still be as many as twelve million lead pipes carrying water to homes across the U.S.
What if Flint residents and others in afflicted areas could simply flick water onto their phone screens and an app would tell them whether they were about to drink contaminated water? That is what researchers at the University of Cambridge are working toward, both to prevent catastrophes like the one in Flint and to prepare for an uncertain future of scarcer resources.
Underneath the tough glass of a phone screen lies a transparent layer of electrodes. Because our bodies hold an electric charge, a finger touching the screen disturbs the electric field created among the electrodes; this is how the screen senses where a touch occurs. Cambridge scientists used the same idea to explore whether the screen could detect charges in water, too. Metals like arsenic and lead can appear in water as ions, which are charged particles. When an ionic solution is placed on the screen's surface, the electrodes sense that charge much as they sense a finger.
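To make that idea concrete, here is a minimal sketch in Python of how an app might turn such a screen reading into a verdict about the water: it fits a simple calibration curve relating capacitance change to lead concentration, then flags readings above the U.S. EPA's 15-parts-per-billion action level for lead. This is an illustration only, not the Cambridge group's code; the calibration points and the sample reading are hypothetical.

```python
# Illustrative sketch (hypothetical numbers throughout, not the
# Cambridge group's code): turn a touchscreen capacitance reading
# into an estimated lead concentration via a calibration curve.

EPA_LEAD_ACTION_LEVEL_PPB = 15.0  # U.S. EPA action level for lead in drinking water

# Hypothetical calibration data: (lead concentration in ppb,
# capacitance change in arbitrary units) for droplets of known strength.
calibration = [(0.0, 0.02), (10.0, 0.11), (50.0, 0.48), (100.0, 0.93)]

def fit_linear(points):
    """Least-squares slope and intercept mapping concentration -> signal."""
    n = len(points)
    sx = sum(x for x, _ in points)
    sy = sum(y for _, y in points)
    sxx = sum(x * x for x, _ in points)
    sxy = sum(x * y for x, y in points)
    slope = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    intercept = (sy - slope * sx) / n
    return slope, intercept

def estimate_ppb(signal, slope, intercept):
    """Invert the calibration curve: signal -> estimated concentration (ppb)."""
    return (signal - intercept) / slope

slope, intercept = fit_linear(calibration)
reading = 0.30  # hypothetical capacitance change from a water droplet
ppb = estimate_ppb(reading, slope, intercept)
verdict = "UNSAFE" if ppb > EPA_LEAD_ACTION_LEVEL_PPB else "within action level"
print(f"Estimated lead: {ppb:.1f} ppb ({verdict})")
```

In practice the relationship between ion concentration and signal need not be linear, and a real app would also have to distinguish which ions are present; the point here is only the calibrate-then-invert logic.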
The experiment measured charges in various electrolyte solutions on a touchscreen. The researchers found that a thin polymer layer between the electrodes and the sample solution helped pick up the charges.
"How can we get really close to the touch electrodes, and be better than a phone screen?" Horstmann, the lead scientist on the study, asked himself while designing the protective coating. "We found that when we put electrolytes directly on the electrodes, they were too close, even short-circuiting," he said. When they placed the polymer layer on top the electrodes, however, this short-circuiting did not occur. Horstmann speaks of the polymer layer as one of the key findings of the paper, as it allowed for optimum conductivity. The coating they designed was much thinner than what you'd see with a typical smartphone touchscreen, but because it's already so similar, he feels optimistic about the technology's practical applications in the real world.
While the Cambridge scientists were using touchscreens to measure water contamination, Dr. Baojun Wang, a synthetic biologist at the University of Edinburgh, along with his team, created a way to measure arsenic contamination in Bangladesh groundwater samples using what is called a cell-based biosensor. These biosensors use cornerstones of cellular activity like transcription and promoter sequences to detect the presence of metal ions in water. A promoter can be thought of as a "flag" that tells certain molecules where to begin copying genetic code. By hijacking this aspect of the cell's machinery and increasing the cell's sensing and signal processing ability, they were able to amplify the signal to detect tiny amounts of arsenic in the groundwater samples. All this was conducted in a 384-well plate, each well smaller than a pencil eraser.
They placed arsenic sensors with different sensitivities across part of the plate, so that the readout resembles a volume bar in which more wells switch on as arsenic levels rise, similar to the displays on a Fitbit or glucose monitor. The whole device is about the size of an iPhone and can be scaled down to a much smaller size.
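To illustrate the volume-bar idea, here is a minimal sketch in Python. It models each well's dose-response with a Hill function, a standard textbook model of promoter activation, and shows which wells switch on at a given arsenic concentration. The thresholds, Hill coefficient, and sample values are hypothetical, not parameters from Dr. Wang's study; for reference, the WHO guideline for arsenic in drinking water is 10 parts per billion.

```python
# Illustrative model (hypothetical parameters, not Dr. Wang's design):
# a row of cell-based sensor wells with staggered sensitivities reads
# out like a volume bar as arsenic concentration increases.

def hill_response(conc_ppb, k_half, n=2.0):
    """Fraction of maximal reporter output at a given concentration (Hill model)."""
    if conc_ppb <= 0:
        return 0.0
    return conc_ppb**n / (k_half**n + conc_ppb**n)

# Hypothetical half-activation points (ppb) for wells of increasing threshold.
WELL_THRESHOLDS = [5, 10, 25, 50, 100]

def volume_bar(conc_ppb, on_fraction=0.5):
    """Render which wells are 'on' ('#') versus 'off' ('.') at this concentration."""
    return "".join(
        "#" if hill_response(conc_ppb, k) >= on_fraction else "."
        for k in WELL_THRESHOLDS
    )

for sample in [2, 10, 40, 120]:  # hypothetical groundwater samples, ppb arsenic
    print(f"{sample:>4} ppb arsenic: [{volume_bar(sample)}]")
# The printed bars climb like a volume display:
#    2 ppb arsenic: [.....]
#   10 ppb arsenic: [##...]
#   40 ppb arsenic: [###..]
#  120 ppb arsenic: [#####]
```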
Dr. Wang says cell-based biosensors are bringing sensing technology closer to field applications, because their machinery uses inherent cellular activity. This makes them ideal for low-resource communities, and he expects his device to be affordable, portable, and easily stored for widespread use in households.
"It hasn't worked on actual phones yet, but I don't see any reason why it can't be an app," says Horstmann of their technology. Imagine a new generation of smartphones with a designated area of the screen responsible for detecting contamination—this is one of the possible futures the researchers propose. But industry collaborations will be crucial to making their advancements practical. The scientists anticipate that without collaborative efforts from the business sector, the public might have to wait ten years until this becomes something all our smartphones are capable of—but with the right partners, "it could go really quickly," says Dr. Elizabeth Hall, one of the authors on the touchscreen water contamination study.
"That's where the science ends and the business begins," Dr. Hall says. "There is a lot of interest coming through as a result of this paper. I think the people who make the investments and decisions are seeing that there might be something useful here."
As for Flint, according to The Detroit News, the city has entered the final stages of removing its lead pipe infrastructure. It is hard not to wonder how many residents might have fared better if they'd had the technology that scientists are now creating.
For all its tragedy, COVID-19 has increased demand for at-home testing methods, and that demand has carried over to devices unrelated to the virus. Various testing efforts are now in the public eye.
"I like that the public is watching these directions," says Horstmann. "I think there's a long way to go still, but it's exciting."
Fungus is the ‘New Black’ in Eco-Friendly Fashion
A natural material that looks and feels like real leather is taking the fashion world by storm. Scientists view mycelium—the vegetative part of a mushroom-producing fungus—as a planet-friendly alternative to animal hides and plastics.
Products crafted from this vegan leather are emerging, with others poised to hit the market soon. Among them are the Hermès Victoria bag, Lululemon's yoga accessories, Adidas' Stan Smith Mylo sneaker, and a Stella McCartney apparel collection.
The Adidas Stan Smith Mylo concept sneaker, made in partnership with Bolt Threads, uses an alternative leather grown from mycelium; a commercial version is expected in the near future.
Hermès has held presales on the new bag, says Philip Ross, co-founder and chief technology officer of MycoWorks, the San Francisco Bay Area firm whose materials formed the basis of the design. By year-end, Ross expects several more clients to debut mycelium-based merchandise. With "comparable qualities to luxury leather," mycelium can be molded to engineer "all the different verticals within fashion," he says, particularly footwear and accessories.
More than a half-dozen trailblazers are fine-tuning mycelium to create next-generation leather materials, according to the Material Innovation Initiative, a nonprofit advocating for animal-free materials in the fashion, automotive, and home-goods industries. These high-performance products can replace items derived from leather, silk, down, fur, wool, and exotic skins, says A. Sydney Gladman, the initiative's chief scientific officer.
That's only the beginning of mycelium's untapped prowess. "We expect to see an uptick in commercial leather alternative applications for mycelium-based materials as companies refine their R&D [research and development] and scale up," Gladman says, adding that "technological innovation and untapped natural materials have the potential to transform the materials industry and solve the enormous environmental challenges it faces."
Mycelium can also reduce our carbon footprint: it flourishes in indoor farms, uses agricultural waste as feedstock, and generates inherently low emissions of greenhouse gases, chief among them carbon dioxide. "We often think that when plant tissues like wood rot, that they go from something to nothing," says Jonathan Schilling, professor of plant and microbial biology at the University of Minnesota and a member of MycoWorks' Scientific Advisory Board.
But that assumption doesn't hold true for all carbon in plant tissues. When the fungi dominating the decomposition of plants fulfill their function, they transform a large portion of carbon into fungal biomass, Schilling says. That, in turn, ends up in the soil, with mycelium forming a network underneath that traps the carbon.
Producing such materials from a fungal organism also requires far less fuel-intensive processing than making styrofoam, leather, or plastic. While some fungi consist of a single cell, others are multicellular and develop as very fine, threadlike structures. A mass of these threads collectively forms a "mycelium," which can be either loose and low-density or tightly packed and high-density. "When these fungi grow at extremely high density," Schilling explains, "they can take on the feel of a solid material such as styrofoam, leather or even plastic."
Tunable and supple in the cultivation process, mycelium is also reliably sturdy in composition. "We believe that mycelium has some unique attributes that differentiate it from plastic-based and animal-derived products," says Gavin McIntyre, who co-founded Ecovative Design, an upstate New York-based biomaterials company, in 2007 with the goal of displacing some environmentally burdensome materials and making "a meaningful impact on our planet."
After inventing a type of mushroom-based packaging for all sorts of goods, the firm ventured in 2013 into manufacturing mycelium that can be adapted for textiles, because, he says, mushrooms are "nature's recycling system."
The company aims for its material—which is "so tough and tenacious" that it doesn't require any plastic add-on as reinforcement—to be generally accessible from a pricing standpoint and not confined to a luxury space. The cost, McIntyre says, would approach that of bovine leather, not the more upscale varieties of lamb and goat skins.
Already, production has taken off by leaps and bounds. In fewer than 10 days in indoor agricultural farms, "we grow large slabs of mycelium that are many feet wide and long," he says. "We are not confined to the shape or geometry of an animal," so there's a much lower scrap rate.
Decreasing the scrap rate is a major selling point. "Our customers can order the pieces to the way that they want them, and there is almost no waste in the processing," explains Ross of MycoWorks. "We can make ours thinner or thicker," depending on a client's specific needs. Growing materials locally also results in a reduction in transportation, shipping, and other supply chain costs, he says.
Yet another advantage to making things out of mycelium is its biodegradability at the end of an item's lifecycle. When a pair of old sneakers lands in a compost pile or landfill, it decomposes thanks to microbial processes that, once again, involve fungi. "It is cool to think that the same organism used to create a product can also be what recycles it, perhaps building something else useful in the same act," says biologist Schilling. That amounts to "more than a nice business model—it is a window into how sustainability works in nature."
A product can be called "sustainable" if it's biodegradable, leaves a minimal carbon footprint during production, and is also profitable, says Preeti Arya, an assistant professor at the Fashion Institute of Technology in New York City and faculty adviser to a student club of the American Association of Textile Chemists and Colorists.
On the opposite end of the spectrum, products composed of petroleum-based polymers don't biodegrade—they break down into smaller pieces or even particles. These remnants pollute landfills, oceans, and rivers, contaminating edible fish and eventually contributing to the growth of benign and cancerous tumors in humans, Arya says.
Commending the steps a few designers have taken toward bringing more environmentally conscious merchandise to consumers, she says, "I'm glad that they took the initiative because others also will try to be part of this competition toward sustainability." And consumers will take notice. "The more people become aware, the more these brands will start acting on it."
A further shift toward mycelium-based products has the capability to reap tremendous environmental dividends, says Drew Endy, associate chair of bioengineering at Stanford University and president of the BioBricks Foundation, which focuses on biotechnology in the public interest.
The continued development of "leather surrogates on a scaled and sustainable basis will provide the greatest benefit to the greatest number of people, in perpetuity," Endy says. "Transitioning the production of leather goods from a process that involves the industrial-scale slaughter of vertebrate mammals to a process that instead uses renewable fungal-based manufacturing will be more just."