We Should Resist Making “Synthetic Embryos” Too Realistic
Ethics needs context. So does science – specifically, science that aims to create bioengineered models of early human embryo development in a dish (hereafter "synthetic embryos"). Even the term "synthetic embryos" demands an explanation. What are these entities? And why would anyone want to create them?
"This knowledge may help scientists understand how certain birth defects are formed and why miscarriages often occur."
First, the research context. Synthetic embryos are stem cell-derived simulations of human post-implantation embryos, designed to mimic a stage of early development called gastrulation. That is the stage – around 14 to 15 days after fertilization – when embryos begin to form a very primitive body plan (basic dorsal-ventral and anterior-posterior axes, and distinct cell lineages). Researchers are starting to create synthetic embryos in the lab – albeit imperfect and incomplete versions – to learn how gastrulation might unfold in real human embryos embedded unseen in the womb. This knowledge may help scientists understand how certain birth defects arise and why miscarriages so often occur soon after implantation. As such, synthetic embryos are meant to be models of human embryo development, not actual embryos themselves. But will synthetic embryos ever reach the point where they are practically the same thing as "natural" human embryos? That is my concern, and it is why I think researchers should avoid creating synthetic embryos capable of doing everything natural embryos can do.
It may not be too difficult to prevent this slide from synthetic to real. Synthetic embryos must be created using sophisticated 3D culture systems that mimic the complex architecture of human embryos. These culture systems also have to incorporate precise microinjection systems to chemically trigger the symmetry-breaking events involved in early body plan formation. In short, synthetic embryos need a heavy dose of engineering to get their biological processes going and to keep them going. And as with most engineered entities, constraints can be designed into the system early to serve well-considered goals – in our case, the goal of not creating synthetic embryos that are too realistic.
"If one wants to study how car engines work, one can model an engine without also modeling the wheels, transmission, and every other car part together."
A good example of this point is found in a report published in Nature Communications, where scientists created a human stem cell-based 3D model that faithfully recapitulates the biological events surrounding post-implantation amniotic sac development. Importantly, however, the embryo model they developed lacked several key structures and therefore – despite its partial resemblance to an early human embryo – did not have complete human form and potential. While fulfilling their model's aim of revealing a previously inaccessible early developmental event, the team intentionally did not recreate the entire post-implantation human embryo because they did not want to provoke ethical concerns, as the lead author told me personally. Besides, creating a complete synthetic embryo was neither necessary nor scientifically justified for the research question they were pursuing. This example shows that researchers can create a synthetic embryo to model the specific developmental events they want to study without modeling every aspect of a developing embryo. Likewise – to use a somewhat imprecise but instructive analogy – if one wants to study how car engines work, one can model an engine without also modeling the wheels, the transmission, and every other car part.
A representative "synthetic embryo," which in some ways resembles a post-implantation embryo around 14 days after fertilization.
(Courtesy of Yue Shao)
But why should researchers resist creating complete synthetic embryos? To answer this, we need some policy context. Currently there is an embryo research rule in place – a law in many nations, in others a culturally accepted agreement – that intact human embryos must not be grown in the lab for research for longer than 14 consecutive days after fertilization, or past the formation of the primitive streak (a faint embryonic band that signals the start of gastrulation), whichever comes first. This is commonly referred to as the 14-day rule. It was established in the UK decades ago to carve out a space for meritorious human embryo research while simultaneously assuring the public that researchers would not go too far in cultivating embryos to later developmental stages before destroying them at the end of their studies. Many citizens accepting of pre-implantation stage human embryo research would not have tolerated post-implantation stage embryo use. The 14-day rule was a line in the sand, drawn to protect the advancement of embryo research, which otherwise might have been stifled without this clear stopping point. To date, the 14-day rule has not been revoked anywhere in the world, although new advances in the extended culture of natural embryos are starting to put some pressure on it.
"Perhaps the day will come when scientists don't have to apply for research funding under such a dark cloud of anti-science sentiment."
Why does this policy context matter? The creation of complete synthetic embryos could raise serious questions (some of them legal) about whether the 14-day rule applies to these lab entities. Although they can be constructed in far fewer than 14 days, they would, at least in theory, be capable of recapitulating all of a natural embryo's developmental events at the gastrulation stage, thus possibly violating the spirit of the 14-day rule. Embryo research laws and policies worldwide are not yet ready to tackle this issue. Furthermore, professional guidelines issued by the International Society for Stem Cell Research prohibit culturing any "organized embryo-like cellular structures with human organismal potential" past the formation of the primitive streak. Thus, researchers should wait until there is greater clarity on this point, or until the 14-day rule is revised through proper policy-making channels to explicitly exclude complete synthetic embryos from its reach.
I should be clear that I am not basing my recommendations on any anti-embryo-research position per se, or on any metaphysical position regarding the positive moral status of synthetic embryos. Rather, I am concerned about the potential backlash that research on complete synthetic embryos might bring to embryo research in general. I began this essay by saying that ethics needs context. The ethics of synthetic embryo research needs to be considered within the context of today's fraught political environment. Perhaps the day will come when scientists don't have to apply for research funding under such a dark cloud of anti-science sentiment. Until then, however, it is my hope that scientists can fulfill their research aims by working on an array of different, purposefully incomplete synthetic embryo models that together, in the aggregate of their published work, generate a unified portrait of human development – so that biologically complete synthetic embryo models will not be necessary.
Editor's Note: Read a different viewpoint here, written by a leading New York fertility doctor and researcher.
After his grandmother’s dementia diagnosis, one man invented a snack to keep her healthy and hydrated.
On a visit to his grandmother's nursing home in 2016, college student Lewis Hornby made a shocking discovery: Dehydration is a common (and dangerous) problem among seniors, especially those who have been diagnosed with dementia.
Hornby's grandmother, Pat, had always had difficulty keeping up her water intake as she got older, a common issue among seniors. As we age, our body composition changes, and we naturally hold less water than younger adults or children, so it's easier to become dehydrated quickly if those fluids aren't replenished. What's more, our thirst signals also diminish naturally as we age, meaning our body is not as good as it once was at letting us know that we need to rehydrate. This creates a perfect storm that often leads to dehydration. In Pat's case, her dehydration was so severe she nearly died.
When Lewis Hornby visited his grandmother at her nursing home afterward, he learned that dehydration especially affects people with dementia, as they often don't feel thirst cues at all, or may not recognize how to use a cup correctly. But while dementia patients often don't remember to drink water, it seemed to Hornby that they had less trouble remembering to eat, particularly candy.
While people with dementia often forget to drink water, they're more likely to pick up a colorful snack, Hornby found. alzheimers.org.uk
Hornby wanted to create a solution for elderly people who struggled to keep their fluid intake up. He spent the next eighteen months researching, designing, and securing funding for his project. In 2019, Hornby won a sizable grant from the Alzheimer's Society, a UK-based care and research charity for people with dementia and their caregivers. Together, through the charity's Accelerator Program, they created a bite-sized, sugar-free, edible jelly drop that looked and tasted like candy. The candy, called Jelly Drops, contained 95% water plus electrolytes, important minerals that are often lost during dehydration. The final product launched in 2020 and was an immediate success. The drops provided extra hydration to the elderly and helped keep dementia patients safe, since dehydration commonly leads to confusion, hospitalization, and sometimes even death.
Not only did Jelly Drops quickly become a favorite snack among dementia patients in the UK, but they also provided an extra boost of hydration to hospital workers during the pandemic. In NHS coronavirus wards, patients infected with the virus were regularly given Jelly Drops to keep their fluid levels normal, and staff members snacked on them as well, since long shifts and the personal protective equipment (PPE) they were required to wear often left them feeling parched.
In April 2022, Jelly Drops launched in the United States. The company continues to donate 1% of its profits to help fund Alzheimer’s research.
Last week, researchers at the University of Oxford announced that they have received funding to create a brand-new way of preventing ovarian cancer: a vaccine. The vaccine, known as OvarianVax, will teach the immune system to recognize and destroy mutated cells, one of the earliest indicators of ovarian cancer.
Understanding Ovarian Cancer
Despite advancements in medical research and treatment protocols over the last few decades, ovarian cancer still poses a significant threat to women's health. In the United States alone, more than 12,000 women die of ovarian cancer each year, and only about half of women diagnosed with the disease survive five or more years past diagnosis. Unlike cervical cancer, ovarian cancer has no routine screening test, so it often goes undetected until it has reached an advanced stage. Additionally, the primary symptoms of ovarian cancer – frequent urination, bloating, loss of appetite, and abdominal pain – can often be mistaken for other, non-cancerous conditions, delaying treatment.
An American woman has roughly a one percent chance of developing ovarian cancer throughout her lifetime. However, these odds increase significantly if she has inherited mutations in the BRCA1 or BRCA2 genes. Women who carry these mutations face a 46% lifetime risk for ovarian and breast cancers.
An Unlikely Solution
To address this escalating health concern, the organization Cancer Research UK is investing £600,000 over the next three years in research aimed at creating a vaccine that would destroy cancerous cells before they have a chance to develop any further.
Researchers at the University of Oxford are at the forefront of this initiative. With funding from Cancer Research UK, the scientists will use tissue samples from the ovaries and fallopian tubes of patients currently battling ovarian cancer. Using these samples, the Oxford team will create a vaccine that recognizes certain proteins on the surface of ovarian cancer cells, known as tumor-associated antigens. The vaccine will then train the patient's immune system to recognize these cancer markers and destroy the cells that carry them.
The Next Step
Once developed, the vaccine will first be tested in patients with the disease, to see if their ovarian tumors will shrink or disappear. Then, the vaccine will be tested in women with the BRCA1 or BRCA2 mutations as well as women in the general population without genetic mutations, to see whether the vaccine can prevent the cancer altogether.
While the vaccine still has “a long way to go,” according to Professor Ahmed Ahmed, Director of Oxford University’s ovarian cancer cell laboratory, he is “optimistic” about the results.
“We need better strategies to prevent ovarian cancer,” said Ahmed in a press release from the University of Oxford. “Currently, women with BRCA1/2 mutations are offered surgery which prevents cancer but robs them of the chance to have children afterward.
“Teaching the immune system to recognize the very early signs of cancer is a tough challenge. But we now have highly sophisticated tools which give us real insights into how the immune system recognizes ovarian cancer. OvarianVax could offer the solution.”