Top Fertility Doctor: Artificially Created Sperm and Eggs "Will Become Normal" One Day
Imagine two men making a baby. Or two women. Or an infertile couple. Or an older woman whose eggs are no longer viable. None of these people could have a baby today without the help of an egg or sperm donor.
But in the future, it may be possible for them to reproduce using only their own genetic material, thanks to an emerging technology called IVG, or in vitro gametogenesis.
Researchers are learning how to reprogram adult human cells like skin cells to become lab-created egg and sperm cells, which could then be joined to form an embryo. In other words, cells scraped from the inside of your cheek could one day be manipulated to become either eggs or sperm, no matter your gender or your reproductive fitness.
In 2016, Japanese scientists proved that the concept could be successfully carried out in mice. Now some experts, like Dr. John Zhang, the founder and CEO of New Hope Fertility Center in Manhattan, say it's just "a matter of time" before the method is also made to work in humans.
Such a technological tour de force would upend our most basic assumptions about human reproduction and biology. Combined with techniques like gene editing, these tools could eventually enable prospective parents to have an unprecedented level of choice and control over their children's origins. It's a wildly controversial notion, and an especially timely one now that a Chinese scientist has announced the birth of the first allegedly CRISPR-edited babies. (The claims remain unverified.)
Zhang himself is no stranger to controversy. In 2016, he stunned the world when he announced the birth of a baby conceived using the DNA of three people, a landmark procedure intended to prevent the baby from inheriting a devastating neurological disease. (Zhang went to a clinic in Mexico to carry out the procedure because it is prohibited in the U.S.) Zhang's other achievements to date include helping a 49-year-old woman have a baby using her own eggs and restoring a young woman's fertility through an ovarian tissue transplant surgery.
Zhang recently sat down with our Editor-in-Chief in his New York office overlooking Columbus Circle to discuss the fertility world's latest provocative developments. Here are his top ten insights:
1) On a Chinese scientist's claim of creating the first CRISPR-edited babies:
I'm glad that we made a first move toward a clinical application of this technology for mankind. Somebody has to do this. Whether this was a good case or not, there is still time to find out.
Clearly it will be beneficial to mankind, but it's a matter of how and when the work is done. Like any scientific advance, it has to be done in a very responsible way.
Today's response is identical to when the world's first IVF baby was announced in 1978. The major news media didn't take it seriously and thought it was evil, wanted to keep a distance from IVF. Many countries even abandoned IVF, but today you see it is a normal practice. And it took almost 40 years [for the researchers] to win a Nobel Prize.
I think we need more time to understand how this work was done medically, ethically, and let the scientist have the opportunity to present how it was done and let a scientific journal publish the paper. Before these become available, I don't think we should start being upset, scared, or giving harsh criticism.
2) On the international outcry in response to the news:
I feel we are in scientific shock, with many thinking it came too fast, too soon. We all embrace modern technology, but when something really comes along, we fear it. In an old Chinese saying, one of the masters always dreamed of seeing the dragon, and when the dragon really came, he got scared.
Dr. John Zhang, the founder and CEO of New Hope Fertility Center in Manhattan, pictured in his office.
3) On the Western world's perception that Chinese scientists sometimes appear to discount ethics in favor of speedy breakthroughs:
I think this perception is not fair. I don't think China is very casual. It's absolutely not what people think. I don't want people to feel that this case [of CRISPR-edited babies] will mean China has less standards over how human reproduction should be performed. Just because this happened, it doesn't mean in China you can do anything you want.
As far as the regulation of IVF clinics, China is probably the most strictly regulated of any country I know in this world.
4) On China's first public opinion poll gauging attitudes toward gene-edited babies, indicating that more than 60 percent of survey respondents supported using the technology to prevent inherited diseases, but not to enhance traits:
There is a sharp contrast between the general public and the professional world. Being a working health professional and an advocate of scientists working in this field, it is very important to be ethically responsible for what we are doing, but my own feeling is that from time to time we may not take into consideration what the patient needs.
5) On how the three-parent baby is doing today, several years after his birth:
No news is good news.
6) On the potentially game-changing research to develop artificial sperm and eggs:
First of all I think that anything that's technically possible, as long as you are not harmful to other people, to other societies, as long as you do it responsibly, and this is a legitimate desire, I think eventually it will become reality.
My research for now is really to try to overcome the very next obstacle in our field, which is how to let a lady age 44 or older have a baby with her own genetic material.
Practically 99 percent of women over age 43 will never make a baby on their own. And after age 47, we usually don't offer donor egg IVF anymore.
But with improved longevity and quality of life, the lifespan of females continues to increase. In Japan, the average for females is about 89 years old. So for more than half of your life, you will not be able to produce a baby, which is quite significant in the animal kingdom. In most of the animal kingdom, their reproductive life is very much the same as their life, but then you can argue that in the animal kingdom, unlike for a human being, it doesn't take such a long time for them to contribute to society, because once you know how to hunt and look for food, you're done.
"I think this will become a major ethical debate: whether we should let an older lady have a baby at a very late state of her life."
But humans are different. You need to go to college, get certain skills. It takes 20 years to really bring a human being up to become useful to society. That's why the mom and dad are not supposed to have the same reproductive life equal to their real life.
I think this will become a major ethical debate: whether we should let an older lady have a baby at a very late state of her life and leave the future generation in a very vulnerable situation in which they may lack warm caring, proper guidance, and proper education.
7) On using artificial gametes to grant more reproductive choices to gays and lesbians:
I think it is totally possible to have two sperm make a baby, and two eggs make babies.
If we have two guys, one guy has to produce eggs; or with two girls, one would have to produce sperm. Basically you are creating artificial gametes, or converting gametes from sperm to egg or from egg to sperm. Which may not necessarily be very difficult. The key is to be able to do nuclear reprogramming.
So why can two sperm not make offspring now? You get exactly half of your genes from each parent. The genes have their own imprinting that say "made in mom," "made in dad." The two sperm would say "made in dad," "made in dad." If I can erase the "made in dad," and say "made in mom," then these sperm can make offspring.
8) On how close science is to creating artificial gametes for clinical use in pregnancies:
It's very hard to say until we accomplish it. It could be very quick. It could be it takes a long time. I don't want to speculate.
"I think these technologies are the solid foundation just like when we designed the computer -- we never thought a computer would become the iPhone."
9) On whether there should be ethical red lines drawn by authorities or whether the decisions should be left to patients and scientists:
I think we cannot believe a hundred percent in the scientist and the patient but it should not be 100 percent authority. It should be coming from the whole of society.
10) On his expectations for the future:
We are living in a very exciting world. I think that all these technologies can really change the way of mankind and also are not just for baby-making. The research, the experience, the mechanism we learn from these technologies, they will shine some great lights into our long-held dream of being a healthy population that is cancer-free and lives a long life, let's say 120 years.
I think these technologies are the solid foundation just like when we designed the computer -- we never thought a computer would become the iPhone. Imagine making a computer 30 years ago, that this little chip will change your life.
Kira Peikoff was the editor-in-chief of Leaps.org from 2017 to 2021. As a journalist, her work has appeared in The New York Times, Newsweek, Nautilus, Popular Mechanics, The New York Academy of Sciences, and other outlets. She is also the author of four suspense novels that explore controversial issues arising from scientific innovation: Living Proof, No Time to Die, Die Again Tomorrow, and Mother Knows Best. Peikoff holds a B.A. in Journalism from New York University and an M.S. in Bioethics from Columbia University. She lives in New Jersey with her husband and two young sons. Follow her on Twitter @KiraPeikoff.
Staying well in the 21st century is like playing a game of chess
This article originally appeared in One Health/One Planet, a single-issue magazine that explores how climate change and other environmental shifts are increasing vulnerabilities to infectious diseases by land and by sea. The magazine probes how scientists are making progress with leaders in other fields toward solutions that embrace diverse perspectives and the interconnectedness of all lifeforms and the planet.
On July 30, 1999, the Centers for Disease Control and Prevention published a report comparing data on the control of infectious disease from the beginning of the 20th century to the end. The data showed that deaths from infectious diseases declined markedly. In the early 1900s, pneumonia, tuberculosis and diarrheal diseases were the three leading killers, accounting for one-third of total deaths in the U.S.—with 40 percent being children under five.
Mass vaccinations, the discovery of antibiotics and overall sanitation and hygiene measures eventually eradicated smallpox, beat down polio, curbed cholera, nearly rid the developed world of tuberculosis and extended the U.S. life expectancy by 25 years. By 1997, population health in the U.S. had shifted: cancer, diabetes and heart disease were now the leading causes of death.
The control of infectious diseases is considered to be one of the “10 Great Public Health Achievements.” Yet on the brink of the 21st century, new trouble was already brewing. Hospitals were seeing periodic cases of antibiotic-resistant infections. Novel viruses, or those that previously didn’t afflict humans, began to emerge, causing outbreaks of West Nile, SARS, MERS or swine flu. In the years that followed, tuberculosis made a comeback, at least in certain parts of the world. What we didn’t take into account was the very concept of evolution: as we built better protections, our enemies eventually boosted their attacking prowess, so soon enough we found ourselves on the defensive once again.
At the same time, new, previously unknown or extremely rare disorders began to rise, such as autoimmune or genetic conditions. Two decades later, scientists began thinking about health differently—not as a static achievement guaranteed to last, but as something dynamic and constantly changing—and sometimes, for the worse.
What emerged since then is a different paradigm that makes our interactions with the microbial world more like a biological chess match, says Victoria McGovern, a biochemist and program officer for the Burroughs Wellcome Fund’s Infectious Disease and Population Sciences Program. In this chess game, humans may make a clever strategic move, which could involve creating a new vaccine or a potent antibiotic, but that advantage is fleeting. At some point, the organisms we are up against could respond with a move of their own—such as developing resistance to medication or genetic mutations that attack our bodies. Simply eradicating the “opponent,” or the pathogenic microbes, as efficiently as possible isn’t enough to keep humans healthy long-term.
Instead, scientists should focus on studying the complexity of interactions between humans and their pathogens. “We need to better understand the lifestyles of things that afflict us,” McGovern says. “The solutions are going to be in understanding various parts of their biology so we can influence how they behave around our systems.”
What is being proposed will require a pivot to basic biology and other disciplines that have suffered from lack of research funding in recent years. Yet, according to McGovern, the research teams of funded proposals are answering bigger questions. “We look for people exploring questions about hosts and pathogens, and what happens when they touch, but we’re also looking for people with big ideas,” she says. For example, if one specific infection causes a chain of pathological events in the body, can other infections cause them too? And if we find a way to break that chain for one pathogen, can we play the same trick on another? “We really want to see people thinking of not just one experiment but about big implications of their work,” McGovern says.
Jonah Cool, a cell biologist, geneticist and science officer at the Chan Zuckerberg Initiative, says that it’s necessary to define what constitutes a healthy organism and how it overcomes infections or environmental assaults, such as pollution from forest fires or toxins from industrial smokestacks. An organism that catches a disease isn’t necessarily an unhealthy one, as long as it fights it off successfully—an ability that arises from the complex interplay of its genes, the immune system, age, stress levels and other factors. Modern science allows many of these factors to be measured, recorded and compared. “We need a data-driven, deep-phenotyping approach to defining healthy biological systems and their responses to insults—which can be infectious disease or environmental exposures—and their ability to navigate their way through that space,” Cool says.
Genetics and cell biology, combined with imaging techniques that allow one to see tissues and individual cells in action, will enable scientists to define and quantify what it means to be healthy at the molecular level. “As a geneticist and cell biologist, I believe in all these molecular underpinnings and how they arise in phenotypic differences in cells, genes, proteins—and how their combinations form complex cellular states,” Cool says.
Julie Graves, a physician, public health consultant and former adjunct professor of management, policy and community health at the University of Texas Health Science Center in Houston, stresses the necessity of nutritious diets. According to the Rockefeller Food Initiative, “poor diet is the leading risk factor for disease, disability and premature death in the majority of countries around the world.” Adequate nutrition is critical for maintaining human health and life. Yet, Western diets are often low in essential nutrients, high in calories and heavy on processed foods. Overconsumption of these foods has contributed to high rates of obesity and chronic disease in the U.S. In fact, according to the 2018 National Health Interview Survey, more than half of American adults have at least one chronic disease and 27 percent have more than one, conditions that increase vulnerability to COVID-19 infections.
Further, the contamination of our food supply with various agricultural and industrial toxins—petrochemicals, pesticides, PFAS and others—has implications for morbidity, mortality, and overall quality of life. “These chemicals are insidiously in everything, including our bodies,” Graves says—and they are interfering with our normal biological functions. “We need to stop how we manufacture food,” she adds, and rid our sustenance of these contaminants.
According to the Humane Society of the United States, factory farms account for nearly 40 percent of methane emissions. Concentrated animal feeding operations, or CAFOs, may serve as breeding grounds for pandemics, scientists warn, so humans should research better ways to raise and treat livestock. Diego Rose, a professor of food and nutrition policy at Tulane University School of Public Health & Tropical Medicine, and his colleagues found that “20 percent of Americans’ diets account for about 45 percent of the environmental impacts [that come from food].” A subsequent study explored the impacts of specific foods and found that substituting chicken for beef lowers an individual’s carbon footprint by nearly 50 percent and cuts water usage by 30 percent. Notably, eating too much red meat has also been associated with a variety of illnesses.
In some communities, the option to swap food types is limited or impossible. For example, “many populations live in relative food deserts where there’s not a local grocery store that has any fresh produce,” says Louis Muglia, the president and CEO of Burroughs Wellcome. Individuals in these communities suffer from an insufficient intake of beneficial macronutrients, and they’re “probably being exposed to phenols and other toxins that are in the packaging.” An equitable, sustainable and nutritious food supply will be vital to humanity’s wellbeing in the era of climate change, unpredictable weather and spillover events.
A recent report by See Change Institute and the Climate Mental Health Network showed that people who are experiencing socioeconomic inequalities, including many people of color, contribute the least to climate change, yet they are impacted the most. For example, people in low-income communities are disproportionately exposed to vehicle emissions, Muglia says. Through its Climate Change and Human Health Seed Grants program, Burroughs Wellcome funds research that aims to understand how various factors related to climate change and environmental chemicals contribute to premature births, which are associated with health vulnerabilities over the course of a person’s life, and to map hot spots of such risk.
“It’s very complex, the combinations of socio-economic environment, race, ethnicity and environmental exposure, whether that’s heat or toxic chemicals,” Muglia explains. “Disentangling those things really requires a very sophisticated, multidisciplinary team. That’s what we’ve put together to describe where these hotspots are and see how they correlate with different toxin exposure levels.”
In addition to mapping the risks, researchers are developing novel therapeutics that will be crucial to our arsenal, but we will have to be smarter at designing and using them. We will need more potent, better-working monoclonal antibodies. Instead of directly attacking a pathogen, we may have to learn to stimulate the immune system—training it to fight the disease-causing microbes on its own. And rather than indiscriminately killing all bacteria with broad-spectrum drugs, we will need more targeted medications. “Instead of wiping out the entire gut flora, we will need to come up with ways that kill harmful bacteria but not healthy ones,” Graves says. Training our immune systems to recognize and react to pathogens by way of vaccination will keep us ahead of our biological opponents, too. “Continued development of vaccines against infectious diseases is critical,” says Graves.
With all of the unpredictable events that lie ahead, it is difficult to foresee what achievements in public health will be reported at the end of the 21st century. Yet, technological advances, better modeling and pursuing bigger questions in science, along with education and working closely with communities will help overcome the challenges. The Chan Zuckerberg Initiative displays an optimistic message on its website: “Is it possible to cure, prevent, or manage all diseases by the end of this century? We think so.” Cool shares the view of his employer—and believes that science can get us there. Just give it some time and a chance. “It’s a big, bold statement,” he says, “but the end of the century is a long way away.”

Lina Zeldovich has written about science, medicine and technology for Popular Science, Smithsonian, National Geographic, Scientific American, Reader’s Digest, the New York Times and other major national and international publications. A Columbia J-School alumna, she has won several awards for her stories, including the ASJA Crisis Coverage Award for Covid reporting, and has been a contributing editor at Nautilus Magazine. In 2021, Zeldovich released her first book, The Other Dark Matter, published by the University of Chicago Press, about the science and business of turning waste into wealth and health. You can find her on http://linazeldovich.com/ and @linazeldovich.
Alzheimer’s prevention may be less about new drugs, more about income, zip code and education
That your risk of Alzheimer’s disease depends on your salary, what you ate as a child, or the block where you live may seem implausible. But researchers are discovering that social determinants of health (SDOH) play an outsized role in Alzheimer’s disease and related dementias, possibly more than age, and new strategies are emerging for how to address these factors.
At the 2022 Alzheimer’s Association International Conference, a series of presentations offered evidence that a string of socioeconomic factors—such as employment status, social support networks, education and home ownership—significantly affected dementia risk, even when adjusting data for genetic risk. What’s more, memory declined more rapidly in people who earned lower wages and more slowly in people whose parents had higher socioeconomic status.
In 2020, a first-of-its-kind study in JAMA linked Alzheimer’s incidence to “neighborhood disadvantage,” which is based on SDOH indicators. Through autopsies, researchers analyzed brain tissue markers related to Alzheimer’s and found an association with these indicators. In 2022, Ryan Powell, the lead author of that study, published further findings that neighborhood disadvantage was connected with having more neurofibrillary tangles and amyloid plaques, the main pathological features of Alzheimer's disease.
As of yet, little is known about the biological processes behind this, says Powell, director of data science at the Center for Health Disparities Research at the University of Wisconsin School of Medicine and Public Health. “We know the association but not the direct causal pathway.”
The corroborative findings keep coming. In a Nature study published a few months after Powell’s study, every social determinant investigated affected Alzheimer’s risk except for marital status. The links were highest for income, education, and occupational status.
The potential for prevention is significant. One in three older adults dies with Alzheimer's or another dementia—more than breast and prostate cancers combined. Further, a 2020 report from the Lancet Commission determined that about 40 percent of dementia cases could theoretically be prevented or delayed by managing the risk factors that people can modify.
Take inactivity: in a 2022 JAMA study, older adults who took 9,800 steps daily were half as likely to develop dementia over the next seven years. Hearing loss, another risk factor that can be managed, accounts for about 9 percent of dementia cases.
Clinical trials on new Alzheimer’s medications get all the headlines but preventing dementia through policy and public health interventions should not be underestimated. Simply slowing the course of Alzheimer’s or delaying its onset by five years would cut the incidence in half, according to the Global Council on Brain Health.
Minorities Hit the Hardest
The World Health Organization defines SDOH as “conditions in which people are born, work, live, and age, and the wider set of forces and systems shaping the conditions of daily life.”
Anyone who exists on processed food, smokes cigarettes, or skimps on sleep has heightened risks for dementia. But minority groups get hit harder. Older Black Americans are twice as likely to have Alzheimer’s or another form of dementia as white Americans; older Hispanics are about one and a half times more likely.
This is due in part to higher rates of diabetes, obesity, and high blood pressure within these communities. These diseases are linked to Alzheimer’s, and SDOH factors multiply the risks. Blacks and Hispanics earn less income on average than white people. This means they are more likely to live in neighborhoods with limited access to healthy food, medical care, and good schools, and suffer greater exposure to noise (which impairs hearing) and air pollution—additional risk factors for dementia.
Plus, when Black people are diagnosed with dementia, their cognitive impairment and neuropsychiatric symptoms are more advanced than in white patients. Why? Some African-Americans delay seeing a doctor because of perceived discrimination and a sense they will not be heard, says Carl V. Hill, chief diversity, equity, and inclusion officer at the Alzheimer’s Association.
Misinformation about dementia is another issue in Black communities. Many assume that Alzheimer’s is purely genetic or age-related, not realizing that diet and physical activity can improve brain health, Hill says.
African Americans are severely underrepresented in clinical trials for Alzheimer’s, too. So, researchers miss the opportunity to learn more about health disparities. “It’s a bioethical issue,” Hill says. “The people most likely to have Alzheimer’s aren’t included in the trials.”
The Cure: Systemic Change
People think of lifestyle as a choice but there are limitations, says Muniza Anum Majoka, a geriatric psychiatrist and assistant professor of psychiatry at Yale University, who published an overview of SDOH factors that impact dementia. “For a lot of people, those choices [to improve brain health] are not available,” she says. If you don’t live in a safe neighborhood, for example, walking for exercise is not an option.
Hill wants to see the focus of prevention shift from individual behavior change to ensuring everyone has access to the same resources. Advice about healthy eating only goes so far if someone lives in a food desert. Systemic change also means increasing the number of minority physicians and recruiting minorities in clinical drug trials so studies will be relevant to these communities, Hill says.
Based on SDOH impact research, raising education levels has the most potential to prevent dementia. One theory is that highly educated people have a greater brain reserve that enables them to tolerate pathological changes in the brain, thus delaying dementia, says Majoka. Being curious, learning new things and problem-solving also contribute to brain health, she adds. Plus, having more education may be associated with higher socioeconomic status, more access to accurate information and healthier lifestyle choices.
New Strategies
The chasm between what researchers know about brain health and how the knowledge is being applied is huge. “There’s an explosion of interest in this area. We’re just in the first steps,” says Powell. One day, he predicts that physicians will manage Alzheimer’s through precision medicine customized to the patient’s specific risk factors and needs.
Raina Croff, assistant professor of neurology at Oregon Health & Science University School of Medicine, created the SHARP (Sharing History through Active Reminiscence and Photo-imagery) walking program to forestall memory loss in African Americans with mild cognitive impairment or early dementia.
Participants and their caregivers walk in historically black neighborhoods three times a week over six months. A smart tablet provides information about “Memory Markers” they pass, such as the route of a civil rights march. People celebrate their community and culture while “brain health is running in the background,” Croff says.
Photos and memory prompts engage participants in the SHARP program. (Photo: OHSU/Kristyna Wentz-Graff)
The project began in 2015 as a pilot study in Croff’s hometown of Portland, Ore., expanded to Seattle, and will soon start in Oakland, Calif. “Walking is good for slowing [brain] decline,” she says. A post-study assessment of 40 participants in 2017 showed that half had higher cognitive scores after the program; 78 percent had lower blood pressure; and 44 percent lost weight. Those with mild cognitive impairment showed the most gains. The walkers also reported improved mood and energy along with increased involvement in other activities.
It’s never too late to reap the benefits of working your brain and being socially engaged, Majoka says.
In Milwaukee, the Wisconsin Alzheimer’s Institute launched The Amazing Grace Chorus® to stave off cognitive decline in seniors. People in early stages of Alzheimer’s practice and perform six concerts each year. The activity provides opportunities for social engagement, mental stimulation, and a support network. Among the benefits, 55 percent of participants reported better communication at home and nearly half said they got involved with more activities after joining the chorus.
Private companies are offering intervention services to healthcare providers and insurers to manage SDOH, too. One such service, MyHello, makes calls to at-risk people to assess their needs—be it food, transportation or simply a friendly voice. Having a social support network is critical for seniors, says Majoka, noting there was a steep decline in cognitive function among isolated elders during Covid lockdowns.
About 1 in 9 Americans age 65 or older live with Alzheimer’s today. With a surge in people with the disease predicted, public health professionals have to think more broadly about resource targets and effective intervention points, Powell says.
Beyond breakthrough pills, that is. Like Dorothy discovering that happiness was in her own backyard in Kansas all along, we are beginning to learn that preventing Alzheimer’s is within our reach, if only we recognize it.