Technology is Redefining the Age of 'Older Mothers'
In October 2021, a woman from Gujarat, India, stunned the world when it was revealed she had had her first child through in vitro fertilization (IVF) at age 70. She had been preceded by a compatriot who, two years earlier, gave birth to twins at age 73, also with the help of IVF treatment. The oldest known mother to conceive naturally lived in the UK: in 1997, Dawn Brooke conceived a son at age 59.
These women may seem like extreme outliers, almost freaks of nature; in the US, for example, the average age of first-time mothers is 26. A few decades from now, though, the sight of 70-year-old first-time mothers may not even raise eyebrows, futurists say.
“We could absolutely have more 70-year-old mothers because we are learning how to regulate the aging process better,” says Andrew Hessel, a microbiologist and geneticist, who cowrote "The Genesis Machine," a book about “rewriting life in the age of synthetic biology,” with Amy Webb, the futurist who recently wondered why 70-year-old women shouldn’t give birth.
Technically, we're already doing this, says Hessel, pointing to a technique known as in vitro gametogenesis (IVG), which turns adult cells into sperm or egg cells. “You can think of it as the upgrade to IVF,” Hessel says. These vanguard stem cell technologies can turn even skin cells into induced pluripotent stem cells (iPSCs), which are essentially master cells capable of maturing into any human cell, be it kidney cells, liver cells, brain cells or gametes, aka eggs and sperm, says Henry T. “Hank” Greely, a Stanford law professor who specializes in the ethical, legal, and social issues of the biosciences.
In 2016, Greely wrote "The End of Sex," a book in which he described in detail the science of making gametes out of iPSCs. Greely says science will indeed enable 70-year-old new mums to mingle with mothers several decades younger at kindergartens in the not-so-distant future. And it won’t be that big of a deal.
“An awful lot of children all around the world have been raised by grandmothers for millennia. To have 70-year-olds and 30-year-olds mingling in maternal roles is not new,” he says. That said, he doubts that many women will want to have a baby in the eighth decade of their life, even if science allows it. “Having a baby and raising a child is hard work. Even if 1% of all mothers are over 65, they aren’t going to change the world,” Greely says. Mothers over 70 will be a minor blip, statistically speaking, he predicts. But one thing is certain: the technology is here.
And more technologies for the same purpose could be on the way. In March 2021, researchers from Monash University in Melbourne, Australia, published research in Nature in which they successfully reprogrammed skin cells into a three-dimensional cellular structure that was morphologically and molecularly similar to a human embryo: the iBlastoid. In compliance with Australian law and international guidelines referencing the “primitive streak rule," which bans the use of embryos older than 14 days in scientific research, the Monash scientists stopped growing their iBlastoids in vitro on day 11.
“The research was both cutting-edge and controversial, because it essentially created a new human life, not for the purpose of a patient who's wanting to conceive, but for basic research,” says Lindsay Wu, a senior lecturer in the School of Medical Sciences at the University of New South Wales (UNSW), in Kensington, Australia. If you really want to be sure that what you have created is an embryo, you need to let it develop into a viable baby. “This is the real proof in the pudding,” says Wu, who runs UNSW’s Laboratory for Ageing Research. But then you reach a stage where, for ethical reasons, you have to abort it. “Fiddling here a bit too much?” he asks. Wu believes there are other approaches to tackling age-related fertility decline that are less morally troubling.
He is working on them himself. Why do women in their mid- to late 30s, who are at peak physical health in almost every other regard, have trouble conceiving? Wu and his team posed that question in a research paper published in 2020 in Cell Reports. The simple answer is the egg cell. The average girl at puberty has between 300,000 and 400,000 eggs; by around age 37, the same woman has only 25,000 eggs left. Things only go downhill from there. So what ails the egg cells?
The UNSW team found that the levels of key molecules called NAD+ precursors, which are essential to the metabolism and genome stability of egg cells, decline with age. The team proceeded to add these vitamin-like substances back into the drinking water of reproductively aged, infertile lab mice, which then had babies.
“It's an important proof of concept,” says Wu. He is investigating how safe it is to replicate the experiment with humans in two ongoing studies. The ultimate goal is to restore the quality of egg cells that are left in patients in their late 30s and early- to mid-40s, says Wu. He sees the goal of getting pregnant for this age group as less ethically troubling, compared to 70-year-olds.
But what is ethical, anyway? “It is a tricky word,” says Hessel. He differentiates between ethics, which represent a personal position and may thus be more transient, and morality, the longer-lasting principles embraced across a society, such as “Thou shalt not kill.” Unprecedented advances often provoke fear and antagonism until time passes and they just become… ordinary. When IVF pioneer Landrum Shettles tried to perform IVF in 1973, the chairman of Columbia’s College of Physicians and Surgeons halted the procedure at the last moment. Today, almost every country in the world has IVF clinics, and the global IVF services market is clearly a growth industry.
Besides, you don’t have a baby at 70 by accident: you really want it, Greely and Hessel agree. And by that age, mothers may be wiser and more financially secure, Hessel says (though he is quick to add that even the pregnancy of his own wife, who had her child at 40, was a high-risk one).
As a research question, figuring out whether older mothers are better than younger ones, or vice versa, entails too many confounding variables, says Greely. And why should we focus on who’s the better mother anyway? “We've had 70-year-old and 80-year-old fathers forever. Why should people have that much trouble getting used to mothers doing the same?” Greely wonders. For some women, having a child at an older age would be comforting; maybe that’s what matters.
And the technology to enable older women to have children is already here or coming very soon. That, perhaps, matters even more. Researchers have already created mice, and their offspring, entirely in the lab. “Doing this to produce human eggs is similar," says Hessel. "It is harder to collect tissues, and the inducing cocktails are different, but steady advances are being made." He predicts that demand for fertility treatments will keep financing research and development in the area. If ethical concerns don’t block them, he says, big leaps will be made: it is not far-fetched to believe that the first baby produced from lab-grown eggs will be born within the next decade.
In a 2020 op-ed in Stat, Greely argued that we’ve already overcome the technical barriers to human cloning, but no one is really talking about it. Likewise, scientists are also working on enabling 70-year-old women to have babies, says Hessel, yet most commentators are keeping very quiet about it. At least so far.
As Our AI Systems Get Better, So Must We
As the power and capability of our AI systems increase by the day, the essential question we now face is what constitutes peak human. If we stay where we are while the AI systems we are unleashing continually get better, they will meet and then exceed our capabilities in an ever-growing number of domains. But while some technology visionaries like Elon Musk call for us to slow down the development of AI systems to buy time, this approach alone will simply not work in our hyper-competitive world, particularly when the potential benefits of AI are so great and our frameworks for global governance are so weak. In order to build the future we want, we must also become ever better humans.
The list of activities we once saw as uniquely human in which AIs have now surpassed us is long and growing. First, AI systems beat our best chess players, then our best Go players, then our best champions of multiplayer poker. They can see patterns far better than we can, generate medical and other hypotheses most human specialists miss, predict and map out new cellular structures, and even generate beautiful and, yes, creative art.
A recent paper by Microsoft researchers analyzing the significant leap in capabilities of OpenAI’s latest model, GPT-4, asserted that the algorithm can “solve novel and difficult tasks that span mathematics, coding, vision, medicine, law, psychology and more, without needing any special prompting.” Calling this functionality “strikingly close to human-level performance,” the authors conclude it “could reasonably be viewed as an early (yet still incomplete) version of an artificial general intelligence (AGI) system.”
The concept of AGI has been around for decades. In its common use, the term suggests a time when individual machines can do many different things at a human level, not just one thing like playing Go or analyzing radiological images. Debating when AGI might arrive, a favorite pastime of computer scientists for years, has now become outdated.
We already have AI algorithms and chatbots that can do lots of different things. Based on the generalist definition, in other words, AGI is essentially already here.
Unfettered by the evolved capacity and storage constraints of our brains, AI algorithms can access nearly all of the digitized cultural inheritance of humanity since the dawn of recorded history and have increasing access to growing pools of digitized biological data from across the spectrum of life.
With these ever-larger datasets, rapidly increasing computing and memory power, and new and better algorithms, our AI systems will keep getting better faster than most of us can today imagine. These capabilities have the potential to help us radically improve our healthcare, agriculture, and manufacturing, make our economies more productive and our development more sustainable, and do many important things better.
Soon, they will learn how to write their own code. Like human children, in other words, AI systems will grow up. But even that doesn’t mean our human goose is cooked.
Just as with dolphins and dogs, these alternate forms of intelligence will be uniquely theirs, not a lesser or greater version of ours. There are lots of things AI systems can't do and will never be able to do, because our AI algorithms, for better and for worse, will never be human. Our embodied human intelligence is its own thing.
Our human intelligence is uniquely ours, built on the capacities we have developed in our 3.8-billion-year journey from single-celled organisms to us. Our brains and bodies represent continuous adaptations on earlier models, which is why our skeletal systems look like those of lizards and our brains like those of most other mammals, with some extra cerebral cortex mixed in. Human intelligence isn’t some type of disembodied function but the inextricable manifestation of our evolved physical reality. It includes our sensory analytical skills and all of our animal instincts, intuitions, drives, and perceptions. Disembodied machine intelligence is something different from what we have evolved and possess.
Because of this, some linguists including Noam Chomsky have recently argued that AI systems will never be intelligent as long as they are just manipulating symbols and mathematical tokens without any inherent understanding. Nothing could be further from the truth. Anyone interacting with even first-generation AI chatbots quickly realizes that while these systems are far from perfect or omniscient and can sometimes be stupendously oblivious, they are surprisingly smart and versatile and will get more so… forever. We have little idea even how our own minds work, so judging AI systems based on their output is relatively close to how we evaluate ourselves.
Anyone not awed by the potential of these AI systems is missing the point. AI’s newfound capacities demand that we work urgently to establish norms, standards, and regulations at all levels from local to global to manage the very real risks. Pausing our development of AI systems now doesn’t make sense, however, even if it were possible, because we have no sufficient ways of uniformly enacting such a pause, no plan for how we would use the time, and no common framework for addressing global collective challenges like this.
But if all we feel is a passive awe for these new capabilities, we will also be missing the point.
Human evolution, biology, and cultural history are not just some kind of accidental legacy, disability, or parlor trick, but our inherent superpower. Our ancestors outcompeted rivals for billions of years to make us so well suited to the world we inhabit and helped build. Our social organization at scale has made it possible for us to forge civilizations of immense complexity, engineer biology and novel intelligence, and extend our reach to the stars. Our messy, embodied, intuitive, social human intelligence is roughly mimicable by AI systems but, by definition, never fully replicable by them.
Once we recognize that both AI systems and humans have unique superpowers, the essential question becomes what each of us can do better than the other and what humans and AIs can best do in active collaboration. We still don't know. The future of our species will depend upon our ability to safely, dynamically, and continually figure that out.
As we do, we'll learn that many of our ideas and actions are made up of parts, some of which will prove essentially human and some of which can be better achieved by AI systems. Those in every walk of work and life who most successfully identify the optimal contributions of humans, AIs, and the two together, and who build systems and workflows empowering humans to do human things, machines to do machine things, and humans and machines to work together in ways maximizing the respective strengths of each, will be the champions of the 21st century across all fields.
The dawn of the age of machine intelligence is upon us. It’s a quantum leap equivalent to the domestication of plants and animals, industrialization, electrification, and computing. Each of these revolutions forced us to rethink what it means to be human, how we live, and how we organize ourselves. The AI revolution will happen more suddenly than these earlier transformations but will follow the same general trajectory. Now is the time to aggressively prepare for what is fast heading our way, including by active public engagement, governance, and regulation.
AI systems will not replace us, but, like these earlier technology-driven revolutions, they will force us to become different humans as we co-evolve with our technology. We will never reach peak human in our ongoing evolutionary journey, but we’ve got to manage this transition wisely to build the type of future we’d like to inhabit.
Alongside our ascending AIs, we humans still have a lot of climbing to do.
Story by Big Think
Our gut microbiome plays a substantial role in our health and well-being. Most research, however, focuses on bacteria, rather than the viruses that hide within them. Now, research from the University of Copenhagen, newly published in Nature Microbiology, found that people who live past age 100 have a greater diversity of bacteria-infecting viruses in their intestines than younger people. Furthermore, they found that the viruses are linked to changes in bacterial metabolism that may support mucosal integrity and resistance to pathogens.
The microbiota and aging
In the early 1970s, scientists discovered that the composition of our gut microbiota changes as we age. Recent studies have found that the changes are remarkably predictable and follow a pattern: The microbiota undergoes rapid, dramatic changes as toddlers transition to solid foods; further changes become less dramatic during childhood as the microbiota strikes a balance between the host and the environment; and as that balance is achieved, the microbiota remains mostly stable during our adult years (ages 18-60). However, that stability is lost as we enter our elderly years, and the microbiome undergoes dramatic reorganization. This discovery led scientists to question what causes this change and what effect it has on health.
“We are always eager to find out why some people live extremely long lives. Previous research has shown that the intestinal bacteria of old Japanese citizens produce brand-new molecules that make them resistant to pathogenic — that is, disease-promoting — microorganisms. And if their intestines are better protected against infection, well, then that is probably one of the things that cause them to live longer than others,” said Joachim Johansen, a researcher at the University of Copenhagen.
In 2021, a team of Japanese scientists set out to characterize the effect of this change on older people’s health. They specifically wanted to determine if people who lived to be over 100 years old — that is, centenarians — underwent changes that provided them with unique benefits. They discovered centenarians have a distinct gut community enriched in microorganisms that synthesize potent antimicrobial molecules that can kill multidrug-resistant pathogens, including Clostridioides difficile and Enterococcus faecium. In other words, the late-life shift in microbiota reduces an older person’s susceptibility to common gut pathogens.
Viruses can alter the genes of bacteria
Although the late-in-life microbiota change could be beneficial to health, it remained unclear what facilitated this shift. To solve this mystery, Johansen and his colleagues turned their attention to an often overlooked member of the microbiome: viruses. “Our intestines contain billions of viruses living inside bacteria, and they could not care less about human cells; instead, they infect the bacterial cells. And seeing as there are hundreds of different types of bacteria in our intestines, there are also lots of bacterial viruses,” said Simon Rasmussen, Johansen’s research advisor.
For decades, scientists have explored the possibility of phage therapy — that is, using viruses that infect bacteria (called bacteriophages or simply phages) to kill pathogens. However, bacteriophages can also enhance the bacteria they infect. For example, they can provide genes that help their bacterial host attack other bacteria or provide new metabolic capabilities. Both of these can change which bacteria colonize the gut and, in turn, protect against certain disease states.
Intestinal viruses give bacteria new abilities
Johansen and his colleagues were interested in what types of viruses centenarians had in their gut and whether those viruses carried genes that altered bacterial metabolism. They compared fecal samples from healthy centenarians (aged 100 and over) with samples from younger people (aged 18 to 100). They found that the centenarians had a more diverse virome, including previously undescribed viral genera.
They also revealed an enrichment of genes supporting key steps in the sulfate metabolic pathway. The authors speculate that this translates to increased levels of microbially derived sulfide, which may lead to health-promoting outcomes, such as supporting mucosal integrity and resistance to potential pathogens.
“We have learned that if a virus pays a bacterium a visit, it may actually strengthen the bacterium. The viruses we found in the healthy Japanese centenarians contained extra genes that could boost the bacteria,” said Johansen.
Simon Rasmussen added, “If you discover bacteria and viruses that have a positive effect on the human intestinal flora, the obvious next step is to find out whether only some or all of us have them. If we are able to get these bacteria and their viruses to move in with the people who do not have them, more people could benefit from them.”
This article originally appeared on Big Think.