Technology is Redefining the Age of 'Older Mothers'
In October 2021, a woman from Gujarat, India, stunned the world when it was revealed she had given birth to her first child, conceived through in vitro fertilization (IVF), at age 70. She had been preceded by a compatriot who, two years earlier, gave birth to twins at age 73, again with the help of IVF. The oldest known mother to conceive naturally lived in the UK: in 1997, Dawn Brooke conceived a son at age 59.
These women may seem like extreme outliers, almost freaks of nature; in the US, for example, the average age of first-time mothers is 26. A few decades from now, though, the sight of 70-year-old first-time mothers may not even raise eyebrows, say futurists.
“We could absolutely have more 70-year-old mothers because we are learning how to regulate the aging process better,” says Andrew Hessel, a microbiologist and geneticist, who cowrote "The Genesis Machine," a book about “rewriting life in the age of synthetic biology,” with Amy Webb, the futurist who recently wondered why 70-year-old women shouldn’t give birth.
Technically, we're already doing this, says Hessel, pointing to a technique known as in vitro gametogenesis (IVG): turning adult cells into sperm or egg cells. “You can think of it as the upgrade to IVF,” Hessel says. These vanguard stem-cell technologies can turn even skin cells into induced pluripotent stem cells (iPSCs), essentially master cells capable of maturing into any human cell, be it a kidney cell, liver cell, brain cell, or gamete, aka egg and sperm, says Henry T. “Hank” Greely, a Stanford law professor who specializes in the ethical, legal, and social issues of the biosciences.
In 2016, Greely wrote "The End of Sex," a book in which he described in detail the science of making gametes out of iPSCs. Science, Greely says, will indeed let 70-year-old new mums mix with mothers several decades younger at kindergartens in the not-too-distant future. And it won't be that big of a deal.
“An awful lot of children all around the world have been raised by grandmothers for millennia. To have 70-year-olds and 30-year-olds mingling in maternal roles is not new,” he says. That said, he doubts that many women will want to have a baby in the eighth decade of their life, even if science allows it. “Having a baby and raising a child is hard work. Even if 1% of all mothers are over 65, they aren’t going to change the world,” Greely says. Mothers over 70 will be a minor blip, statistically speaking, he predicts. But one thing is certain: the technology is here.
And more technologies for the same purpose could be on the way. In March 2021, researchers from Monash University in Melbourne, Australia, published research in Nature showing they had reprogrammed skin cells into a three-dimensional cellular structure that was morphologically and molecularly similar to a human embryo: the iBlastoid. In compliance with Australian law and international guidelines that reference the "primitive streak rule," which bans the use of embryos older than 14 days in scientific research, the Monash scientists stopped growing their iBlastoids in vitro on day 11.
“The research was both cutting-edge and controversial, because it essentially created a new human life, not for the purpose of a patient who's wanting to conceive, but for basic research,” says Lindsay Wu, a senior lecturer in the School of Medical Sciences at the University of New South Wales (UNSW) in Kensington, Australia. If you really want to make sure what you are breeding is an embryo, you need to let it develop into a viable baby. “This is the real proof in the pudding,” says Wu, who runs UNSW's Laboratory for Ageing Research. But then you reach a stage where, for ethical reasons, you have to abort it. “Fiddling here a bit too much?” he asks. Wu believes there are other, less morally troubling approaches to tackling the fertility decline that comes with age.
He is actually working on them. Why is it that women in their mid- to late thirties, who are at peak physical health in almost every other regard, have problems conceiving? That is the question Wu and his team asked in a research paper published in 2020 in Cell Reports. The simple answer is the egg cell. An average girl entering puberty has between 300,000 and 400,000 eggs; by around age 37, the same woman has only about 25,000 left. Things only go downhill from there. So, what torments the egg cells?
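Before getting to the answer, it helps to appreciate how steep that decline is. Here is a back-of-the-envelope Python sketch fitting a simple exponential decay through the two figures quoted above; the ages and egg counts come from the article, while the exponential model itself is purely an illustrative assumption, not reproductive biology.

```python
import math

# Figures quoted above: roughly 350,000 eggs at puberty (~age 12) and
# about 25,000 left by age 37. Fitting N(t) = N0 * exp(-k * (t - 12))
# through those two points is a toy assumption, not a biological model.
AGE_PUBERTY, EGGS_PUBERTY = 12, 350_000
AGE_LATER, EGGS_LATER = 37, 25_000

# Solve for the decay rate k that connects the two quoted data points.
k = math.log(EGGS_PUBERTY / EGGS_LATER) / (AGE_LATER - AGE_PUBERTY)

def eggs_remaining(age: float) -> float:
    """Estimated ovarian reserve at a given age under the toy fit."""
    return EGGS_PUBERTY * math.exp(-k * (age - AGE_PUBERTY))

print(f"decay rate: {k:.3f}/year, half-life ~{math.log(2) / k:.1f} years")
for age in (25, 30, 37, 42, 45):
    print(f"age {age}: ~{eggs_remaining(age):,.0f} eggs left")
```

Under this crude fit, the reserve halves roughly every six and a half years, which is one way to see why the late thirties can feel like a cliff.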
The UNSW team found that the levels of key molecules called NAD+ precursors, which are essential to the metabolism and genome stability of egg cells, decline with age. The team proceeded to add these vitamin-like substances back into the drinking water of reproductively aged, infertile lab mice, which then had babies.
“It's an important proof of concept,” says Wu. He is investigating how safe it is to replicate the experiment with humans in two ongoing studies. The ultimate goal is to restore the quality of egg cells that are left in patients in their late 30s and early- to mid-40s, says Wu. He sees the goal of getting pregnant for this age group as less ethically troubling, compared to 70-year-olds.
But what is ethical, anyway? "It is a tricky word," says Hessel. He differentiates between ethics, which represent a personal position and may thus be more transient, and morality, the longer-lasting principles embraced across a society, such as "Thou shalt not kill." Unprecedented advances often provoke fear and antagonism until time passes and they just become…ordinary. When IVF pioneer Landrum Shettles tried to perform IVF in 1973, the chairman of Columbia's College of Physicians and Surgeons halted the procedure at the last moment. Today, almost every country in the world has IVF clinics, and the global IVF services market is clearly a growth industry.
Besides, you don’t have a baby at 70 by accident: you really want it, Greely and Hessel agree. And by that age, mothers may be wiser and more financially secure, Hessel says (though he is quick to add that even the pregnancy of his own wife, who had her child at 40, was a high-risk one).
As a research question, figuring out whether older mothers are better than younger ones, or vice versa, entails too many confounding variables, says Greely. And why should we focus on who's the better mother anyway? "We've had 70-year-old and 80-year-old fathers forever; why should people have that much trouble getting used to mothers doing the same?" Greely wonders. For some women, having a child at an old(er) age would be comforting; maybe that's what matters.
And the technology to enable older women to have children is already here, or coming very soon. That, perhaps, matters even more. Researchers have already created mice from eggs grown entirely in the lab, and those mice have gone on to have offspring of their own. "Doing this to produce human eggs is similar," says Hessel. "It is harder to collect tissues, and the inducing cocktails are different, but steady advances are being made." He predicts that the demand for fertility treatments will keep financing research and development in the area, and that big leaps will be made if ethical concerns don't block them: it is not far-fetched to believe that the first baby produced from lab-grown eggs will be born within the next decade.
In a 2020 op-ed in Stat, Greely argued that we've already overcome the technical barriers to human cloning, but no one's really talking about it. Likewise, scientists are working on enabling 70-year-old women to have babies, says Hessel, yet most commentators are keeping really quiet about it. At least so far.
Your Future Smartphone May Detect Problems in Your Water
In 2014, the city of Flint, Michigan switched its residents' water supply to the Flint River to cut costs. However, the water was not properly treated, and lead contaminated it; according to the Associated Press, many of the city's residents soon reported health issues like hair loss and rashes. In 2015, a report found that children there had high levels of lead in their blood. The Natural Resources Defense Council recently estimated there could still be as many as twelve million lead pipes carrying water to homes across the U.S.
What if Flint residents and others in afflicted areas could simply flick water onto their phone screens and an app would tell them if they were about to drink contaminated water? This is what researchers at the University of Cambridge are working on to prevent catastrophes like what occurred in Flint, and to prepare for an uncertain future of scarcer resources.
Underneath the tough glass of our phone screens lies a transparent layer of electrodes. Because our bodies hold an electric charge, when a finger touches the screen, it disrupts the electric field created among the electrodes. This is how the screen senses where a touch occurs. Cambridge scientists used this same idea to explore whether the screen could detect charges in water, too. Metals like arsenic and lead can appear in water in the form of ions, which are charged particles. When an ionic solution is placed on the screen's surface, the electrodes sense that charge much as they sense our finger.
The experiment measured charges in various electrolyte solutions on a touchscreen. The researchers found that a thin polymer layer between the electrodes and the sample solution helped pick up the charges.
"How can we get really close to the touch electrodes, and be better than a phone screen?" Horstmann, the lead scientist on the study, asked himself while designing the protective coating. "We found that when we put electrolytes directly on the electrodes, they were too close, even short-circuiting," he said. When they placed the polymer layer on top the electrodes, however, this short-circuiting did not occur. Horstmann speaks of the polymer layer as one of the key findings of the paper, as it allowed for optimum conductivity. The coating they designed was much thinner than what you'd see with a typical smartphone touchscreen, but because it's already so similar, he feels optimistic about the technology's practical applications in the real world.
While the Cambridge scientists were using touchscreens to measure water contamination, Dr. Baojun Wang, a synthetic biologist at the University of Edinburgh, along with his team, created a way to measure arsenic contamination in Bangladesh groundwater samples using what is called a cell-based biosensor. These biosensors use cornerstones of cellular activity like transcription and promoter sequences to detect the presence of metal ions in water. A promoter can be thought of as a "flag" that tells certain molecules where to begin copying genetic code. By hijacking this aspect of the cell's machinery and increasing the cell's sensing and signal processing ability, they were able to amplify the signal to detect tiny amounts of arsenic in the groundwater samples. All this was conducted in a 384-well plate, each well smaller than a pencil eraser.
They placed arsenic sensors with different sensitivities across part of the plate so that the readout resembled a volume bar of increasing arsenic levels, similar to the displays on a Fitbit or glucose monitor. The whole device is about the size of an iPhone and can be scaled down much further.
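The logic of that volume-bar readout is simple enough to sketch in code. In this hypothetical Python illustration the detection thresholds are invented rather than taken from Dr. Wang's device; the point is only to show how a bank of sensors with staggered sensitivities brackets a concentration.

```python
# A hypothetical sketch of the "volume bar" readout described above; the
# thresholds are invented, not taken from Dr. Wang's device. Each sensor
# strain fires (fluoresces) once arsenic exceeds its own sensitivity, so
# the highest threshold that fired brackets the sample's concentration.

# Invented detection thresholds in ppb, from most to least sensitive.
SENSOR_THRESHOLDS_PPB = [1, 5, 10, 25, 50]

def read_volume_bar(fired: list[bool]) -> str:
    """Translate which sensor wells fired into a concentration band."""
    lit = [t for t, f in zip(SENSOR_THRESHOLDS_PPB, fired) if f]
    if not lit:
        return "below 1 ppb"
    top = max(lit)
    i = SENSOR_THRESHOLDS_PPB.index(top)
    if i + 1 < len(SENSOR_THRESHOLDS_PPB):
        return f"between {top} and {SENSOR_THRESHOLDS_PPB[i + 1]} ppb"
    return f"above {top} ppb"

# The three most sensitive sensors fired, the two least sensitive did not,
# so the sample sits somewhere between 10 and 25 ppb of arsenic.
print(read_volume_bar([True, True, True, False, False]))
```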
Dr. Wang says cell-based biosensors are bringing sensing technology closer to field applications, because their machinery uses inherent cellular activity. This makes them ideal for low-resource communities, and he expects his device to be affordable, portable, and easily stored for widespread use in households.
"It hasn't worked on actual phones yet, but I don't see any reason why it can't be an app," says Horstmann of their technology. Imagine a new generation of smartphones with a designated area of the screen responsible for detecting contamination—this is one of the possible futures the researchers propose. But industry collaborations will be crucial to making their advancements practical. The scientists anticipate that without collaborative efforts from the business sector, the public might have to wait ten years until this becomes something all our smartphones are capable of—but with the right partners, "it could go really quickly," says Dr. Elizabeth Hall, one of the authors on the touchscreen water contamination study.
"That's where the science ends and the business begins," Dr. Hall says. "There is a lot of interest coming through as a result of this paper. I think the people who make the investments and decisions are seeing that there might be something useful here."
As for Flint, according to The Detroit News, the city has entered the final stages of removing its lead pipe infrastructure. It's hard not to wonder how much better many residents might have fared if they'd had the technology that scientists are now creating.
For all its tragedy, COVID-19 has increased demand for at-home testing, and that demand has carried over to non-COVID-related devices. Various testing efforts are now in the public eye.
"I like that the public is watching these directions," says Horstmann. "I think there's a long way to go still, but it's exciting."
Fungus is the ‘New Black’ in Eco-Friendly Fashion
A natural material that looks and feels like real leather is taking the fashion world by storm. Scientists view mycelium—the vegetative part of a mushroom-producing fungus—as a planet-friendly alternative to animal hides and plastics.
Products crafted from this vegan leather are emerging, with others poised to hit the market soon. Among them are the Hermès Victoria bag, Lululemon's yoga accessories, Adidas' Stan Smith Mylo sneaker, and a Stella McCartney apparel collection.
The Adidas Stan Smith Mylo concept sneaker, made in partnership with Bolt Threads, uses an alternative leather grown from mycelium; a commercial version is expected in the near future.
Hermès has held presales on the new bag, says Philip Ross, co-founder and chief technology officer of MycoWorks, the San Francisco Bay Area firm whose materials were used in the design. By year-end, Ross expects several more clients to debut mycelium-based merchandise. With "comparable qualities to luxury leather," mycelium can be molded to serve "all the different verticals within fashion," he says, particularly footwear and accessories.
More than a half-dozen trailblazers are fine-tuning mycelium to create next-generation leather materials, according to the Material Innovation Initiative, a nonprofit advocating for animal-free materials in the fashion, automotive, and home-goods industries. These high-performance products can supersede items derived from leather, silk, down, fur, wool, and exotic skins, says A. Sydney Gladman, the initiative's chief scientific officer.
That's only the beginning of mycelium's untapped prowess. "We expect to see an uptick in commercial leather alternative applications for mycelium-based materials as companies refine their R&D [research and development] and scale up," Gladman says, adding that "technological innovation and untapped natural materials have the potential to transform the materials industry and solve the enormous environmental challenges it faces."
Mycelium can also shrink our carbon footprint: it flourishes in indoor farms, uses agricultural waste as feedstock, and generates inherently low emissions of greenhouse gases, chief among them carbon dioxide. "We often think that when plant tissues like wood rot, that they go from something to nothing," says Jonathan Schilling, professor of plant and microbial biology at the University of Minnesota and a member of MycoWorks' Scientific Advisory Board.
But that assumption doesn't hold true for all carbon in plant tissues. When the fungi dominating the decomposition of plants fulfill their function, they transform a large portion of carbon into fungal biomass, Schilling says. That, in turn, ends up in the soil, with mycelium forming a network underneath that traps the carbon.
Producing styrofoam, leather, and plastic requires large amounts of fossil fuels; creating similar materials with a fungal organism involves far less fuel-intensive processing. While some fungi consist of a single cell, others are multicellular and grow as very fine, threadlike structures. A mass of these threads collectively forms a "mycelium" that can be either loose and low-density or tightly packed and high-density. "When these fungi grow at extremely high density," Schilling explains, "they can take on the feel of a solid material such as styrofoam, leather or even plastic."
Tunable and supple in the cultivation process, mycelium is also reliably sturdy in composition. "We believe that mycelium has some unique attributes that differentiate it from plastic-based and animal-derived products," says Gavin McIntyre, who co-founded Ecovative Design, an upstate New York-based biomaterials company, in 2007 with the goal of displacing some environmentally burdensome materials and making "a meaningful impact on our planet."
After inventing a type of mushroom-based packaging for all sorts of goods, in 2013 the firm ventured into manufacturing mycelium that can be adapted for textiles, he says, because mushrooms are "nature's recycling system."
The company aims for its material—which is "so tough and tenacious" that it doesn't require any plastic add-on as reinforcement—to be generally accessible from a pricing standpoint and not confined to a luxury space. The cost, McIntyre says, would approach that of bovine leather, not the more upscale varieties of lamb and goat skins.
Already, production has taken off by leaps and bounds. In fewer than 10 days in indoor agricultural farms, "we grow large slabs of mycelium that are many feet wide and long," he says. "We are not confined to the shape or geometry of an animal," so there's a much lower scrap rate.
Decreasing the scrap rate is a major selling point. "Our customers can order the pieces to the way that they want them, and there is almost no waste in the processing," explains Ross of MycoWorks. "We can make ours thinner or thicker," depending on a client's specific needs. Growing materials locally also results in a reduction in transportation, shipping, and other supply chain costs, he says.
Yet another advantage to making things out of mycelium is its biodegradability at the end of an item's lifecycle. When a pair of old sneakers lands in a compost pile or landfill, it decomposes thanks to microbial processes that, once again, involve fungi. "It is cool to think that the same organism used to create a product can also be what recycles it, perhaps building something else useful in the same act," says biologist Schilling. That amounts to "more than a nice business model—it is a window into how sustainability works in nature."
A product can be called "sustainable" if it's biodegradable, leaves a minimal carbon footprint during production, and is also profitable, says Preeti Arya, an assistant professor at the Fashion Institute of Technology in New York City and faculty adviser to a student club of the American Association of Textile Chemists and Colorists.
On the opposite end of the spectrum, products composed of petroleum-based polymers don't biodegrade—they break down into smaller pieces or even particles. These remnants pollute landfills, oceans, and rivers, contaminating edible fish and eventually contributing to the growth of benign and cancerous tumors in humans, Arya says.
Commending the steps a few designers have taken toward bringing more environmentally conscious merchandise to consumers, she says, "I'm glad that they took the initiative because others also will try to be part of this competition toward sustainability." And consumers will take notice. "The more people become aware, the more these brands will start acting on it."
A further shift toward mycelium-based products has the capability to reap tremendous environmental dividends, says Drew Endy, associate chair of bioengineering at Stanford University and president of the BioBricks Foundation, which focuses on biotechnology in the public interest.
The continued development of "leather surrogates on a scaled and sustainable basis will provide the greatest benefit to the greatest number of people, in perpetuity," Endy says. "Transitioning the production of leather goods from a process that involves the industrial-scale slaughter of vertebrate mammals to a process that instead uses renewable fungal-based manufacturing will be more just."