Genetically Sequencing Healthy Babies Yielded Surprising Results
Today in Melrose, Massachusetts, Cora Stetson is the picture of good health, a bubbly precocious 2-year-old. But Cora has two separate mutations in the gene that produces a critical enzyme called biotinidase and her body produces only 40 percent of the normal levels of that enzyme.
That's enough to pass conventional newborn (heelstick) screening, but may not be enough for normal brain development, putting baby Cora at risk for seizures and cognitive impairment. But thanks to an experimental study in which Cora's DNA was sequenced after birth, this condition was discovered and she is being treated with a safe and inexpensive vitamin supplement.
Stories like these are beginning to emerge from the BabySeq Project, the first clinical trial in the world to systematically sequence healthy newborn infants. This trial was led by my research group with funding from the National Institutes of Health. While still controversial, it is pointing the way to a future in which adults, or even newborns, can receive comprehensive genetic analysis in order to determine their risk of future disease and enable opportunities to prevent them.
Some believe that medicine is still not ready for genomic population screening, but others feel it is long overdue. After all, the Human Genome Project was completed in 2003, and with this milestone it became feasible to sequence and interpret the genome of any human being. The costs have come down dramatically since then; an entire human genome can now be sequenced for about $800, although bioinformatic and medical interpretation can add another $200 to $2,000, depending upon the number of genes interrogated and the sophistication of the interpretive effort.
Two-year-old Cora Stetson, whose DNA sequencing after birth identified a potentially dangerous genetic mutation in time for her to receive preventive treatment.
(Photo courtesy of Robert Green)
The ability to sequence the human genome yielded extraordinary benefits in scientific discovery, disease diagnosis, and targeted cancer treatment. But the ability of genomes to detect health risks in advance, to actually predict the medical future of an individual, has been mired in controversy and slow to manifest. In particular, the oft-cited vision that healthy infants could be genetically tested at birth in order to predict and prevent the diseases they would encounter has proven far tougher to implement than anyone anticipated.
But in the last few years, the dream of predicting and preventing diseases through genomics, starting in childhood, is finally within reach. Why did it take so long? And what remains to be done?
Great Expectations
Part of the problem was the unrealistic expectations that had been building for years in advance of the genomic science itself. For example, the 1997 film Gattaca portrayed a near future in which the lifetime risk of disease was readily predicted the moment an infant was born. In the fanfare that accompanied the completion of the Human Genome Project, the notion of predicting and preventing future disease in an individual became a powerful meme that was used to inspire investment and public support for genomic research long before the tools were in place to make it happen.
Another part of the problem was the success of state-mandated newborn screening programs that began in the 1960s with biochemical "heel-stick" tests for babies with metabolic disorders. These programs have worked beautifully, costing only a few dollars per baby and saving thousands of infants from death and severe cognitive impairment. It seemed only logical that a new technology like genome sequencing would add power and promise to such programs. But instead of embracing the notion of newborn sequencing, newborn screening laboratories have thus far rejected the entire idea as too expensive, too ambiguous, and too threatening to the comfortable constituency that they had built within the public health framework.
Creating the Evidence Base for Preventive Genomics
Despite a number of obstacles, there are researchers who are exploring how to achieve the original vision of genomic testing as a tool for disease prediction and prevention. For example, in our NIH-funded MedSeq Project, we were the first to ask the question: "What can you find when you look as deeply as possible into the medical genomes of healthy individuals?"
Most people do not understand that genetic information comes in four separate categories: (1) dominant mutations putting the individual at risk for rare conditions like familial forms of heart disease or cancer, (2) recessive mutations putting the individual's children at risk for rare conditions like cystic fibrosis or PKU, (3) variants across the genome that can be tallied to construct polygenic risk scores for common conditions like heart disease or type 2 diabetes, and (4) variants that can influence drug metabolism or predict drug side effects such as the muscle pain that occasionally occurs with statin use.
The technological and analytical challenges of our study were formidable, because we decided to systematically interrogate over 5000 disease-associated genes and report results in all four categories of genetic information directly to the primary care physicians for each of our volunteers. We enrolled 200 adults and found that everyone who was sequenced had medically relevant polygenic and pharmacogenomic results, over 90 percent carried recessive mutations that could have been important to reproduction, and an extraordinary 14.5 percent carried dominant mutations for rare genetic conditions.
A few years later we launched the BabySeq Project. In this study, we restricted the number of genes to include only those with child/adolescent onset that could benefit medically from early warning, and even so, we found 9.4 percent carried dominant mutations for rare conditions.
At first, our interpretation of the high proportion of apparently healthy individuals with dominant mutations for rare genetic conditions was simple: these conditions had lower "penetrance" than anticipated; in other words, only a small proportion of those who carried the dominant mutation would get the disease. If this interpretation were to hold, then genetic risk information might be far less useful than we had hoped.
But then we circled back with each adult or infant to examine and test them for any possible features of the rare disease in question. When we did this, we were surprised to find that over a quarter of those carrying such mutations already showed subtle signs of the disease that had not even been suspected. Now our interpretation was different: we believe that genetic risk may be responsible for subclinical disease in a much higher proportion of people than anyone had suspected.
Meanwhile, colleagues of ours have been demonstrating that detailed analysis of polygenic risk scores can identify individuals at high risk for common conditions like heart disease. So adding up the medically relevant results in any given genome, we start to see that you can learn your risks for a rare monogenic condition, a common polygenic condition, a bad effect from a drug you might take in the future, or for having a child with a devastating recessive condition. Suddenly the information available in the genome of even an apparently healthy individual is looking more robust, and the prospect of preventive genomics is looking feasible.
Preventive Genomics Arrives in Clinical Medicine
There is still considerable evidence to gather before we can recommend genomic screening for the entire population. For example, it is important to make sure that families who learn about such risks do not suffer harms or waste resources from excessive medical attention. And many doctors don't yet have guidance on how to use such information with their patients. But our research is convincing many people that preventive genomics is coming and that it will save lives.
In fact, we recently launched a Preventive Genomics Clinic at Brigham and Women's Hospital where information-seeking adults can obtain predictive genomic testing with the highest quality interpretation and medical context, and be coached over time in light of their disease risks toward a healthier outcome. Insurance doesn't yet cover such testing, so patients must pay out of pocket for now, but they can choose from a menu of genetic screening tests, all of which are more comprehensive than consumer-facing products. Genetic counseling is available but optional. So far, this service is for adults only, but sequencing for children will surely follow soon.
As the costs of sequencing and other Omics technologies continue to decline, we will see both responsible and irresponsible marketing of genetic testing, and we will need to guard against unscientific claims. But at the same time, we must be far more imaginative and fast moving in mainstream medicine than we have been to date in order to claim the emerging benefits of preventive genomics where it is now clear that suffering can be averted, and lives can be saved. The future has arrived if we are bold enough to grasp it.
Funding and Disclosures:
Dr. Green's research is supported by the National Institutes of Health, the Department of Defense and through donations to The Franca Sozzani Fund for Preventive Genomics. Dr. Green receives compensation for advising the following companies: AIA, Applied Therapeutics, Helix, Ohana, OptraHealth, Prudential, Verily and Veritas; and is co-founder and advisor to Genome Medical, Inc, a technology and services company providing genetics expertise to patients, providers, employers and care systems.
Your Future Smartphone May Detect Problems in Your Water
In 2014, the city of Flint, Michigan switched the residents' water supply to the Flint River, citing cheaper costs. However, due to improper treatment, lead contaminated this water, and according to the Associated Press, many of the city's residents soon reported health issues like hair loss and rashes. In 2015, a report found that children there had high levels of lead in their blood. The Natural Resources Defense Council recently estimated there could still be as many as twelve million lead pipes carrying water to homes across the U.S.
What if Flint residents and others in afflicted areas could simply flick water onto their phone screens and an app would tell them if they were about to drink contaminated water? This is what researchers at the University of Cambridge are working on to prevent catastrophes like what occurred in Flint, and to prepare for an uncertain future of scarcer resources.
Underneath the tough glass of our phone screen lies a transparent layer of electrodes. Because our bodies hold an electric charge, when our finger touches the screen, it disrupts the electric field created among the electrodes. This is how the screen can sense where a touch occurs. Cambridge scientists used this same idea to explore whether the screen could detect charges in water, too. Metals like arsenic and lead can appear in water in the form of ions, which are charged particles. When an ionic solution is placed on the screen's surface, the electrodes sense its charge much as they sense our finger.
The experiment measured charges in various electrolyte solutions on a touchscreen. The researchers found that a thin polymer layer between the electrodes and the sample solution helped pick up the charges.
"How can we get really close to the touch electrodes, and be better than a phone screen?" Horstmann, the lead scientist on the study, asked himself while designing the protective coating. "We found that when we put electrolytes directly on the electrodes, they were too close, even short-circuiting," he said. When they placed the polymer layer on top of the electrodes, however, this short-circuiting did not occur. Horstmann describes the polymer layer as one of the key findings of the paper, as it allowed for optimum conductivity. The coating they designed was much thinner than what you'd see on a typical smartphone touchscreen, but because it's already so similar, he feels optimistic about the technology's practical applications in the real world.
While the Cambridge scientists were using touchscreens to measure water contamination, Dr. Baojun Wang, a synthetic biologist at the University of Edinburgh, along with his team, created a way to measure arsenic contamination in Bangladesh groundwater samples using what is called a cell-based biosensor. These biosensors use cornerstones of cellular activity like transcription and promoter sequences to detect the presence of metal ions in water. A promoter can be thought of as a "flag" that tells certain molecules where to begin copying genetic code. By hijacking this aspect of the cell's machinery and increasing the cell's sensing and signal processing ability, they were able to amplify the signal to detect tiny amounts of arsenic in the groundwater samples. All this was conducted in a 384-well plate, each well smaller than a pencil eraser.
They placed arsenic sensors with different sensitivities across part of the plate so it resembled a volume bar of increasing levels of arsenic, similar to diagnostics on a Fitbit or glucose monitor. The whole device is about the size of an iPhone, and can be scaled down to a much smaller size.
Dr. Wang says cell-based biosensors are bringing sensing technology closer to field applications, because their machinery uses inherent cellular activity. This makes them ideal for low-resource communities, and he expects his device to be affordable, portable, and easily stored for widespread use in households.
"It hasn't worked on actual phones yet, but I don't see any reason why it can't be an app," says Horstmann of their technology. Imagine a new generation of smartphones with a designated area of the screen responsible for detecting contamination—this is one of the possible futures the researchers propose. But industry collaborations will be crucial to making their advancements practical. The scientists anticipate that without collaborative efforts from the business sector, the public might have to wait ten years until this becomes something all our smartphones are capable of—but with the right partners, "it could go really quickly," says Dr. Elizabeth Hall, one of the authors on the touchscreen water contamination study.
"That's where the science ends and the business begins," Dr. Hall says. "There is a lot of interest coming through as a result of this paper. I think the people who make the investments and decisions are seeing that there might be something useful here."
As for Flint, according to The Detroit News, the city has entered the final stages in removing lead pipe infrastructure. It's difficult to imagine how many residents might fare better today if they'd had the technology that scientists are now creating.
For all its tragedy, COVID-19 has increased demand for at-home testing methods, and that demand has carried over to non-COVID-19-related devices. Various testing efforts are now in the public eye.
"I like that the public is watching these directions," says Horstmann. "I think there's a long way to go still, but it's exciting."
Fungus is the ‘New Black’ in Eco-Friendly Fashion
A natural material that looks and feels like real leather is taking the fashion world by storm. Scientists view mycelium—the vegetative part of a mushroom-producing fungus—as a planet-friendly alternative to animal hides and plastics.
Products crafted from this vegan leather are emerging, with others poised to hit the market soon. Among them are the Hermès Victoria bag, Lululemon's yoga accessories, Adidas' Stan Smith Mylo sneaker, and a Stella McCartney apparel collection.
The Adidas' Stan Smith Mylo concept sneaker, made in partnership with Bolt Threads, uses an alternative leather grown from mycelium; a commercial version is expected in the near future.
Adidas
Hermès has held presales on the new bag, says Philip Ross, co-founder and chief technology officer of MycoWorks, a San Francisco Bay area firm whose materials were used in the design. By year-end, Ross expects several more clients to debut mycelium-based merchandise. With "comparable qualities to luxury leather," mycelium can be molded to engineer "all the different verticals within fashion," he says, particularly footwear and accessories.
More than a half-dozen trailblazers are fine-tuning mycelium to create next-generation leather materials, according to the Material Innovation Initiative, a nonprofit advocating for animal-free materials in the fashion, automotive, and home-goods industries. These high-performance products can supersede items derived from leather, silk, down, fur, wool, and exotic skins, says A. Sydney Gladman, the initiative's chief scientific officer.
That's only the beginning of mycelium's untapped prowess. "We expect to see an uptick in commercial leather alternative applications for mycelium-based materials as companies refine their R&D [research and development] and scale up," Gladman says, adding that "technological innovation and untapped natural materials have the potential to transform the materials industry and solve the enormous environmental challenges it faces."
Reducing our carbon footprint becomes possible because mycelium can flourish in indoor farms, using agricultural waste as feedstock and emitting inherently low greenhouse gas emissions. Carbon dioxide is the primary greenhouse gas. "We often think that when plant tissues like wood rot, that they go from something to nothing," says Jonathan Schilling, professor of plant and microbial biology at the University of Minnesota and a member of MycoWorks' Scientific Advisory Board.
But that assumption doesn't hold true for all carbon in plant tissues. When the fungi dominating the decomposition of plants fulfill their function, they transform a large portion of carbon into fungal biomass, Schilling says. That, in turn, ends up in the soil, with mycelium forming a network underneath that traps the carbon.
Producing styrofoam, leather, and plastic requires large amounts of fossil fuels; creating similar materials with a fungal organism involves far less fuel-intensive processing. While some fungi consist of a single cell, others are multicellular and develop as very fine threadlike structures. A mass of them collectively forms a "mycelium" that can be either loose and low density or tightly packed and high density. "When these fungi grow at extremely high density," Schilling explains, "they can take on the feel of a solid material such as styrofoam, leather or even plastic."
Tunable and supple in the cultivation process, mycelium is also reliably sturdy in composition. "We believe that mycelium has some unique attributes that differentiate it from plastic-based and animal-derived products," says Gavin McIntyre, who co-founded Ecovative Design, an upstate New York-based biomaterials company, in 2007 with the goal of displacing some environmentally burdensome materials and making "a meaningful impact on our planet."
After inventing a type of mushroom-based packaging for all sorts of goods, in 2013 the firm ventured into manufacturing mycelium that can be adapted for textiles, he says, because mushrooms are "nature's recycling system."
The company aims for its material—which is "so tough and tenacious" that it doesn't require any plastic add-on as reinforcement—to be generally accessible from a pricing standpoint and not confined to a luxury space. The cost, McIntyre says, would approach that of bovine leather, not the more upscale varieties of lamb and goat skins.
Already, production has taken off by leaps and bounds. In fewer than 10 days in indoor agricultural farms, "we grow large slabs of mycelium that are many feet wide and long," he says. "We are not confined to the shape or geometry of an animal," so there's a much lower scrap rate.
Decreasing the scrap rate is a major selling point. "Our customers can order the pieces to the way that they want them, and there is almost no waste in the processing," explains Ross of MycoWorks. "We can make ours thinner or thicker," depending on a client's specific needs. Growing materials locally also results in a reduction in transportation, shipping, and other supply chain costs, he says.
Yet another advantage to making things out of mycelium is its biodegradability at the end of an item's lifecycle. When a pair of old sneakers lands in a compost pile or landfill, it decomposes thanks to microbial processes that, once again, involve fungi. "It is cool to think that the same organism used to create a product can also be what recycles it, perhaps building something else useful in the same act," says biologist Schilling. That amounts to "more than a nice business model—it is a window into how sustainability works in nature."
A product can be called "sustainable" if it's biodegradable, leaves a minimal carbon footprint during production, and is also profitable, says Preeti Arya, an assistant professor at the Fashion Institute of Technology in New York City and faculty adviser to a student club of the American Association of Textile Chemists and Colorists.
On the opposite end of the spectrum, products composed of petroleum-based polymers don't biodegrade—they break down into smaller pieces or even particles. These remnants pollute landfills, oceans, and rivers, contaminating edible fish and eventually contributing to the growth of benign and cancerous tumors in humans, Arya says.
Commending the steps a few designers have taken toward bringing more environmentally conscious merchandise to consumers, she says, "I'm glad that they took the initiative because others also will try to be part of this competition toward sustainability." And consumers will take notice. "The more people become aware, the more these brands will start acting on it."
A further shift toward mycelium-based products has the capability to reap tremendous environmental dividends, says Drew Endy, associate chair of bioengineering at Stanford University and president of the BioBricks Foundation, which focuses on biotechnology in the public interest.
The continued development of "leather surrogates on a scaled and sustainable basis will provide the greatest benefit to the greatest number of people, in perpetuity," Endy says. "Transitioning the production of leather goods from a process that involves the industrial-scale slaughter of vertebrate mammals to a process that instead uses renewable fungal-based manufacturing will be more just."