Genetically Sequencing Healthy Babies Yielded Surprising Results
Today in Melrose, Massachusetts, Cora Stetson is the picture of good health, a bubbly, precocious 2-year-old. But Cora carries two separate mutations in the gene that produces a critical enzyme called biotinidase, and her body makes only 40 percent of the normal level of that enzyme.
That's enough to pass conventional newborn (heel-stick) screening, but it may not be enough for normal brain development, putting baby Cora at risk for seizures and cognitive impairment. Thanks to an experimental study in which Cora's DNA was sequenced after birth, however, the condition was discovered, and she is being treated with a safe and inexpensive vitamin supplement.
Stories like these are beginning to emerge from the BabySeq Project, the first clinical trial in the world to systematically sequence healthy newborn infants. This trial was led by my research group with funding from the National Institutes of Health. While still controversial, it is pointing the way to a future in which adults, or even newborns, can receive comprehensive genetic analysis in order to determine their risk of future diseases and open up opportunities to prevent them.
Some believe that medicine is still not ready for genomic population screening, but others feel it is long overdue. After all, the sequencing of the Human Genome Project was completed in 2003, and with this milestone, it became feasible to sequence and interpret the genome of any human being. The costs have come down dramatically since then; an entire human genome can now be sequenced for about $800, although bioinformatic and medical interpretation can add another $200 to $2,000, depending upon the number of genes interrogated and the sophistication of the interpretive effort.
Two-year-old Cora Stetson, whose DNA sequencing after birth identified a potentially dangerous genetic mutation in time for her to receive preventive treatment.
(Photo courtesy of Robert Green)
The ability to sequence the human genome yielded extraordinary benefits in scientific discovery, disease diagnosis, and targeted cancer treatment. But the ability of genomes to detect health risks in advance, to actually predict the medical future of an individual, has been mired in controversy and slow to manifest. In particular, the oft-cited vision that healthy infants could be genetically tested at birth in order to predict and prevent the diseases they would encounter has proven to be far tougher to implement than anyone anticipated.
But in the last few years, the dream of predicting and preventing diseases through genomics, starting in childhood, is finally within reach. Why did it take so long? And what remains to be done?
Great Expectations
Part of the problem was the unrealistic expectations that had been building for years in advance of the genomic science itself. For example, the 1997 film Gattaca portrayed a near future in which the lifetime risk of disease was readily predicted the moment an infant was born. In the fanfare that accompanied the completion of the Human Genome Project, the notion of predicting and preventing future disease in an individual became a powerful meme that was used to inspire investment and public support for genomic research long before the tools were in place to make it happen.
Another part of the problem was the success of state-mandated newborn screening programs that began in the 1960s with biochemical "heel-stick" tests for babies with metabolic disorders. These programs have worked beautifully, costing only a few dollars per baby and saving thousands of infants from death and severe cognitive impairment. It seemed only logical that a new technology like genome sequencing would add power and promise to such programs. But instead of embracing the notion of newborn sequencing, newborn screening laboratories have thus far rejected the entire idea as too expensive, too ambiguous, and too threatening to the comfortable constituency that they had built within the public health framework.
"What can you find when you look as deeply as possible into the medical genomes of healthy individuals?"
Creating the Evidence Base for Preventive Genomics
Despite a number of obstacles, there are researchers who are exploring how to achieve the original vision of genomic testing as a tool for disease prediction and prevention. For example, in our NIH-funded MedSeq Project, we were the first to ask the question: "What can you find when you look as deeply as possible into the medical genomes of healthy individuals?"
Most people do not understand that genetic information comes in four separate categories: (1) dominant mutations putting the individual at risk for rare conditions like familial forms of heart disease or cancer, (2) recessive mutations putting the individual's children at risk for rare conditions like cystic fibrosis or PKU, (3) variants across the genome that can be tallied to construct polygenic risk scores for common conditions like heart disease or type 2 diabetes, and (4) variants that can influence drug metabolism or predict drug side effects such as the muscle pain that occasionally occurs with statin use.
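The polygenic risk scores in category (3) are, at their core, simple weighted sums: each risk variant a person carries contributes its estimated effect size, multiplied by how many copies of the risk allele they have. The sketch below is purely illustrative; the variant IDs and weights are made up, and real scores combine thousands to millions of variants with weights estimated from genome-wide association studies.

```python
# Illustrative toy example of tallying a polygenic risk score (PRS).
# All variant IDs and effect sizes here are invented for demonstration.

# Per-variant weights (e.g., log-odds of disease per risk allele)
effect_sizes = {"rs0001": 0.12, "rs0002": -0.05, "rs0003": 0.30}

# Copies of the risk allele (0, 1, or 2) this person carries
allele_counts = {"rs0001": 2, "rs0002": 1, "rs0003": 0}

def polygenic_risk_score(effects, counts):
    """Weighted sum of risk-allele counts across all scored variants."""
    return sum(effects[v] * counts.get(v, 0) for v in effects)

score = polygenic_risk_score(effect_sizes, allele_counts)
print(round(score, 2))  # 0.12*2 + (-0.05)*1 + 0.30*0 = 0.19
```

In practice, an individual's raw score is then compared against the score distribution in a reference population, so that "high risk" means something like "top few percent of scores," rather than any absolute number.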
The technological and analytical challenges of our study were formidable, because we decided to systematically interrogate over 5000 disease-associated genes and report results in all four categories of genetic information directly to the primary care physicians for each of our volunteers. We enrolled 200 adults and found that everyone who was sequenced had medically relevant polygenic and pharmacogenomic results, over 90 percent carried recessive mutations that could have been important to reproduction, and an extraordinary 14.5 percent carried dominant mutations for rare genetic conditions.
A few years later we launched the BabySeq Project. In this study, we restricted the number of genes to include only those with child/adolescent onset that could benefit medically from early warning, and even so, we found 9.4 percent carried dominant mutations for rare conditions.
At first, our interpretation of the high proportion of apparently healthy individuals with dominant mutations for rare genetic conditions was simple: these conditions had lower "penetrance" than anticipated; in other words, only a small proportion of those who carried the dominant mutation would get the disease. If this interpretation were to hold, then genetic risk information might be far less useful than we had hoped.
Suddenly the information available in the genome of even an apparently healthy individual is looking more robust, and the prospect of preventive genomics is looking feasible.
But then we circled back to each adult or infant in order to examine and test them for any possible features of the rare disease in question. When we did this, we were surprised to find that over a quarter of those carrying such mutations already showed subtle signs of the disease that no one had even suspected! Our interpretation changed: we now believe that genetic risk may be responsible for subclinical disease in a far higher proportion of people than anyone had imagined.
Meanwhile, colleagues of ours have been demonstrating that detailed analysis of polygenic risk scores can identify individuals at high risk for common conditions like heart disease. So adding up the medically relevant results in any given genome, we start to see that you can learn your risks for a rare monogenic condition, a common polygenic condition, a bad effect from a drug you might take in the future, or for having a child with a devastating recessive condition. Suddenly the information available in the genome of even an apparently healthy individual is looking more robust, and the prospect of preventive genomics is looking feasible.
Preventive Genomics Arrives in Clinical Medicine
There is still considerable evidence to gather before we can recommend genomic screening for the entire population. For example, it is important to make sure that families who learn about such risks do not suffer harms or waste resources from excessive medical attention. And many doctors don't yet have guidance on how to use such information with their patients. But our research is convincing many people that preventive genomics is coming and that it will save lives.
In fact, we recently launched a Preventive Genomics Clinic at Brigham and Women's Hospital where information-seeking adults can obtain predictive genomic testing with the highest quality interpretation and medical context, and be coached over time in light of their disease risks toward a healthier outcome. Insurance doesn't yet cover such testing, so patients must pay out of pocket for now, but they can choose from a menu of genetic screening tests, all of which are more comprehensive than consumer-facing products. Genetic counseling is available but optional. So far, this service is for adults only, but sequencing for children will surely follow soon.
As the costs of sequencing and other Omics technologies continue to decline, we will see both responsible and irresponsible marketing of genetic testing, and we will need to guard against unscientific claims. But at the same time, we must be far more imaginative and fast moving in mainstream medicine than we have been to date in order to claim the emerging benefits of preventive genomics where it is now clear that suffering can be averted, and lives can be saved. The future has arrived if we are bold enough to grasp it.
Funding and Disclosures:
Dr. Green's research is supported by the National Institutes of Health, the Department of Defense and through donations to The Franca Sozzani Fund for Preventive Genomics. Dr. Green receives compensation for advising the following companies: AIA, Applied Therapeutics, Helix, Ohana, OptraHealth, Prudential, Verily and Veritas; and is co-founder and advisor to Genome Medical, Inc, a technology and services company providing genetics expertise to patients, providers, employers and care systems.
Americans Fell for a Theranos-Style Scam 100 Years Ago. Will We Ever Learn?
The huckster understands what people want (an easy route to good health) and figures out just how to provide it, as long as no one asks too many questions.
"Americans are very much prone to this sort of thinking: Give me a pill or give me a magical bean that can make me lose weight!"
The keys to success: Hoopla, fancy technology, and gullibility. And oh yes, one more thing: a blood sample. Well, lots and lots of blood samples. Every testing fee counts.
Sound familiar? It could be the story of the preternaturally persuasive Elizabeth Holmes, the disgraced founder of Theranos who stands accused of perpetrating a massive blood-testing fraud. But this is a different story from a different time, one that dates back 100 years but sounds almost like it could unfold on the front page of The Wall Street Journal today.
The main difference: Back then, watchdogs thought they'd be able to vanquish fake medicine and scam science. Fat chance, it turned out. It seems we're more likely to lose-weight-quick than to make much of a dent in quackery and health fraud.
Why? Have we learned anything at all over the past century? As we sweep into a new decade, experts say we're not as advanced as we'd like to think. But the fight against fraud and fakery continues.
Quackery: As American As America Itself
In the 17th century, British healers of questionable reputation got a new name, "quack," from the Dutch word "quacksalver," which originally referred to someone who treats others with home remedies but developed a new meaning along the lines of "charlatan." And these quacks got a new place to sell their wares: the American colonies.
By 1692, a Boston newspaper advertised a patent medicine that promised to cure "the Griping of the Guts, and the Wind Cholick" and, for good measure, "preventeth that woeful Distemper of the Dry Belly Ach." A couple of centuries later, the most famous woman in the United States wasn't a first lady or feminist but a hawker of nostrums named Lydia Estes Pinkham, whose "vegetable compound" promised to banish "female complaints." One advertisement suggested that the "sure cure" would have saved the life of a Connecticut clergyman whose wife, after suffering from feminine maladies for 16 years, killed him.
By the early 20th century, Americans were fascinated by electricity and radiation, and both healers and hucksters embraced the new high-tech era. Men with flagging libidos, for example, could irradiate their private parts with the radioactive Radiendocrinator or buy battery-powered electric belts equipped with dangling bits to supercharge their, um, dangling bits.
The Rise of the Radio Wave 'Cure'
Enter radionics, the (supposed) science of better health via radio waves. The idea was that "healthy people radiate healthy energy," and sickness could be reversed through diagnosis and re-tuning, write Dr. Lydia Kang and Nate Pedersen in their 2017 book "Quackery: A Brief History of the Worst Ways to Cure Everything."
Detecting illness and fixing it required machinery (Dynamizers, Radioclasts, and Oscilloclasts) that could cost hundreds of dollars each. Thousands of physicians bought them. Conveniently, the devices could even work remotely, for a fee. The worried-and-potentially-unwell just needed to send a blood sample and, of course, a personal check.
Sting operations revealed radionics to be bogus. A skeptic sent a blood sample to one radionics practitioner in Albuquerque who reported back with news of an infected fallopian tube. In fact, the blood sample came from a male guinea pig. As an American Medical Association leader reported, the guinea pig "had shown no female characteristics up to that time, and a postmortem examination yielded no evidence of ladylike attributes."
When Quackery Refused to Yield
The rise of bogus medical technology in the early 20th century spawned a watchdog industry as organizations like the American Medical Association swept into action, said medical historian Eric Boyle, author of 2012's "Quack Medicine: A History of Combating Health Fraud in Twentieth-Century America."
"When quackery was recognized as a major problem, the people who campaigned for its demise were confident that they could get rid of it," he said. "A lot of people believed that increased education, the truths of science, and laws designed to protect consumers would ultimately drive quackery from the marketplace. And then throughout the century, as modern medicine developed, and more effectively treated one disease after another, many observers remained confident in that prediction."
There's a bid to "flood the information highway with truth to turn the storm of fake promotional stuff into a trickle."
But fake medicine persisted as Americans continued their quest to get-healthy-quick… or get-rich-quick by promising to help others get-healthy-quick. Even radionics refused to die. It's still around in various forms. And, as the Theranos scandal reveals, we're still hoping our blood can offer the keys to longevity and good health.
Why Do We Still Fall for Scams?
In our own era, the Theranos company rose to prominence when founder and CEO Elizabeth Holmes convinced journalists and investors that she'd found a way to cheaply test drops of blood for hundreds of conditions. Then it all fell apart, famously, when the world learned that the technology didn't work. The company has folded, and Holmes faces a federal trial on fraud charges this year.
"There were a lot of prominent, very smart people who bought into the myth of Elizabeth Holmes," a former employee told "60 Minutes," even though the blood tests never actually worked as advertised.
Shouldn't "prominent, very smart people" know better? "People are gullible," said Dr. Stephen Barrett, a psychiatrist and leading quack-buster who runs the QuackWatch website. But there's more to the story. According to him, each of us is vulnerable to bogus medicine in our own particular way.
Scam artists specifically pinpoint their target audiences, such as "smart people," desperate people and alienated people, he said.
Smart people, for example, might be overconfident about their ability to detect fraud and fall for bogus medicine. Alienated people may distrust the establishment, whether it's the medical field or government watchdogs, and be more receptive to alternative sources of information.
Dr. Barrett also points a finger at magical thinking, which comes in different forms. It could mean a New Age-style belief that our minds can control the world around us. Or, as professional quack-buster Alex Berezow said, it could refer to "our cultural obsession with quick fixes."
"Americans are very much prone to this sort of thinking: Give me a pill or give me a magical bean that can make me lose weight! But complex problems need complex solutions," said Berezow, a microbiologist who debunks junk science in his job as a spokesman for the American Council on Science & Health.
American mistrust of expertise makes matters worse, he said. "When I tell people they need to get vaccinated, I'm called a shill for the pharmaceutical industry," he said. "If I say dietary supplements generally don't work, I'm a shill for doctors who want to keep people sick."
What can ordinary citizens do to protect themselves from fake medicine? "You have to have a healthy skepticism of everything," Berezow said. "When you come across something new, is someone trying to take advantage of you? It's a horrible way to think about the world, but there's some truth to it."
"Like any chronic disease, we will have to live with it while we do our best to fight it."
The government and experts have their own roles to play via regulation and education, respectively. For all the criticism it gets, the Food & Drug Administration does serve as a bulwark against fakery in prescription medicine. And while celebrities like Gwyneth "Goop" Paltrow hawk countless questionable medical products on the Internet, scientists and physicians are fighting back by using social media as a tool to promote the truth. There's a bid to "flood the information highway with truth to turn the storm of fake promotional stuff into a trickle," said Dr. Randi Hutter Epstein, a writer in residence at Yale School of Medicine and author of 2018's "Aroused: The History of Hormones and How They Control Just About Everything."
What's next? Like death, taxes and Cher, charlatans are likely to always be with us. Boyle quoted the late William Jarvis, a pioneering quack-buster in the late 20th century who believed health fraud would never be eradicated: "Like any chronic disease, we will have to live with it while we do our best to fight it."
Five Memorable Animals Who Expanded the Scientific Frontier
Untold numbers of animals have contributed to science, in ways big and small. Studying cows and cowpox helped English doctor Edward Jenner create a smallpox vaccine; Ivan Pavlov's experiments on dogs' reactions to external stimuli heavily influenced modern behavioral psychology.
We have these five animals to thank for some of our most important scientific advancements, from space travel to better organ replacement options.
Scientists still work with rats, rabbits, and other mammals to test cosmetics and pharmaceuticals and to conduct infectious disease research. Most of these animals remain nameless and unknown to the public, but over the years, certain individuals have had an outsize effect. We have these five animals to thank for some of our most important scientific advancements, from space travel to better organ replacement options.
1) LAIKA THE DOG
Laika was the first living creature ever to orbit the Earth. In October 1957, the Soviet satellite Sputnik I made history as the first man-made object sent into Earth's orbit; Premier Nikita Khrushchev was keen to score another Space Race victory by sending up a canine cosmonaut.
Laika ("barker" in Russian) was a stray dog, reportedly a husky-spitz mix, recruited along with several other female strays for the trip. Although the scientists put extensive work into preparing Laika and the other canine finalists—evaluating their reactions to air-pressure variations, training them to adapt to pelvic sanitation devices meant to contain waste, and eventually having them live in pressurized capsules for weeks—there was no expectation that the dog would return to Earth, and only one meal's worth of food was sent up with her.
Laika the dog, with a mockup of her space capsule.
Sputnik II, six times heavier than its predecessor, launched on November 3, 1957. Soviet broadcasts reported that Laika, fitted out with surgically implanted devices to monitor her heart rate, blood pressure, and breathing rates, survived until November 12; the spacecraft stayed in orbit for five more months, burning up when it re-entered the atmosphere.
At the time, the Sputnik II team reassured the world that Laika had died painlessly of oxygen deprivation. It was only decades later, in the 1990s, that Oleg Gazenko—one of the scientists and dog trainers assigned to the mission—revealed that Laika had died 5 to 7 hours after launch from a combination of heat and stress. The capsule had overheated, probably as a result of the rushed preparation; after the fourth orbit, the temperature inside Sputnik was over 90 degrees Fahrenheit, and it's doubtful she could have survived much past that. "The more time passes, the more I'm sorry about it. We shouldn't have done it," Gazenko said. "We did not learn enough from the mission to justify the death of the dog."
Yet even the four or five orbits that Laika did complete were enough to spur scientists to press on in the effort to send a human into space.
2) HAM THE CHIMP
Four years after Laika's ill-fated flight, a chimpanzee named Ham entered suborbital flight in the American Project Mercury MR-2 mission on January 31, 1961, becoming the first hominid in space—and unlike Laika, he returned to Earth, alive, after a 16-minute flight.
Even though Ham's flight was not destined for orbit, the spacecraft and booster used on his trip were the same combination intended for the first (human) American's trip later that year. If he came back unharmed, NASA's medical team would be prepared to okay astronaut Alan Shepard's flight.
Ham receives his well-deserved apple.
For approximately 18 months before liftoff, Ham was trained to perform simple tasks, like pushing levers, in response to visual and auditory cues. (If he failed, he received an electric shock; correct performance earned him a treat. Pavlov would have been pleased.)
At 37 pounds, Ham was also the heaviest animal yet to make it to space. His vital signs and movements were monitored from Earth, and after a light electric shock from the ground team reminded him of his tasks, he performed his lever-pushing just a bit slower than he had on Earth, verifying that motion would not be seriously impaired in space.
Less than three months after Ham returned to Earth, on April 12, 1961, Soviet cosmonaut Yuri Gagarin became the first human to complete an orbital flight; Shepard was close behind, successfully crewing the MR-3 mission on May 5. For his part, Ham "retired" to the National Zoo in Washington, D.C. for 17 years, before being transferred to the North Carolina Zoological Park; he died of liver failure in 1983 at age 26. His grave is at the International Space Hall of Fame in New Mexico.
3) KOKO THE GORILLA
A western lowland gorilla born at the San Francisco Zoo, Hanabi-ko, or "Koko," became famous in the 1970s for her cognitive and communicative abilities. Psychologist Francine "Penny" Patterson, then a doctoral student at Stanford, chose Koko to work on a language research project, teaching her American Sign Language; by age four, Koko demonstrated the ability both to make up new words and to combine known words to express herself creatively, as opposed to simply mimicking her trainer.
Koko and Penny compare notes.
Koko's work with Patterson reflected levels of cognition that were higher than non-human primates had previously been thought to have; by the end of her life, her language skills were roughly equivalent to a young child's, with a vocabulary of around 1,000 signs and the ability to understand 2,000 words of spoken English.
An especially impactful study in 2012 showed that Koko had learned to play the recorder, revealing an ability for voluntary breath control that scientists had previously thought was linked closely to speech and could only be developed by humans. Barbara J. King, a biological anthropologist, suggested that Koko's immersion in a human environment may have helped her develop such a skill, and that it might be misleading to consider similar abilities "innate" or lacking in either humans or non-human primates.
Koko's displays of emotions also fascinated the public, especially those that seemed to closely mirror humans': she cared for pet kittens; appeared on Mr. Rogers' Neighborhood and untied the host's shoes for him; acted playfully with Robin Williams during a visit from him, and later expressed grief when told about the comedian's death. Koko died in her sleep in June 2018, at age 46. Patterson continues to run The Gorilla Foundation, which is dedicated to using inter-species communication to motivate conservation efforts.
4) DOLLY THE SHEEP
Dolly—named after country singer Dolly Parton—was the first mammal ever to be cloned from an adult somatic cell, using the process of nuclear transfer. She was born in 1996 as part of research by scientists Keith Campbell and Ian Wilmut of the University of Edinburgh.
Dolly the cloned sheep.
By taking a donor cell from an adult sheep's mammary gland, using it to replace the cell nucleus of an unfertilized, developing egg cell, and then bringing the resultant embryo to term, Campbell and Wilmut proved that even a mature cell (one that had developed to perform mammary gland functions) could revert to an embryonic state and go on to develop into any and all parts of a mammal.
Although cloned livestock are legal in the U.S. (the FDA approved the practice in 2008, after determining that there was no difference between the meat and milk of cloned and conventionally bred cattle, pigs, and goats), Dolly has had an even bigger impact on stem cell research. The successful test of nuclear transfer proved that it was possible to change a cell's gene expression by changing its nucleus.
Japanese stem cell biologist Shinya Yamanaka, inspired by the birth of Dolly, won the Nobel Prize in 2012 for his adaptation of the technique. He developed induced pluripotent stem cells (iPS cells) by chemically reverting mature cells back to an embryonic-like blank state that is highly desirable for disease research and treatment. This technique allows researchers to work with such stem cells without the ethically charged complication of having to destroy a human embryo in the process.
5) LAIKA THE PIG
Named in honor of the dog who made it to space, the second science-famous Laika was a genetically engineered pig born in China in 2015 as a result of gene editing carried out by Cambridge, MA startup eGenesis and collaborators.* eGenesis aims to create pigs whose organs—hearts, kidneys, lungs, and more—are safe to transplant into people.
Laika the gene-edited pig.
Using animal organs in humans (xenotransplantation) is tricky: the immune system is very good at recognizing interlopers, and the human body can start to reject an organ from another species in as little as five minutes. But pigs are otherwise exceptionally good potential donors for humans: their organs' sizes and functions are very similar, and their quick gestation and maturation make them attractive from an efficiency standpoint, given that twenty Americans die every day waiting for organ donors.
Perhaps unsurprisingly, Dolly the sheep helped move xenotransplantation forward. In the 1990s, immunologist David Sachs was able to use a similar cloning method to eliminate alpha-gal, a sugar molecule produced by most animals with immune systems, including pigs—but not humans. Because our bodies don't make alpha-gal, our immune systems attack it on sight, making it a major cause of organ rejection. Sachs' experiments increased the survival time of pig organs in primates to weeks: a huge improvement, but not nearly enough for someone in need of a liver or heart.
The advent of CRISPR technology, and the ability to edit genes, has allowed another leap. In 2015, researchers at eGenesis used targeted gene-editing to eliminate the genes for porcine endogenous retroviruses from pig kidney cells. These viral elements are part of all pigs' genomes and pose a potentially high risk of infecting human cells. (After the HIV/AIDS crisis especially, there was a lot of anxiety about potentially introducing a new virus into the human population.)
The eGenesis lab used nuclear transfer to embed the edited nuclei into egg cells taken from a normal pig, and Laika was born months later, free of the dangerous viral genes. eGenesis is now working to make the organs even more humanlike, with the goal of one day providing organs to every human patient in need.
*[Disclosure: In 2019, eGenesis received a series B investment from Leaps By Bayer, the funding sponsor of leapsmag. However, leapsmag is editorially independent of Bayer and is under no obligation to cover companies they invest in.]
[Correction, March 3, 2020: Laika the gene-edited pig was born in China, not Cambridge, and eGenesis is pursuing xenotransplant programs that include heart, kidney, and lung, but not skin, as originally written.]