Americans Fell for a Theranos-Style Scam 100 Years Ago. Will We Ever Learn?
The huckster understands what people want – an easy route to good health – and figures out just how to provide it, as long as no one asks too many questions.
"Americans are very much prone to this sort of thinking: Give me a pill or give me a magical bean that can make me lose weight!"
The keys to success: Hoopla, fancy technology, and gullibility. And oh yes, one more thing: a blood sample. Well, lots and lots of blood samples. Every testing fee counts.
Sound familiar? It could be the story of the preternaturally persuasive Elizabeth Holmes, the disgraced founder of Theranos who stands accused of perpetrating a massive blood-testing fraud. But this is a different story from a different time, one that dates back 100 years but sounds almost like it could unfold on the front page of The Wall Street Journal today.
The main difference: Back then, watchdogs thought they'd be able to vanquish fake medicine and scam science. Fat chance, it turned out. It seems we're more likely to lose-weight-quick than to make much of a dent in quackery and health fraud.
Why? Have we learned anything at all over the past century? As we sweep into a new decade, experts say we're not as advanced as we'd like to think. But the fight against fraud and fakery continues.
Quackery: As American As America Itself
In the 17th century, British healers of questionable reputation got a new name -- "quack," from the Dutch word "quacksalver," which originally referred to someone who treats others with home remedies but developed a new meaning along the lines of "charlatan." And these quacks got a new place to sell their wares: the American colonies.
By 1692, a Boston newspaper advertised a patent medicine that promised to cure "the Griping of the Guts, and the Wind Cholick" and – for good measure – "preventeth that woeful Distemper of the Dry Belly Ach." A couple centuries later, the most famous woman in the United States wasn't a first lady or feminist but a hawker of nostrums named Lydia Estes Pinkham whose "vegetable compound" promised to banish "female complaints." One advertisement suggested that the "sure cure" would have saved the life of a Connecticut clergyman whose wife killed him after suffering from feminine maladies for 16 years.
By the early 20th century, Americans were fascinated by electricity and radiation, and both healers and hucksters embraced the new high-tech era. Men with flagging libidos, for example, could irradiate their private parts with the radioactive Radiendocrinator or buy battery-powered electric belts equipped with dangling bits to supercharge their, um, dangling bits.
The Rise of the Radio Wave 'Cure'
Enter radionics, the (supposed) science of better health via radio waves. The idea was that "healthy people radiate healthy energy," and sickness could be reversed through diagnosis and re-tuning, write Dr. Lydia Kang and Nate Pedersen in their 2017 book "Quackery: A Brief History of the Worst Ways to Cure Everything."
Detecting illness and fixing it required machinery – Dynamizers, Radioclasts and Oscillocasts – that could cost hundreds of dollars each. Thousands of physicians bought them. Conveniently, they could work remotely, for a fee. The worried-and-potentially-unwell just needed to send a blood sample and, of course, a personal check.
Sting operations revealed radionics to be bogus. A skeptic sent a blood sample to one radionics practitioner in Albuquerque who reported back with news of an infected fallopian tube. In fact, the blood sample came from a male guinea pig. As an American Medical Association leader reported, the guinea pig "had shown no female characteristics up to that time, and a postmortem examination yielded no evidence of ladylike attributes."
When Quackery Refused to Yield
The rise of bogus medical technology in the early 20th century spawned a watchdog industry as organizations like the American Medical Association swept into action, said medical historian Eric Boyle, author of 2012's "Quack Medicine: A History of Combating Health Fraud in Twentieth-Century America."
"When quackery was recognized as a major problem, the people who campaigned for its demise were confident that they could get rid of it," he said. "A lot of people believed that increased education, the truths of science, and laws designed to protect consumers would ultimately drive quackery from the marketplace. And then throughout the century, as modern medicine developed, and more effectively treated one disease after another, many observers remained confident in that prediction."
But fake medicine persisted as Americans continued their quest to get-healthy-quick… or get-rich-quick by promising to help others get-healthy-quick. Even radionics refused to die. It's still around in various forms. And, as the Theranos scandal reveals, we're still hoping our blood can offer the keys to longevity and good health.
Why Do We Still Fall for Scams?
In our own era, the Theranos company rose to prominence when founder and CEO Elizabeth Holmes convinced journalists and investors that she'd found a way to cheaply test drops of blood for hundreds of conditions. Then it all fell apart, famously, when the world learned that the technology didn't work. The company has folded, and Holmes faces a federal trial on fraud charges this year.
"There were a lot of prominent, very smart people who bought into the myth of Elizabeth Holmes," a former employee told "60 Minutes," even though the blood tests never actually worked as advertised.
Shouldn't "prominent, very smart people" know better? "People are gullible," said Dr. Stephen Barrett, a psychiatrist and leading quack-buster who runs the QuackWatch website. But there's more to the story: according to him, different people are vulnerable to bogus medicine in different ways.
Scam artists specifically pinpoint their target audiences, such as "smart people," desperate people and alienated people, he said.
Smart people, for example, might be overconfident about their ability to detect fraud and fall for bogus medicine. Alienated people may distrust the establishment, whether it's the medical field or government watchdogs, and be more receptive to alternative sources of information.
Dr. Barrett also points a finger at magical thinking, which comes in different forms. It could mean a New Age-style belief that our minds can control the world around us. Or, as professional quack-buster Alex Berezow said, it could refer to "our cultural obsession with quick fixes."
"Americans are very much prone to this sort of thinking: Give me a pill or give me a magical bean that can make me lose weight! But complex problems need complex solutions," said Berezow, a microbiologist who debunks junk science in his job as a spokesman for the American Council on Science & Health.
American mistrust of expertise makes matters worse, he said. "When I tell people they need to get vaccinated, I'm called a shill for the pharmaceutical industry," he said. "If I say dietary supplements generally don't work, I'm a shill for doctors who want to keep people sick."
What can ordinary citizens do to protect themselves from fake medicine? "You have to have a healthy skepticism of everything," Berezow said. "When you come across something new, is someone trying to take advantage of you? It's a horrible way to think about the world, but there's some truth to it."
The government and experts have their own roles to play via regulation and education, respectively. For all the criticism it gets, the Food and Drug Administration does serve as a bulwark against fakery in prescription medicine. And while celebrities like Gwyneth "Goop" Paltrow hawk countless questionable medical products on the Internet, scientists and physicians are fighting back by using social media as a tool to promote the truth. There's a bid to "flood the information highway with truth to turn the storm of fake promotional stuff into a trickle," said Dr. Randi Hutter Epstein, a writer in residence at Yale School of Medicine and author of 2018's "Aroused: The History of Hormones and How They Control Just About Everything."
What's next? Like death, taxes and Cher, charlatans are likely to always be with us. Boyle quoted the late William Jarvis, a pioneering quack-buster in the late 20th century who believed health fraud would never be eradicated: "Like any chronic disease, we will have to live with it while we do our best to fight it."
Technology is Redefining the Age of 'Older Mothers'
In October 2021, a woman from Gujarat, India, stunned the world when it was revealed she had her first child through in vitro fertilization (IVF) at age 70. She had actually been preceded by a compatriot of hers who, two years before, gave birth to twins at the age of 73, again with the help of IVF treatment. The oldest known mother to conceive naturally lived in the UK; in 1997, Dawn Brooke conceived a son at age 59.
These women may seem extreme outliers, almost freaks of nature; in the US, for example, the average age of first-time mothers is 26. A few decades from now, though, the sight of 70-year-old first-time mothers may not even raise eyebrows, say futurists.
“We could absolutely have more 70-year-old mothers because we are learning how to regulate the aging process better,” says Andrew Hessel, a microbiologist and geneticist, who cowrote "The Genesis Machine," a book about “rewriting life in the age of synthetic biology,” with Amy Webb, the futurist who recently wondered why 70-year-old women shouldn’t give birth.
Technically, we're already doing this, says Hessel, pointing to a technique known as in vitro gametogenesis (IVG). IVG refers to turning adult cells into sperm or egg cells. “You can think of it as the upgrade to IVF,” Hessel says. These vanguard stem cell technologies can turn even skin cells into induced pluripotent stem cells (iPSCs), which are basically master cells capable of maturing into any human cell, be it kidney cells, liver cells, brain cells or gametes, aka eggs and sperm, says Henry T. “Hank” Greely, a Stanford law professor who specializes in ethical, legal, and social issues in biosciences.
In 2016, Greely wrote "The End of Sex," a book in which he described the science of making gametes out of iPSCs in detail. Greely says science will indeed enable us to see 70-year-old new mums fraternize with mothers several decades younger at kindergartens in the (not far) future. And it won’t be that big of a deal.
“An awful lot of children all around the world have been raised by grandmothers for millennia. To have 70-year-olds and 30-year-olds mingling in maternal roles is not new,” he says. That said, he doubts that many women will want to have a baby in the eighth decade of their life, even if science allows it. “Having a baby and raising a child is hard work. Even if 1% of all mothers are over 65, they aren’t going to change the world,” Greely says. Mothers over 70 will be a minor blip, statistically speaking, he predicts. But one thing is certain: the technology is here.
And more technologies for the same purpose could be on the way. In March 2021, researchers from Monash University in Melbourne, Australia, published research in Nature in which they successfully reprogrammed skin cells into a three-dimensional cellular structure that was morphologically and molecularly similar to a human embryo – the iBlastoid. In compliance with Australian law and international guidelines referencing the “primitive streak rule," which bans the use of embryos older than 14 days in scientific research, the Monash scientists stopped growing their iBlastoids in vitro on day 11.
“The research was both cutting-edge and controversial, because it essentially created a new human life, not for the purpose of a patient who's wanting to conceive, but for basic research,” says Lindsay Wu, a senior lecturer in the School of Medical Sciences at the University of New South Wales (UNSW), in Kensington, Australia. If you really want to make sure what you are breeding is an embryo, you need to let it develop into a viable baby. “This is the real proof in the pudding,” says Wu, who runs UNSW’s Laboratory for Ageing Research. Then you get to a stage where you decide for ethical purposes you have to abort it. “Fiddling here a bit too much?” he asks. Wu believes there are other approaches to tackling declining fertility due to older age that are less morally troubling.
He is actually working on them. Why would it be that women, who are at peak physical health in almost every other regard in their mid- to late thirties, have problems conceiving, asked Wu and his team in a research paper published in 2020 in Cell Reports. The simple answer is the egg cell. An average girl in puberty has between 300,000 and 400,000 eggs, while at around age 37, the same woman has only 25,000 eggs left. Things only go downhill from there. So, what torments the egg cells?
The UNSW team found that the levels of key molecules called NAD+ precursors, which are essential to the metabolism and genome stability of egg cells, decline with age. The team proceeded to add these vitamin-like substances back into the drinking water of reproductively aged, infertile lab mice, which then had babies.
“It's an important proof of concept,” says Wu. He is investigating how safe it is to replicate the experiment with humans in two ongoing studies. The ultimate goal is to restore the quality of egg cells that are left in patients in their late 30s and early- to mid-40s, says Wu. He sees the goal of getting pregnant for this age group as less ethically troubling, compared to 70-year-olds.
But what is ethical, anyway? “It is a tricky word,” says Hessel. He differentiates between ethics, which represent a personal position and may, thus, be more transient, and morality, longer lasting principles embraced across society such as, “Thou shalt not kill.” Unprecedented advances often bring out fear and antagonism until time passes and they just become…ordinary. When IVF pioneer Landrum Shettles tried to perform IVF in 1973, the chairman of Columbia’s College of Physicians and Surgeons interdicted the procedure at the last moment. Almost all countries in the world have IVF clinics today, and the global IVF services market is clearly a growth industry.
Besides, you don’t have a baby at 70 by accident: you really want it, Greely and Hessel agree. And by that age, mothers may be wiser and more financially secure, Hessel says (though he is quick to add that even the pregnancy of his own wife, who had her child at 40, was a high-risk one).
As a research question, figuring out whether older mothers are better than younger ones and vice versa entails too many confounding variables, says Greely. And why should we focus on who’s the better mother anyway? “We've had 70-year-old and 80-year-old fathers forever – why should people have that much trouble getting used to mothers doing the same?” Greely wonders. For some women, having a child at an old(er) age would be comforting; maybe that’s what matters.
And the technology to enable older women to have children is already here or coming very soon. That, perhaps, matters even more. Researchers have already created mice – and their offspring – entirely from scratch in the lab. “Doing this to produce human eggs is similar," says Hessel. "It is harder to collect tissues, and the inducing cocktails are different, but steady advances are being made." He predicts that the demand for fertility treatments will keep financing research and development in the area. He says that big leaps will be made if ethical concerns don’t block them: it is not far-fetched to believe that the first baby produced from lab-grown eggs will be born within the next decade.
In a 2020 op-ed in Stat, Greely argued that we’ve already overcome the technical barrier for human cloning, but no one's really talking about it. Likewise, scientists are also working on enabling 70-year-old women to have babies, says Hessel, but most commentators are keeping really quiet about it. At least so far.
New Cell Therapies Give Hope to Diabetes Patients
For nearly four decades, George Huntley has thought constantly about his diabetes. Diagnosed in 1983 with Type 1 (insulin-dependent) diabetes, Huntley began managing his condition with daily finger sticks to check his blood glucose levels and doses of insulin that he injected into his abdomen. Even now, with an insulin pump and a device that continuously monitors his glucose, he must consider how every meal will affect his blood sugar, checking his monitor multiple times each hour.
Like many of those who depend on insulin injections, Huntley is simultaneously grateful for the technology that makes his condition easier to manage and tired of thinking about diabetes. If he could wave a magic wand, he says, he would make his diabetes disappear. So when he read about biotechs like ViaCyte and Vertex Pharmaceuticals developing new cell therapies that have the potential to cure Type 1 diabetes, Huntley was excited.
But you won’t see him signing up any time soon. The therapies under development by both companies would require a lifelong regimen of drugs to suppress the immune system and prevent the body from rejecting the foreign cells. It’s a problem also seen in the transplant of insulin-producing cells of the pancreas – called islet cells – from deceased donors. To Howard Foyt, chief medical officer at ViaCyte, a San Diego-based biotech specializing in the development of cell therapies for diabetes, the tradeoff is worth it.
“A lot of the symptoms of diabetes are not something that you wear on your arm, so to speak. You’re not necessarily conscious of them until you’re successfully treated, and you feel better,” Foyt says.
For many with diabetes, managing these symptoms is a constant game of Whack-a-Mole. “Any form of treatment that gets someone closer to feeling good is a victory,” he says.
But not everyone is convinced. What’s more, it’s likely that the availability of these cell therapies will be limited to those with life-threatening diabetes symptoms, such as hypoglycemia unawareness. To Huntley, these therapies remain a bit of a Faustian bargain.
“Am I going to be trading diabetes for cancer? That’s not a chance I want to take,” he says.
The discovery of insulin in 1921 transformed Type 1 diabetes from a death sentence into a potentially manageable condition. Even as better versions of insulin hit the market—ones that weren’t derived from pigs and wouldn’t provoke an allergic response, longer-acting insulin, insulin pens—they didn’t change the reality that those with Type 1 diabetes remained dependent on insulin. Even the most advanced continuous glucose monitors (which test blood sugar levels every few minutes, 24/7) and insulin pumps don’t perform as well as a healthy pancreas.
Whether by injection or pump, someone with diabetes needs to administer the insulin their body no longer makes. With advances in organ transplantation, the concept of transplanting insulin-producing pancreatic beta cells seemed obvious. After more than a decade of painstaking work, James Shapiro, who directs the Islet Transplant Program at the University of Alberta, honed a process called the Edmonton Protocol for pancreas transplants. For a few patients who couldn’t control their blood sugars any other way, the Edmonton Protocol became a life saver. Some of these patients were even able to stop insulin completely, Shapiro says. But the high cost of organ transplant and a chronic shortage of donor organs, pancreas or otherwise, meant that only a small handful of patients could benefit.
Stem cells, however, can be grown in vats, meaning that supply would never be an issue. “We would be going from a very successful treatment of today to a potential cure tomorrow,” Shapiro says.
In 2014, spurred by his own children’s diagnoses with Type 1 diabetes, stem cell biologist Doug Melton of Harvard University figured out a way to differentiate embryonic stem cells into functional pancreatic beta cells. It was a long process, explains immunoengineer Alice Tomei at the University of Miami, because “the islet is not one cell, it's like a mini-organ that has its own needs.”
Add on the risk of rejection and autoimmunity, and Tomei says that scientists soon realized that chronic and systemic immunosuppression was the only way forward. Over the next several years, Melton improved his approach to yield more cells with fewer impurities. Melton partnered with Boston-based Vertex Pharmaceuticals to create a cell therapy called VX-880.
The first patient received his dose earlier in 2021. In October, Vertex released 90-day results from the Phase 1/2 trial, which revealed the patient was able to reduce his insulin usage from an average of 34 units per day to just 2.9 units per day. The tradeoff is a lifelong need for immunosuppressive drugs to prevent the body from attacking both foreign cells and pancreatic beta cells. It’s what recipients of ViaCyte’s first-gen PEC-Direct will also need. For Foyt, it’s an easy choice.
“At this point in time, immunosuppression is the necessary evil,” he says. “For parents, would you like to worry about going into your child’s bedroom every morning and not knowing if they’re going to be alive or dead? It’s uncommon, but it does occur.”
Not everyone, however, finds the trade-off easy to swallow. Especially with COVID-19 cases reaching record highs, the prospect of reducing his immune function at a time when he needs it most doesn’t sit well with Huntley. The risks of immunosuppression also mean that diabetes cell therapies are limited to those patients with life-threatening complications.
It’s why ViaCyte has created two new iterations of cellular therapies that would eliminate this need. PEC-Encap contains the cells in a permeable container that allows oxygen, insulin, and nutrients to flow freely but prevents immune system access. The latest model, PEC-QT, just began safety trials with Shapiro’s lab at the University of Alberta and uses gene editing to eliminate any cellular markers that would trigger an immune response.
Sanjoy Dutta, vice president of research at JDRF International, a nonprofit that funds the study of diabetes, is thrilled with the progress that’s been made around cell therapies, but he cautions it’s still early days. “We have proven that these cells can be made. What we haven’t seen is are they going to work for six months, two years, five years? It’s a challenge we still need to overcome,” he says.
Iowa social worker Jodi Lynn’s concerns echo Dutta’s. Lynn, who was diagnosed with diabetes in 1998 at age 14 after a bout of severe influenza, spends each day inventorying supplies, planning her food intake, and maintaining her insulin pump and glucose monitor. These newer technologies dramatically improved her blood sugar control but, like everyone with diabetes, Lynn remains at high risk for complications, such as diabetic ketoacidosis, heart disease, vision loss, and kidney failure. Lynn, already considered immunocompromised due to medications she takes for another autoimmune condition, is less concerned with immune suppression than with the untested nature of these therapies.
“I want to know that they will work long-term,” she says.