Trading syphilis for malaria: How doctors treated one deadly disease by infecting patients with another
If you had lived one hundred years ago, syphilis – a bacterial infection spread by sexual contact – would likely have been one of your worst nightmares. Even though syphilis still exists, it can now be detected early and cured quickly with a course of antibiotics. Back then, however, before antibiotics and without an easy way to detect the disease, syphilis was very often a death sentence.
To understand how feared syphilis once was, it’s important to understand exactly what it does if it’s allowed to progress: the infection starts off as one or more small, painless sores near the vagina, penis, anus, or mouth. The sores disappear around three to six weeks after the initial infection – but untreated, syphilis moves into a secondary stage, often presenting as a mild rash in various areas of the body (such as the palms of a person’s hands) or through other minor symptoms. The disease progresses from there, often quietly and without noticeable symptoms, sometimes for decades, before it reaches its final stages, where it can cause blindness, organ damage, and even dementia. Research indicates, in fact, that as many as 10 percent of psychiatric admissions in the early 20th century were due to dementia caused by syphilis, also known as neurosyphilis.
Syphilis can affect children, too. Though it’s spread primarily through sexual contact, it can also be transmitted from mother to child during pregnancy or birth, causing lifelong disability.
The poet-physician Adalbert Bettman, who wrote fictionalized poems based on his experiences as a doctor in the 1930s, described the effect syphilis could have on an infant in his poem “Daniel Healy”:
I always got away clean
when I went out
With the boys.
The night before
I was married
I went out,—But was not so fortunate;
And I infected
My bride.
When little Daniel
Was born
His eyes discharged;
And I dared not tell
That because
I had seen too much
Little Daniel sees not at all
Given the horrors of untreated syphilis, it’s perhaps not surprising that people went to extremes to try to treat it. One of the earliest remedies for syphilis, dating back to 15th-century Naples, was mercury – either rubbed on the skin where blisters appeared, or inhaled as a vapor. (Unsurprisingly, many people who underwent this type of “treatment” died of mercury poisoning.)
Other primitive treatments included tinctures made from a flowering plant called guaiacum, as well as “sweat baths” meant to purge the syphilitic toxins. In 1910, an arsenic-based drug called Salvarsan hit the market and was hailed as a “magic bullet” for its ability to target and destroy the syphilis-causing bacteria without harming the patient. However, while Salvarsan was effective in treating early-stage syphilis, it was largely ineffective once the infection had progressed beyond the second stage. Tens of thousands of people continued to die of syphilis each year, or were shipped off to psychiatric wards with neurosyphilis.
It was in one of these psychiatric units in the early 20th century that Dr. Julius Wagner-Jauregg got the idea for a potential cure.
Wagner-Jauregg was an Austrian-born physician trained in “experimental pathology” at the University of Vienna. He started his medical career conducting lab experiments on animals and then moved on to work at different psychiatric clinics in Vienna, despite having no training in psychiatry or neurology.
Wagner-Jauregg’s work was controversial, to say the least. At the time, medicine – particularly psychiatric medicine – had nowhere near the rigorous ethical standards that doctors, researchers, and other scientists are bound to today. Wagner-Jauregg would devise wild theories about the causes of his patients’ psychiatric ailments and then perform experimental procedures in an attempt to cure them. (As just one example, he sterilized his adolescent male patients in the belief that “excessive masturbation” was the cause of their schizophrenia.)
But sometimes these wild theories paid off. In 1883, during his residency, Wagner-Jauregg noted that a female psychiatric patient who had contracted a skin infection and suffered a high fever experienced a sudden (and seemingly miraculous) remission of her psychosis symptoms after the fever had cleared. Wagner-Jauregg theorized that inducing a high fever in his patients with neurosyphilis could help them recover as well.
Eventually, Wagner-Jauregg was able to put his theory to the test. Around 1890, he got his hands on tuberculin, a therapeutic agent created by the German microbiologist Robert Koch to cure tuberculosis. Tuberculin would later turn out to be completely ineffective against tuberculosis, often triggering severe immune responses in patients – but for a short time, Wagner-Jauregg had some success in using it to help his dementia patients. Giving his patients tuberculin produced a high fever – and after completing the treatment, Wagner-Jauregg reported that their dementia had been completely halted. The success was short-lived, however: he eventually had to discontinue tuberculin as a treatment, as it came to be considered too toxic.
By 1917, Wagner-Jauregg’s theory about syphilis and fevers was gaining credibility – and a new opportunity presented itself when a wounded soldier, stricken with malaria and its accompanying fever, was accidentally admitted to his psychiatric unit.
What Wagner-Jauregg did next was ethically deplorable by any standard: before giving the soldier any quinine (the standard treatment for malaria at the time), he took a small sample of the soldier’s blood and inoculated three syphilis patients with it, rubbing the blood on their open syphilitic blisters.
It’s unclear how well the malaria treatment worked for those three specific patients – but Wagner-Jauregg’s records show that in the span of one year, he inoculated a total of nine patients with malaria for the sole purpose of inducing fevers, and six of them made a full recovery. The treatment was so successful, in fact, that one of his inoculated patients, an actor who had been unable to work due to his dementia, was eventually able to find work again and return to the stage. Two additional patients – a military officer and a clerk – recovered from their once-terminal illnesses and returned to their former careers as well.
When his findings were published in 1918, Wagner-Jauregg’s so-called “fever therapy” swept the globe. The treatment was hailed as a breakthrough – but it still had risks. Malaria itself had a mortality rate of about 15 percent at the time. Many people considered that a gamble worth taking compared with dying a painful, protracted death from syphilis.
Malaria, moreover, could usually be treated effectively with quinine, whereas other fever-causing illnesses were not so easily controlled. Triggering a fever by way of malaria specifically therefore became the standard of care.
Tens of thousands of people with syphilitic dementia would go on to be treated with fever therapy until the early 1940s, when a combination of Salvarsan and penicillin caused syphilis infections to decline. Eventually, neurosyphilis became rare, and then nearly unheard of.
Despite his contributions to medicine, it’s important to note that Wagner-Jauregg was most definitely not a person to idolize. In fact, he was an outspoken anti-Semite and proponent of eugenics, arguing that Jews were more prone to mental illness and that people who were mentally ill should be forcibly sterilized. (Wagner-Jauregg later became a Nazi sympathizer during Hitler’s rise to power even though, bizarrely, his first wife was Jewish.) Another problem was that his fever therapy involved experimental treatments on many patients who, due to their cognitive impairments, could not give informed consent.
Lack of consent was also a fundamental problem with the syphilis study at Tuskegee, appalling research that began just 14 years after Wagner-Jauregg published his “fever therapy” findings.
Still, despite his outrageous views, Wagner-Jauregg was awarded the Nobel Prize in Physiology or Medicine in 1927 – and despite some egregious human rights abuses, the miraculous “fever therapy” was partly responsible for taming one of the deadliest plagues in human history.
Matt Trau, a professor of chemistry at the University of Queensland, stunned the science world back in December when the prestigious journal Nature Communications published his lab's discovery about a unique property of cancer DNA that could lead to a simple, cheap, and accurate test to detect any type of cancer in under 10 minutes.
Trau granted very few interviews in the wake of the news, but he recently opened up to leapsmag about the significance of this promising early research. Here is his story in his own words, as told to Editor-in-Chief Kira Peikoff.
There's been an incredible explosion of knowledge over the past 20 years, particularly since the genome was sequenced. The area of diagnostics has a tremendous amount of promise and has caught our lab's interest. If you catch cancer early, you can improve survival rates to as high as 98 percent, sometimes even higher.
My lab is interested in devices to improve the trajectory of cancer patients. So, once people get diagnosed, can we get really sophisticated information about the molecular origins of the disease, and can we measure it in real time? And then can we match that with the best treatment and monitor it in real time, too?
I think those approaches, also coupled with immunotherapy, where one dreams of monitoring the immune system simultaneously with the disease progress, will be the future.
But currently, the methodologies for cancer are still pretty old. So, for example, let's talk about biopsies in general. Liquid biopsy just means using a blood test or a urine test, rather than extracting a piece of solid tissue. Now consider breast cancer. The cutting-edge screening method is still mammography or a physical examination for lumps. This has had a big impact in terms of early detection and awareness, but it's still primitive compared to forensically interrogating blood samples for traces of DNA.
Large machines for CAT scans, PET scans, and MRIs are very expensive and very subjective in terms of the operator. They don't look at the root causes of the cancer. Cancer is caused by changes in DNA. These can be changes in the hard drive of the DNA (the genomic changes) or changes in the apps that the DNA is running (the epigenetics and the transcriptomics).
We don't look at that now, even though we have, emerging, all of these technologies to do it, and those technologies are getting so much cheaper. I saw some statistics at a conference just a few months ago that, in the United States, less than 1 percent of cancer patients have their DNA interrogated. That's the current state-of-the-art in the modern medical system.
Blood, as the highway of the body, is carrying all of this information. Cancer cells, if they are present in the body, are constantly getting turned over. When they die, they release their contents into the blood. Many of these contents end up in the urine and saliva. Having technologies that can forensically scan the highways looking for evidence of cancer is a little bit like looking for explosives at the airport. That's very valuable as a security tool.
The trouble is that there are thousands of different types of cancer. Going back to breast cancer, there are at least a dozen different types, probably more, and each of them changes the DNA (the hard drive of the disease) and the epigenetics (or the RAM memory). So one of the problems for diagnostics in cancer is to find something that is a signature of all cancers. That's been a really, really, really difficult problem.
Ours was a completely serendipitous discovery. What we found in the lab was this one marker that just kept coming up in all of the types of breast cancers we were studying.
No one believed it. I didn't believe it. I thought, "Gosh, okay, maybe it's a fluke, maybe it works just for breast cancer." So we went on to test it in prostate cancer, which is also many different types of diseases, and it seemed to be working in all of those. We then tested it further in lymphoma. Again, many different types of lymphoma. It worked across all of those. We tested it in gastrointestinal cancer. Again, many different types, and still, it worked, but we were skeptical.
Then we looked at cell lines, which are cells that have come from previous cancer patients, that we grow in the lab, but are used as model experimental systems. We have many of those cell lines, both ones that are cancerous, and ones that are healthy. It was quite remarkable that the marker worked in all of the cancer cell lines and didn't work in the healthy cell lines.
What could possibly be going on?
Well, imagine DNA as a piece of string, that's your hard drive. Epigenetics is like the beads that you put on that string. Those beads you can take on and off as you wish and they control which apps are run, meaning which genetic programs the cell runs. We hypothesized that for cancer, those beads cluster together, rather than being randomly distributed across the string.
The implications of this are profound. It means that DNA from cancer folds in water into three-dimensional structures that are very different from those of healthy cells' DNA. It's the proverbial needle in a haystack. Because when you do a liquid biopsy for early detection of cancer, the DNA in the blood is overwhelmingly healthy DNA. And that's not of interest. What's of interest is finding the cancerous DNA, which is there only in trace amounts.
Once we figured out what was going on, we could easily set up a system to detect the trace cancerous DNA. The cancerous DNA binds to gold nanoparticles in water and changes the color of the solution. The test takes 10 minutes, and you can read it by eye: red indicates cancer, blue doesn't.
We're very, very excited about where we go from here. We're starting to test the test on a greater number of cancers, in thousands of patient samples. We're looking to the scientific community to engage with us, and we're getting a really good response from groups around the world who are supplying more samples to us so we can test this more broadly.
We are also very interested in testing how early we can go with this test. Can we detect cancer through a simple blood test even before there are any symptoms whatsoever? If so, we might be able to convert a cancer diagnosis into something almost as good as a vaccine.
Of course, we have to watch what are called false positives. We don't want to be detecting people as positives when they don't have cancer, and so the technology needs to improve there. We see this version as the iPhone 1. We're interested in the iPhone 2, 3, 4, getting better and better.
Ultimately, I see this as something that would be like a pregnancy test you could take at your doctor's office. If it came back positive, your doctor could say, "Look, there's some news here, but actually, it's not bad news, it's good news. We've caught this so early that we will be able to manage this, and this won't be a problem for you."
If this were to be in routine use in the medical system, countless lives could be saved. Cancer is now becoming one of the biggest killers in the world. We're talking millions upon millions upon millions of people who are affected. This really motivates our work. We might make a difference there.
Kira Peikoff was the editor-in-chief of Leaps.org from 2017 to 2021. As a journalist, her work has appeared in The New York Times, Newsweek, Nautilus, Popular Mechanics, The New York Academy of Sciences, and other outlets. She is also the author of four suspense novels that explore controversial issues arising from scientific innovation: Living Proof, No Time to Die, Die Again Tomorrow, and Mother Knows Best. Peikoff holds a B.A. in Journalism from New York University and an M.S. in Bioethics from Columbia University. She lives in New Jersey with her husband and two young sons. Follow her on Twitter @KiraPeikoff.
Ethan Lindenberger, the Ohio teenager who sought out vaccinations after he was denied them as a child, recently testified before Congress about why his parents became anti-vaxxers. The trouble, he believes, stems from the pervasiveness of misinformation online.
"For my mother, her love and affection and care as a parent was used to push an agenda to create a false distress," he told the Senate Committee. His mother read posts on social media saying vaccines are dangerous, and that was enough to persuade her against them.
His story is an example of how widespread and harmful the current discourse on vaccinations is – and, more importantly, of how traditional strategies to convince people of the merits of vaccination have largely failed.
As responsible members of society, all of us have implicitly signed on to what ethicists call the "Social Contract" – we agree to abide by certain moral and political rules of behavior. This is what our societal values, norms, and often governments are based upon. However, with the unprecedented rise of social media, alternative facts, and fake news, it is evident that our understanding – and application – of the social contract must also evolve.
Nowhere is this breakdown of societal norms more visible than in the failure to contain the spread of vaccine-preventable diseases like measles. What started off as isolated cases in New York City last October, mostly in under-vaccinated communities, has exploded into a national epidemic: 880 cases of measles across 24 states in 2019, according to the CDC (as of May 17, 2019). In fact, the United States is only eight months away from losing its "measles free" status, joining Venezuela as only the second country in the Americas to lose that status.
The U.S. is not the only country facing this growing problem. The constant and perilous reemergence of measles and other vaccine-preventable diseases in various parts of the world raises doubts about the efficacy of current vaccination policies. In addition to the loss of valuable lives, these outbreaks waste millions of dollars in scarce healthcare resources. While we may be living through an age of information, we are also navigating an era whose hallmark is a massive onslaught on truth.
There is ample evidence of how these outbreaks start: low vaccination rates. At the same time, there is evidence that 'educating' people with facts about the benefits of vaccination may not be effective. Indeed, human reasoning has a limit, and facts alone rarely change a person's opinion. In a fascinating report by researchers from the University of Pennsylvania, a small experiment revealed how "behavioral nudges" could inform policy decisions around vaccination.
In the reported experiment, the vaccination rate among employees of a company increased by 1.5 percent when they were prompted to name the date when they planned to get their flu shot. In the same experiment, when employees were prompted to name both a date and a time for their planned flu shot, the vaccination rate increased by 4 percent.
This experiment is part of the emerging field of behavioral economics – a scientific undertaking that uses insights from psychology to understand human decision-making. The field was born from the humbling realization that humans probably do not possess an unlimited capacity for processing information. Work in this field could inform how we formulate vaccination policy that is effective, conserves healthcare resources, and is compatible with current societal norms.
Take, for instance, the case of human papillomavirus (HPV), which can cause several types of cancer in both men and women. Research into the quality of physician communication has repeatedly revealed how lukewarm recommendations for HPV vaccination from primary care physicians likely contribute to the under-immunization of eligible adolescents and can cause confusion for parents.
A randomized trial revealed the subtle power of "announcements" – direct, brief, assertive statements by physicians that assumed parents were ready to vaccinate their children. These announcements increased vaccination rates by 5.4 percent. Lengthy, open-ended dialogues demonstrated no benefit in vaccination rates. It seems that uncertainty from the physician translates to unwillingness from a parent.
Choice architecture is another compelling concept. The premise is simple: We hardly make any of our decisions in a vacuum; the environment in which these decisions are made has an influence. If health systems were designed with these insights in mind, people would be more likely to make better choices – without being forced.
This theory, proposed by Richard Thaler, who won the 2017 Nobel Prize in Economics, was put to the test by physicians at the University of Pennsylvania. In their study, flu vaccination rates at primary care practices increased by 9.5 percent simply because the staff implemented an "active choice intervention" in their electronic health records – a prompt that nudged doctors and nurses to ask patients whether they'd gotten the vaccine yet. This study illustrated how an intervention as simple as a reminder can save lives.
To be sure, some bioethicists do worry about implementing these policies. Are behavioral nudges akin to increased scrutiny or a burden for the disadvantaged? For example, would incentives to quit smoking unfairly target the poor, who are more likely to receive criticism for bad choices?
While this is a valid concern, behavioral economics offers one of the only ethical solutions for increasing vaccination rates while addressing the most critical – and often legal – challenge to universal vaccination: mandates. Choice architecture and other such interventions encourage and inform a choice, allowing an individual to retain his or her right to refuse unwanted treatment. This distinction is especially important, as evidence suggests that people who refuse vaccinations often do so as a result of cognitive biases – systematic errors in thinking that stem from emotional attachment or a lack of information.
For instance, people are prone to "confirmation bias," or a tendency to selectively believe in information that confirms their preexisting theories, rather than the available evidence. At the same time, people do not like mandates. In such situations, choice architecture provides a useful option: people are nudged to make the right choice via the design of health delivery systems, without needing policies that rely on force.
The measles outbreak is a sober reminder of how devastating it can be when the social contract breaks down and people fall prey to misinformation. But all is not lost. As we fight a larger societal battle against alternative facts, we now have another option in the trenches to subtly encourage people to make better choices.
Using insights from research in decision-making, we can all contribute meaningfully to controversial conversations with family, friends, neighbors, colleagues, and our representatives – and push for policies that protect those we care about. A little more than a hundred years ago, thousands of lives were routinely lost to preventable illnesses. We've come too far to let ignorance destroy us now.