Trading syphilis for malaria: How doctors treated one deadly disease by infecting patients with another
If you had lived one hundred years ago, syphilis – a bacterial infection spread by sexual contact – would likely have been one of your worst nightmares. Even though syphilis still exists, it can now be detected early and cured quickly with a course of antibiotics. Back then, however, before antibiotics and without an easy way to detect the disease, syphilis was very often a death sentence.
To understand how feared syphilis once was, it’s important to understand exactly what it does if it’s allowed to progress: the infection starts off as one or more small, painless sores near the vagina, penis, anus, or mouth. The sores disappear around three to six weeks after the initial infection – but untreated, syphilis moves into a secondary stage, often presenting as a mild rash in various areas of the body (such as the palms of the hands) or through other minor symptoms. The disease progresses from there, often quietly and without noticeable symptoms, sometimes for decades before it reaches its final stages, where it can cause blindness, organ damage, and even dementia. Research indicates, in fact, that as many as 10 percent of psychiatric admissions in the early 20th century were due to dementia caused by syphilis, also known as neurosyphilis.
Like many bacterial diseases, syphilis can affect children, too. Though it’s spread primarily through sexual contact, it can also be transmitted from mother to child during pregnancy or birth, causing lifelong disability.
The poet-physician Adalbert Bettman, who wrote fictionalized poems based on his experiences as a doctor in the 1930s, described the effect syphilis could have on an infant in his poem “Daniel Healy”:
I always got away clean
when I went out
With the boys.
The night before
I was married
I went out,—But was not so fortunate;
And I infected
My bride.
When little Daniel
Was born
His eyes discharged;
And I dared not tell
That because
I had seen too much
Little Daniel sees not at all
Given the horrors of untreated syphilis, it’s perhaps not surprising that people went to extremes to try to treat it. One of the earliest remedies for syphilis, dating back to 15th-century Naples, was mercury – either rubbed on the skin where blisters appeared or inhaled as a vapor. (Not surprisingly, many people who underwent this type of “treatment” died of mercury poisoning.)
Other primitive treatments included tinctures made from a flowering plant called guaiacum, as well as “sweat baths” meant to purge the syphilitic toxins. In 1910, an arsenic-based drug called Salvarsan hit the market and was hailed as a “magic bullet” for its ability to target and destroy the syphilis-causing bacteria without harming the patient. However, while Salvarsan was effective in treating early-stage syphilis, it was largely ineffective once the infection had progressed beyond the second stage. Tens of thousands of people each year continued to die of syphilis or were shipped off to psychiatric wards with neurosyphilis.
It was in one of these psychiatric units in the early 20th century that Dr. Julius Wagner-Jauregg got the idea for a potential cure.
Wagner-Jauregg was an Austrian-born physician trained in “experimental pathology” at the University of Vienna. He started his medical career conducting lab experiments on animals and then moved on to work at various psychiatric clinics in Vienna, despite having no training in psychiatry or neurology.
Wagner-Jauregg’s work was controversial, to say the least. At the time, medicine – particularly psychiatric medicine – had nowhere near the rigorous ethical standards that doctors, researchers, and other scientists are bound to today. Wagner-Jauregg would devise wild theories about the causes of his patients’ psychiatric ailments and then perform experimental procedures in an attempt to cure them. (As just one example, he sterilized adolescent male patients in the belief that “excessive masturbation” was the cause of their schizophrenia.)
But sometimes these wild theories paid off. In 1883, during his residency, Wagner-Jauregg noted that a female patient with mental illness who had contracted a skin infection and suffered a high fever experienced a sudden (and seemingly miraculous) remission of her psychosis symptoms after the fever had cleared. Wagner-Jauregg theorized that inducing a high fever in his patients with neurosyphilis could help them recover as well.
Eventually, Wagner-Jauregg was able to put his theory to the test. Around 1890, he got his hands on tuberculin, a therapeutic agent developed by the German microbiologist Robert Koch as a treatment for tuberculosis. Tuberculin would later turn out to be completely ineffective against tuberculosis, often triggering severe immune reactions in patients – but for a short time, Wagner-Jauregg had some success in using it to help his dementia patients. Giving his patients tuberculin induced a high fever – and after completing the treatment, Wagner-Jauregg reported that the progression of his patients’ dementia had completely halted. The success was short-lived, however: Wagner-Jauregg eventually had to discontinue tuberculin as a treatment because it came to be considered too toxic.
By 1917, Wagner-Jauregg’s theory about syphilis and fevers was gaining credibility – and one day a new opportunity presented itself when a wounded soldier, stricken with malaria and the high fever that accompanies it, was accidentally admitted to his psychiatric unit.
What Wagner-Jauregg did next was ethically deplorable by any standard: before he allowed the soldier any quinine (the standard treatment for malaria at the time), Wagner-Jauregg took a small sample of the soldier’s blood and inoculated three syphilis patients with it, rubbing the blood on their open syphilitic blisters.
It’s unclear how well the malaria treatment worked for those three specific patients – but Wagner-Jauregg’s records show that in the span of one year, he inoculated a total of nine patients with malaria for the sole purpose of inducing fevers, and six of them made a full recovery. The treatment was so successful, in fact, that one of his inoculated patients, an actor who had been unable to work due to his dementia, was eventually able to find work again and return to the stage. Two additional patients – a military officer and a clerk – recovered from their once-terminal illnesses and returned to their former careers as well.
When his findings were published in 1918, Wagner-Jauregg’s so-called “fever therapy” swept the globe. The treatment was hailed as a breakthrough – but it still had risks. Malaria itself had a mortality rate of about 15 percent at the time. Many people considered that a gamble worth taking compared with dying a painful, protracted death from syphilis.
Malaria, moreover, could usually be treated effectively with quinine, whereas other fever-causing illnesses could not be controlled so easily. Triggering a fever by way of malaria specifically, therefore, became the standard of care.
Tens of thousands of people with syphilitic dementia would go on to be treated with fever therapy until the early 1940s, when Salvarsan and then penicillin sent syphilis infections into decline. Eventually, neurosyphilis became rare, and then nearly unheard of.
Despite his contributions to medicine, it’s important to note that Wagner-Jauregg was most definitely not a person to idolize. He was an outspoken anti-Semite and proponent of eugenics, arguing that Jews were more prone to mental illness and that people who were mentally ill should be forcibly sterilized. (Wagner-Jauregg later became a Nazi sympathizer during Hitler’s rise to power even though, bizarrely, his first wife was Jewish.) Another problem was that his fever therapy was tested on many patients who, because of their cognitive impairments, could not give informed consent.
Lack of consent was also a fundamental problem with the syphilis study at Tuskegee, appalling research that began just 14 years after Wagner-Jauregg published his “fever therapy” findings.
Still, despite his outrageous views, Wagner-Jauregg was awarded the Nobel Prize in Physiology or Medicine in 1927 – and despite some egregious human rights abuses, the miraculous “fever therapy” was partly responsible for taming one of the deadliest plagues in human history.
The coronavirus pandemic exposed significant weaknesses in the country's food supply chain. Grocery store meat counters were bare. Transportation interruptions disrupted supply. Finding beef, poultry, and pork at the store became, in some places, as challenging as finding toilet paper.
It wasn't a lack of supply -- millions of animals were in the pipeline.
"There's certainly enough food out there, but it can't get anywhere because of the way our system is set up," said Amy Rowat, an associate professor of integrative biology and physiology at UCLA. "Having a more self-contained, self-sufficient way to produce meat could make the supply chain more robust."
Cultured meat could be one way of making the meat supply chain more resilient to disruptions from pandemics such as COVID-19. But is the country ready to embrace lab-grown food?
According to a Good Food Institute study, Gen Z is almost twice as likely to embrace meat alternatives for reasons related to social and environmental awareness, even prior to the pandemic. That's because this group wants food choices that reflect their values around food justice, equity, and animal welfare.
To date, interest in protein alternatives has largely centered on plant-based foods. However, factors directly related to COVID-19 may accelerate consumer interest in cell-grown products and efforts to scale them up, according to Liz Specht, the associate director of science and technology at The Good Food Institute, a nonprofit organization that supports scientists, investors, and entrepreneurs working to develop alternatives to conventional animal products.
While lab-grown food isn't yet ready to crisis-proof the food supply chain, experts say it offers promise.
Matching Supply and Demand
Companies developing cell-grown meat claim it can take as little as two months to develop a cell into an edible product, according to Anthony Chow, CFA at Agronomics Limited, an investment company focused on meat alternatives. Tissue is taken from an animal and placed in a culture medium that contains the nutrients and proteins the cells need to grow and multiply. He cites a Good Food Institute report claiming that a 2.5-millimeter sample can grow into three and a half tons of meat in 40 days, allowing for exponential growth when needed.
In traditional agriculture models, it takes at least three months to raise chickens, six to nine months for pigs, and 18 months for cattle. To keep enough maturing animals in the pipeline, farms must plan the number of animals to raise months -- even years -- in advance. Lab-grown meat advocates say that because cultured meat production is flexible, it could theoretically be scaled up or down in significantly less time.
"Supply and demand has drastically changed in some way around the world and cultivated meat processing would be able to adapt much quicker than conventional farming," Chow said.
Scaling Up
Lab-grown meat may provide an eventual solution, but not in the immediate future, said Paul Mozdziak, a professor of physiology at North Carolina State University who researches animal cell culture techniques, transgenic animal production, and muscle biology.
"The challenge is in culture media," he said. "It's going to take some innovation to get the cells to grow at quantities that are going to be similar to what you can get from an animal. These are questions that everybody in the space is working on."
Chow says some of the most advanced cultured meat companies, such as BlueNalu, anticipate introducing products to the market midway through next year, though he thinks COVID-19 has slowed the process. Once introduced, the products will carry a premium price and will most likely be available at restaurants before they hit grocery store shelves.
"I think in five years' time it will be in a different place," he said. "I don't think that this will have relevance for this pandemic, but certainly beyond that."
"Plant-based meats may be perceived as 'alternatives' to meat, whereas lab-grown meat is producing the same meat, just in a much more efficient manner, without the environmental implications."
Of course, all the technological solutions in the world won't solve the problem unless people are open-minded about embracing them. At least for now, a lab-grown burger or bluefin tuna might still be too strange for many people, especially in the U.S.
For instance, a 2019 article published in Frontiers in Sustainable Food Systems reports the results of a survey of 3,030 consumers, which found that 29 percent of U.S. consumers, 59 percent of Chinese consumers, and 56 percent of Indian consumers were either 'very' or 'extremely' likely to try cultivated meat.
"Lab-grown meat is genuine meat, at the cellular level, and therefore will match conventional meat with regard to its nutritional content and overall sensory experience. It could be argued that plant-based meat will never be able to achieve this," says Laura Turner, who works with Chow at Agronomics Limited. "Plant-based meats may be perceived as 'alternatives' to meat, whereas lab-grown meat is producing the same meat, just in a much more efficient manner, without the environmental implications."
A Solution Beyond This Pandemic
The coronavirus has done more than raise awareness of the fragility of food supply chains. It has also been a wake-up call for consumers and policy makers that it is time to radically rethink how we produce meat, Specht says. Those factors have elevated the profile of lab-grown meat.
"I think the economy is getting a little bit more steam and if I was an investor, I would be getting excited about it," adds Mozdziak.
Beyond crises, Mozdziak explains that as affluence rises globally, meat consumption increases exponentially. Yet farm animals can only grow so quickly, and traditional farming won't be able to keep up.
"Even Tyson is saying that by 2050, there's not going to be enough capacity in the animal meat space to meet demand," he notes. "If we don't look at some innovative technologies, how are we going to overcome that?"
By mid-March, Alpha Lee was growing restless. A pioneer of AI-driven drug discovery, Lee leads a team of researchers at the University of Cambridge, but his lab had been closed amidst the government-initiated lockdowns spreading inexorably across Europe.
Having spoken to his collaborators across the globe – many of whom were seeing their own experiments and research projects postponed indefinitely due to the pandemic – he noticed a similar sense of frustration and helplessness in the face of COVID-19.
While there was talk of finding a novel treatment for the virus, Lee was well aware the process was likely to be long and laborious. Traditional methods of drug discovery risked suffering the same fate as the efforts to find a cure for SARS in the early 2000s, which took years and were ultimately abandoned long before a drug ever reached the market.
To avoid such an outcome, Lee was convinced that global collaboration was required. Together with a collection of scientists in the UK, US and Israel, he launched the 'COVID Moonshot' – a project which encouraged chemists worldwide to share their ideas for potential drug designs. If the Moonshot proves successful, they hope it could serve as a future benchmark for finding new medicines for chronic diseases.
Solving a Complex Jigsaw
In February, researchers at ShanghaiTech University published the first detailed snapshots of the SARS-CoV-2 coronavirus's proteins, using a technique called X-ray crystallography. In particular, they revealed a high-resolution profile of the virus's main protease – the enzyme that enables the virus to replicate inside a host, and the main target for a drug. The images were tantalizing.
"We could see all the tiny pieces sitting in the structure like pieces of a jigsaw," said Lee. "All we needed was for someone to come up with the best idea of joining these pieces together with a drug. Then you'd be left with a strong molecule which sits in the protease, and stops it from working, killing the virus in the process."
Normally, ideas for how best to design such a drug would be kept as carefully guarded secrets within individual labs and companies because of their potential value. As a result, the steady process of trial and error needed to reach an optimal design can take years to come to fruition.
However, given the scale of the global emergency, Lee felt that the scientific community would be open to collective brainstorming on a mass scale. "Big Pharma usually wouldn't necessarily do this, but time is of the essence here," he said. "It was a case of, 'Let's just rethink every drug discovery stage to see -- ok, how can we go as fast as we can?'"
On March 13, he launched the COVID Moonshot, calling for chemists around the globe to come up with the most creative ideas they could think of, working from their laptops at home. No design was too weird or wacky to be considered, and crucially, nothing would be patented. The entire project would be done on a not-for-profit basis, meaning that any drug that makes it to market will have been created simply for the good of humanity.
It caught fire: Within just two weeks, more than 2,300 potential drug designs had been submitted. By the middle of July, over 10,000 had been received from scientists around the globe.
The Road Toward Clinical Trials
With so many designs to choose from, the team has been attempting to whittle them down to a shortlist of the most promising. Computational drug discovery experts at Diamond Light Source and the Weizmann Institute of Science in Rehovot, Israel, have helped the Moonshot team develop algorithms that predict how quickly and easily each design could be synthesized, and how well each proposed drug might bind to the virus in real life.
The latter approach, known as computational covalent docking, has previously been used in cancer research. "This was becoming more popular even before COVID-19, with several covalent drugs approved by the FDA in recent years," said Nir London, professor of organic chemistry at the Weizmann Institute and one of the Moonshot team members. "However, all of these were for oncology. A covalent drug against SARS-CoV-2 will certainly highlight covalent drug-discovery as a viable option."
Through this approach, the team has selected 850 compounds to date, which have already been manufactured and tested in a variety of preclinical experiments. Fifty of these compounds – which appear especially promising when it comes to killing the virus in a test tube – are now being optimized further.
Lee is hoping that at least one of these potential drugs will be shown to be effective in curing animals of COVID-19 within the next six months, a step that would allow the Moonshot team to reach out to potential pharmaceutical partners to test their compounds in humans.
Future Implications
If the project does succeed, some believe it could open the door to scientific crowdsourcing as a future means of generating novel medicine ideas for other diseases. Frank von Delft, professor of protein science and structural biology at the University of Oxford's Nuffield Department of Medicine, described it as a new form of 'citizen science.'
"There's a vast resource of expertise and imagination that is simply dying to be tapped into," he said.
Others are slightly more skeptical, pointing out that the uniqueness of the current crisis has meant that many scientists were willing to contribute ideas without expecting any future compensation in return. This meant that it was easy to circumvent the traditional hurdles that prevent large-scale global collaborations from happening – namely how to decide who will profit from the final product and who will hold the intellectual property (IP) rights.
"I think it is too early to judge if this is a viable model for future drug discovery," says London. "I am not sure that without the existential threat we would have seen so many contributions, and so many people and institutions willing to waive compensation and future royalties. Many scientists found themselves at home, frustrated that they don't have a way to contribute to the fight against COVID-19, and this project gave them an opportunity. Plus many can get behind the fact that this project has no associated IP and no one will get rich off of this effort. This breaks down a lot of the typical barriers and red-tape for wider collaboration."
"If a drug would sprout from one of these crowdsourced ideas, it would serve as a very powerful argument to consider this mode of drug discovery further in the future."
However, the Moonshot team believes that if it can succeed, it will at the very least send a strong message to policy makers and the scientific community that greater efforts should be made to make such large-scale collaborations more feasible.
"All across the scientific world, we've seen unprecedented adoption of open-science, collaboration and collegiality during this crisis, perhaps recognizing that only a coordinated global effort could address this global challenge," says London. "If a drug would sprout from one of these crowdsourced ideas, it would serve as a very powerful argument to consider this mode of drug discovery further in the future."
[An earlier version of this article was published on June 8th, 2020 as part of a standalone magazine called GOOD10: The Pandemic Issue. Produced as a partnership among LeapsMag, The Aspen Institute, and GOOD, the magazine is available for free online.]