Trading syphilis for malaria: How doctors treated one deadly disease by infecting patients with another
If you had lived one hundred years ago, syphilis – a bacterial infection spread by sexual contact – would likely have been one of your worst nightmares. Even though syphilis still exists, it can now be detected early and cured quickly with a course of antibiotics. Back then, however, before antibiotics and without an easy way to detect the disease, syphilis was very often a death sentence.
To understand how feared syphilis once was, it’s important to understand exactly what it does if it’s allowed to progress: the infection starts off as one or more small, painless sores near the vagina, penis, anus, or mouth. The sores disappear around three to six weeks after the initial infection – but untreated, syphilis moves into a secondary stage, often presenting as a mild rash in various areas of the body (such as the palms of a person’s hands) or through other minor symptoms. The disease progresses from there, often quietly and without noticeable symptoms, sometimes for decades before it reaches its final stages, where it can cause blindness, organ damage, and even dementia. Research indicates, in fact, that as many as 10 percent of psychiatric admissions in the early 20th century were due to dementia caused by syphilis, also known as neurosyphilis.
Like any bacterial disease, syphilis can affect kids, too. Though it’s spread primarily through sexual contact, it can also be transmitted from mother to child during birth, causing lifelong disability.
The poet-physician Adalbert Bettman, who wrote fictionalized poems based on his experiences as a doctor in the 1930s, described the effect syphilis could have on an infant in his poem Daniel Healy:
I always got away clean
when I went out
With the boys.
The night before
I was married
I went out,—But was not so fortunate;
And I infected
My bride.
When little Daniel
Was born
His eyes discharged;
And I dared not tell
That because
I had seen too much
Little Daniel sees not at all
Given the horrors of untreated syphilis, it’s perhaps not surprising that people went to extremes to try to treat it. One of the earliest remedies for syphilis, dating back to 15th-century Naples, was mercury – either rubbed on the skin where blisters appeared or inhaled as a vapor. (Not surprisingly, many people who underwent this type of “treatment” died of mercury poisoning.)
Other primitive treatments included using tinctures made of a flowering plant called guaiacum, as well as inducing “sweat baths” to eliminate the syphilitic toxins. In 1910, an arsenic-based drug called Salvarsan hit the market and was hailed as a “magic bullet” for its ability to target and destroy the syphilis-causing bacteria without harming the patient. However, while Salvarsan was effective in treating early-stage syphilis, it was largely ineffective by the time the infection progressed beyond the second stage. Tens of thousands of people each year continued to die of syphilis or were otherwise shipped off to psychiatric wards due to neurosyphilis.
It was in one of these psychiatric units in the early 20th century that Dr. Julius Wagner-Jauregg got the idea for a potential cure.
Wagner-Jauregg was an Austrian-born physician trained in “experimental pathology” at the University of Vienna. He started his medical career conducting lab experiments on animals and then moved on to work at different psychiatric clinics in Vienna, despite having no training in psychiatry or neurology.
Wagner-Jauregg’s work was controversial, to say the least. At the time, medicine – particularly psychiatric medicine – had nowhere near the rigorous ethical standards that doctors, researchers, and other scientists are bound to today. Wagner-Jauregg would devise wild theories about the cause of his patients’ psychiatric ailments and then perform experimental procedures in an attempt to cure them. (As just one example, he would sterilize his adolescent male patients, thinking “excessive masturbation” was the cause of their schizophrenia.)
But sometimes these wild theories paid off. In 1883, during his residency, Wagner-Jauregg noted that a female patient with mental illness who had contracted a skin infection and suffered a high fever experienced a sudden (and seemingly miraculous) remission of her psychosis symptoms after the fever had cleared. Wagner-Jauregg theorized that inducing a high fever in his patients with neurosyphilis could help them recover as well.
Eventually, Wagner-Jauregg was able to put his theory to the test. Around 1890, he got his hands on tuberculin, a therapeutic treatment created by the German microbiologist Robert Koch to cure tuberculosis. Tuberculin would later turn out to be completely ineffective against tuberculosis, often triggering severe immune responses in patients – but for a short time, Wagner-Jauregg had some success using it to help his dementia patients. Giving his patients tuberculin induced a high fever – and after completing the treatment, Wagner-Jauregg reported that his patients’ dementia was completely halted. The success was short-lived, however: Wagner-Jauregg eventually had to discontinue tuberculin as a treatment, as it came to be considered too toxic.
By 1917, Wagner-Jauregg’s theory about syphilis and fevers was gaining credibility – and one day a new opportunity presented itself when a wounded soldier, stricken with malaria and the high fever that accompanies it, was accidentally admitted to his psychiatric unit.
What Wagner-Jauregg did next was ethically deplorable by any standard: before he allowed the soldier any quinine (the standard treatment for malaria at the time), Wagner-Jauregg took a small sample of the soldier’s blood and inoculated three syphilis patients with it, rubbing the blood on their open syphilitic blisters.
It’s unclear how well the malaria treatment worked for those three specific patients – but Wagner-Jauregg’s records show that in the span of one year, he inoculated a total of nine patients with malaria for the sole purpose of inducing fevers, and six of them made a full recovery. The treatment was so successful, in fact, that one of his inoculated patients, an actor who had been unable to work due to his dementia, eventually found work again and returned to the stage. Two additional patients – a military officer and a clerk – recovered from their once-terminal illnesses and returned to their former careers as well.
When his findings were published in 1918, Wagner-Jauregg’s so-called “fever therapy” swept the globe. The treatment was hailed as a breakthrough – but it still had risks. Malaria itself had a mortality rate of about 15 percent at the time. Many people considered that a gamble worth taking, compared to dying a painful, protracted death from syphilis.
Malaria could also be effectively treated much of the time with quinine, whereas other fever-causing illnesses were not so easily treated. Triggering a fever by way of malaria specifically, therefore, became the standard of care.
Tens of thousands of people with syphilitic dementia would go on to be treated with fever therapy until the early 1940s, when a combination of Salvarsan and penicillin caused syphilis infections to decline. Eventually, neurosyphilis became rare, and then nearly unheard of.
Despite his contributions to medicine, it’s important to note that Wagner-Jauregg was most definitely not a person to idolize. He was an outspoken anti-Semite and proponent of eugenics, arguing that Jews were more prone to mental illness and that people who were mentally ill should be forcibly sterilized. (Wagner-Jauregg later became a Nazi sympathizer during Hitler’s rise to power even though, bizarrely, his first wife was Jewish.) Another problem was that his fever therapy involved experimental treatment of many patients who, due to their cognitive issues, could not give informed consent.
Lack of consent was also a fundamental problem with the syphilis study at Tuskegee, appalling research that began just 14 years after Wagner-Jauregg published his “fever therapy” findings.
Still, despite his outrageous views, Wagner-Jauregg was awarded the Nobel Prize in Physiology or Medicine in 1927 – and despite some egregious human rights abuses, the miraculous “fever therapy” was partly responsible for taming one of the deadliest plagues in human history.
Your Future Smartphone May Detect Problems in Your Water
In 2014, the city of Flint, Michigan switched residents' water supply to the Flint River, citing cheaper costs. However, due to improper treatment, lead contaminated the water, and according to the Associated Press, many of the city's residents soon reported health issues like hair loss and rashes. In 2015, a report found that children there had high levels of lead in their blood. The Natural Resources Defense Council recently estimated there could still be as many as twelve million lead pipes carrying water to homes across the U.S.
What if Flint residents and others in afflicted areas could simply flick water onto their phone screens and an app would tell them if they were about to drink contaminated water? This is what researchers at the University of Cambridge are working on to prevent catastrophes like what occurred in Flint, and to prepare for an uncertain future of scarcer resources.
Underneath the tough glass of our phone screens lies a transparent layer of electrodes. Because our bodies hold an electric charge, a finger touching the screen disrupts the electric field created among the electrodes – this is how the screen senses where a touch occurs. Cambridge scientists used the same idea to explore whether the screen could detect charges in water, too. Metals like arsenic and lead can appear in water as ions, which are charged particles. When an ionic solution is placed on the screen's surface, the electrodes sense that charge much as they sense a finger.
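As a back-of-the-envelope illustration of that sensing idea (this is not code from the Cambridge study – the linear response curve and the sensitivity constant are hypothetical placeholders, and only the WHO arsenic guideline value is a real figure), the detect-and-flag logic might be sketched like this:

```python
# Toy sketch of capacitive ion sensing: dissolved metal ions disrupt
# the electric field at a touch electrode, much as a fingertip does.
# The response curve and sensitivity constant are illustrative
# placeholders, NOT values from the Cambridge study; a real sensor
# would be calibrated against solutions of known concentration.

WHO_ARSENIC_GUIDELINE_UG_PER_L = 10.0  # WHO drinking-water guideline for arsenic


def capacitance_shift(ion_ug_per_l: float, sensitivity: float = 0.003) -> float:
    """Hypothetical linear response: more ions -> larger field disruption."""
    return sensitivity * ion_ug_per_l


def estimate_concentration(shift: float, sensitivity: float = 0.003) -> float:
    """Invert the toy response curve to recover a concentration."""
    return shift / sensitivity


def is_contaminated(ion_ug_per_l: float) -> bool:
    """Flag a sample whose estimated concentration exceeds the guideline."""
    return ion_ug_per_l > WHO_ARSENIC_GUIDELINE_UG_PER_L


# A droplet carrying 25 ug/L of arsenic-like ions:
reading = capacitance_shift(25.0)
print(is_contaminated(estimate_concentration(reading)))  # True
```

A real device would replace the linear stand-in with a measured calibration curve, but the pipeline – read a capacitance shift, convert it to a concentration, compare against a health guideline – would look much the same.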
The experiment measured charges in various electrolyte solutions on a touchscreen. The researchers found that a thin polymer layer between the electrodes and the sample solution helped pick up the charges.
"How can we get really close to the touch electrodes, and be better than a phone screen?" Horstmann, the lead scientist on the study, asked himself while designing the protective coating. "We found that when we put electrolytes directly on the electrodes, they were too close, even short-circuiting," he said. When they placed the polymer layer on top the electrodes, however, this short-circuiting did not occur. Horstmann speaks of the polymer layer as one of the key findings of the paper, as it allowed for optimum conductivity. The coating they designed was much thinner than what you'd see with a typical smartphone touchscreen, but because it's already so similar, he feels optimistic about the technology's practical applications in the real world.
While the Cambridge scientists were using touchscreens to measure water contamination, Dr. Baojun Wang, a synthetic biologist at the University of Edinburgh, along with his team, created a way to measure arsenic contamination in Bangladesh groundwater samples using what is called a cell-based biosensor. These biosensors use cornerstones of cellular activity like transcription and promoter sequences to detect the presence of metal ions in water. A promoter can be thought of as a "flag" that tells certain molecules where to begin copying genetic code. By hijacking this aspect of the cell's machinery and increasing the cell's sensing and signal processing ability, they were able to amplify the signal to detect tiny amounts of arsenic in the groundwater samples. All this was conducted in a 384-well plate, each well smaller than a pencil eraser.
They placed arsenic sensors with different sensitivities across part of the plate so it resembled a volume bar of increasing levels of arsenic, similar to diagnostics on a Fitbit or glucose monitor. The whole device is about the size of an iPhone, and can be scaled down to a much smaller size.
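The "volume bar" readout described above can be pictured with a toy sketch: each sensor in the bank fires once the arsenic concentration passes its own trigger level, so higher concentrations light up more of the bar. The threshold values here are made-up illustrative numbers, not the sensitivities of Dr. Wang's actual sensors:

```python
# Toy readout for a bank of biosensors with graded sensitivities,
# arranged like a volume bar. Each sensor trips once the arsenic
# concentration reaches its own threshold. Thresholds (in ug/L) are
# illustrative placeholders, not values from Dr. Wang's device.

THRESHOLDS_UG_PER_L = [1, 5, 10, 25, 50, 100]


def volume_bar(arsenic_ug_per_l: float) -> str:
    """Render the sensor bank as a bar: '#' = tripped, '.' = quiet."""
    return "".join(
        "#" if arsenic_ug_per_l >= t else "." for t in THRESHOLDS_UG_PER_L
    )


print(volume_bar(0.5))    # ......  below every threshold
print(volume_bar(12.0))   # ###...  three sensors trip
print(volume_bar(200.0))  # ######  all sensors trip
```

The appeal of this design is that a crude, at-a-glance display like this needs no precise quantitation: a user only has to count how far the bar fills to know roughly how bad the contamination is.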
Dr. Wang says cell-based biosensors are bringing sensing technology closer to field applications, because their machinery uses inherent cellular activity. This makes them ideal for low-resource communities, and he expects his device to be affordable, portable, and easily stored for widespread use in households.
"It hasn't worked on actual phones yet, but I don't see any reason why it can't be an app," says Horstmann of their technology. Imagine a new generation of smartphones with a designated area of the screen responsible for detecting contamination—this is one of the possible futures the researchers propose. But industry collaborations will be crucial to making their advancements practical. The scientists anticipate that without collaborative efforts from the business sector, the public might have to wait ten years until this becomes something all our smartphones are capable of—but with the right partners, "it could go really quickly," says Dr. Elizabeth Hall, one of the authors on the touchscreen water contamination study.
"That's where the science ends and the business begins," Dr. Hall says. "There is a lot of interest coming through as a result of this paper. I think the people who make the investments and decisions are seeing that there might be something useful here."
As for Flint, according to The Detroit News, the city has entered the final stages in removing lead pipe infrastructure. It's difficult to imagine how many residents might fare better today if they'd had the technology that scientists are now creating.
For all its tragedy, the COVID-19 pandemic has increased demand for at-home testing methods, and that demand has carried over to non-COVID-19-related devices. Various testing efforts are now in the public eye.
"I like that the public is watching these directions," says Horstmann. "I think there's a long way to go still, but it's exciting."
Fungus is the ‘New Black’ in Eco-Friendly Fashion
A natural material that looks and feels like real leather is taking the fashion world by storm. Scientists view mycelium—the vegetative part of a mushroom-producing fungus—as a planet-friendly alternative to animal hides and plastics.
Products crafted from this vegan leather are emerging, with others poised to hit the market soon. Among them are the Hermès Victoria bag, Lululemon's yoga accessories, Adidas' Stan Smith Mylo sneaker, and a Stella McCartney apparel collection.
The Adidas' Stan Smith Mylo concept sneaker, made in partnership with Bolt Threads, uses an alternative leather grown from mycelium; a commercial version is expected in the near future.
Hermès has held presales on the new bag, says Philip Ross, co-founder and chief technology officer of MycoWorks, a San Francisco Bay Area firm whose materials were used in the design. By year-end, Ross expects several more clients to debut mycelium-based merchandise. With "comparable qualities to luxury leather," mycelium can be molded to engineer "all the different verticals within fashion," he says, particularly footwear and accessories.
More than a half-dozen trailblazers are fine-tuning mycelium to create next-generation leather materials, according to the Material Innovation Initiative, a nonprofit advocating for animal-free materials in the fashion, automotive, and home-goods industries. These high-performance products can replace items derived from leather, silk, down, fur, wool, and exotic skins, says A. Sydney Gladman, the initiative's chief scientific officer.
That's only the beginning of mycelium's untapped prowess. "We expect to see an uptick in commercial leather alternative applications for mycelium-based materials as companies refine their R&D [research and development] and scale up," Gladman says, adding that "technological innovation and untapped natural materials have the potential to transform the materials industry and solve the enormous environmental challenges it faces."
Reducing our carbon footprint becomes possible because mycelium can flourish in indoor farms, using agricultural waste as feedstock and producing inherently low greenhouse gas emissions. Carbon dioxide is the primary greenhouse gas. "We often think that when plant tissues like wood rot, that they go from something to nothing," says Jonathan Schilling, professor of plant and microbial biology at the University of Minnesota and a member of MycoWorks' Scientific Advisory Board.
But that assumption doesn't hold true for all carbon in plant tissues. When the fungi dominating the decomposition of plants fulfill their function, they transform a large portion of carbon into fungal biomass, Schilling says. That, in turn, ends up in the soil, with mycelium forming a network underneath that traps the carbon.
Unlike styrofoam, leather, and plastic, which require large amounts of fossil fuels to produce, similar materials can be created from a fungal organism with far less fuel-intensive processing. While some fungi consist of a single cell, others are multicellular and develop as very fine threadlike structures. A mass of them collectively forms a "mycelium" that can be either loose and low-density or tightly packed and high-density. "When these fungi grow at extremely high density," Schilling explains, "they can take on the feel of a solid material such as styrofoam, leather or even plastic."
Tunable and supple in the cultivation process, mycelium is also reliably sturdy in composition. "We believe that mycelium has some unique attributes that differentiate it from plastic-based and animal-derived products," says Gavin McIntyre, who co-founded Ecovative Design, an upstate New York-based biomaterials company, in 2007 with the goal of displacing some environmentally burdensome materials and making "a meaningful impact on our planet."
After inventing a type of mushroom-based packaging for all sorts of goods, in 2013 the firm ventured into manufacturing mycelium that can be adapted for textiles, he says, because mushrooms are "nature's recycling system."
The company aims for its material—which is "so tough and tenacious" that it doesn't require any plastic add-on as reinforcement—to be generally accessible from a pricing standpoint and not confined to a luxury space. The cost, McIntyre says, would approach that of bovine leather, not the more upscale varieties of lamb and goat skins.
Already, production has taken off by leaps and bounds. In fewer than 10 days in indoor agricultural farms, "we grow large slabs of mycelium that are many feet wide and long," he says. "We are not confined to the shape or geometry of an animal," so there's a much lower scrap rate.
Decreasing the scrap rate is a major selling point. "Our customers can order the pieces to the way that they want them, and there is almost no waste in the processing," explains Ross of MycoWorks. "We can make ours thinner or thicker," depending on a client's specific needs. Growing materials locally also results in a reduction in transportation, shipping, and other supply chain costs, he says.
Yet another advantage to making things out of mycelium is its biodegradability at the end of an item's lifecycle. When a pair of old sneakers lands in a compost pile or landfill, it decomposes thanks to microbial processes that, once again, involve fungi. "It is cool to think that the same organism used to create a product can also be what recycles it, perhaps building something else useful in the same act," says biologist Schilling. That amounts to "more than a nice business model—it is a window into how sustainability works in nature."
A product can be called "sustainable" if it's biodegradable, leaves a minimal carbon footprint during production, and is also profitable, says Preeti Arya, an assistant professor at the Fashion Institute of Technology in New York City and faculty adviser to a student club of the American Association of Textile Chemists and Colorists.
On the opposite end of the spectrum, products composed of petroleum-based polymers don't biodegrade—they break down into smaller pieces or even particles. These remnants pollute landfills, oceans, and rivers, contaminating edible fish and eventually contributing to the growth of benign and cancerous tumors in humans, Arya says.
Commending the steps a few designers have taken toward bringing more environmentally conscious merchandise to consumers, she says, "I'm glad that they took the initiative because others also will try to be part of this competition toward sustainability." And consumers will take notice. "The more people become aware, the more these brands will start acting on it."
A further shift toward mycelium-based products has the capability to reap tremendous environmental dividends, says Drew Endy, associate chair of bioengineering at Stanford University and president of the BioBricks Foundation, which focuses on biotechnology in the public interest.
The continued development of "leather surrogates on a scaled and sustainable basis will provide the greatest benefit to the greatest number of people, in perpetuity," Endy says. "Transitioning the production of leather goods from a process that involves the industrial-scale slaughter of vertebrate mammals to a process that instead uses renewable fungal-based manufacturing will be more just."