How thousands of first- and second-graders saved the world from a deadly disease
Exactly 67 years ago, in 1955, a group of scientists and reporters gathered at the University of Michigan and waited with bated breath for Dr. Thomas Francis Jr., director of the school’s Poliomyelitis Vaccine Evaluation Center, to approach the podium. The group had gathered to hear the news that seemingly everyone in the country had been anticipating for the past two years – whether the vaccine for poliomyelitis, developed by Francis’s former student Jonas Salk, was effective in preventing the disease.
Polio, at that point, had become a household name. As the highly contagious virus swept through the United States, cities closed their schools, movie theaters, swimming pools, and even churches to stop the spread. For most people, polio caused a mild illness or no symptoms at all – but for an unlucky few, the virus took hold of the central nervous system and caused permanent paralysis of muscles in the legs, the arms, and even the diaphragm, leaving the person unable to walk or breathe. It wasn’t uncommon to hear reports of people – mostly children – who fell ill with flu-like symptoms and then, just days later, were consigned to spend the rest of their lives in an iron lung.
For two years, researchers had been testing a vaccine that would hopefully be able to stop the spread of the virus and prevent the 45,000 infections each year that were keeping the nation in a chokehold. At the podium, Francis greeted the crowd and then proceeded to change the course of human history: The vaccine, he reported, was “safe, effective, and potent.” Widespread vaccination could begin in just a few weeks. The nightmare was over.
The road to success
Jonas Salk, a medical researcher and virologist who developed the vaccine with his own research team, would rightfully go down in history as the man who defeated polio. (Today, wild poliovirus circulates in just two countries, Afghanistan and Pakistan, with only 140 cases reported in 2020.) But many people today forget that the widespread vaccination campaign that all but ended wild polio across the globe would never have been possible without the human clinical trials that preceded it.
As with the COVID-19 vaccine, skepticism and misinformation around the polio vaccine abounded. But even more pervasive than the skepticism was fear. The consequences of polio had arguably never been more visible.
The road to human clinical trials – and the resulting vaccine – was a long one. In 1938, President Franklin Delano Roosevelt launched the National Foundation for Infantile Paralysis to raise money for research and development of a polio vaccine. (Today, we know this organization as the March of Dimes.) A polio survivor himself, Roosevelt thrust awareness and prevention of the disease into the national spotlight as never before. Raising funds for a safe and effective polio vaccine became a cornerstone of his presidency – and the money raked in by his foundation helped underwrite Salk’s research.
The Trials Begin
Salk’s vaccine, which used an inactivated (killed) poliovirus, was promising – but now the researchers needed test subjects to make global vaccination a possibility. Because the aim of the vaccine was to prevent paralytic polio, researchers decided they had to test it in the population most vulnerable to paralysis – young children. And because the rate of paralysis was low even among children, the team needed a vast number of participants to collect enough data. Francis, who led the trial to evaluate Salk’s vaccine, began recruiting more than one million children between the ages of six and nine in the 272 counties with the highest incidence of the disease. The participants were nicknamed the “Polio Pioneers.”
Double-blind, placebo-controlled trials were considered the “gold standard” of epidemiological research in Francis’s day, and they remain the best approach we have today. These rigorous studies are designed around two participant groups. One group, called the test group, receives the experimental treatment (such as a vaccine); the other, called the control group, receives an inactive substitute known as a placebo. Researchers then compare the effects of the active treatment against those of the placebo – and, crucially, neither the participants nor the researchers know who received which treatment until the trial ends. That way, the results aren’t tainted by any possible biases.
But the study was controversial in that only some of the individual field trials at the county and state levels had a placebo group. Researchers described this as a “calculated risk,” meaning that while there were risks involved in giving the vaccine to a large number of children, the bigger risk was the potential paralysis or death that could come with being infected by polio. In all, just 200,000 children across the US received a placebo treatment, while an additional 725,000 children acted as observational controls – in other words, researchers monitored them for signs of infection, but did not give them any treatment.
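The arithmetic behind such a comparison is simple: researchers contrast the rate of disease in the vaccinated arm with the rate in the placebo arm. A minimal Python sketch, using made-up counts rather than the actual 1954 field-trial figures, shows how an efficacy estimate falls out:

```python
# Illustrative sketch of how a placebo-controlled trial is analyzed.
# All counts below are invented for demonstration; they are NOT the
# historical 1954 polio field-trial numbers.

def attack_rate(cases: int, participants: int) -> float:
    """Cases of disease per participant in one trial arm."""
    return cases / participants

def vaccine_efficacy(rate_vaccinated: float, rate_placebo: float) -> float:
    """Relative reduction in disease risk among the vaccinated."""
    return 1 - rate_vaccinated / rate_placebo

# Hypothetical arms of 200,000 children each.
placebo_rate = attack_rate(cases=115, participants=200_000)
vaccine_rate = attack_rate(cases=33, participants=200_000)

print(f"Efficacy: {vaccine_efficacy(vaccine_rate, placebo_rate):.0%}")
# → Efficacy: 71%
```

With these illustrative numbers the sketch reports roughly a 71 percent reduction in risk; the real 1955 evaluation involved far more statistical care than this toy calculation.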
President Roosevelt himself served as a perpetual reminder of the consequences of polio: an infection at age 39 had left him permanently unable to walk. With the disease’s toll so visible, parents signed up their children in droves to participate in the study and offer them protection.
The Polio Pioneer Legacy
In a little less than a year, roughly half a million children received a dose of Salk’s polio vaccine. Plenty of children were nervous about the shot, but many former participants remember the fear of the disease eclipsing any fear of the needle. One Polio Pioneer, Debbie LaCrosse, writes of her experience: “There was no discussion, no listing of pros and cons. No amount of concern over possible side effects or other unknowns associated with a new vaccine could compare to the terrifying threat of polio.” For their participation, each child received a certificate – and sometimes a pin – with the words “Polio Pioneer” emblazoned across the front.
When Francis announced the results of the trial on April 12, 1955, people did more than just breathe a sigh of relief – they openly celebrated, ringing church bells and flooding into the streets to embrace. Salk, who by then had become the face of the vaccine, was instantly hailed as a national hero – and teachers around the country had their students write him ‘thank you’ notes for his years of diligent work.
But while Salk went on to win national acclaim – even accepting the Presidential Medal of Freedom for his work on the polio vaccine in 1977 – his success was due in no small part to the children (and their parents) who took a risk in order to advance medical science. And that risk paid off: By the early 1960s, yearly polio cases in the United States had fallen to just 910. Where the virus had caused around 15,000 cases of paralysis each year before the vaccine, only about ten paralytic cases were recorded annually during the 1970s. And in 1979, the virus that once shuttered entire towns caused its last outbreak in the United States. Thanks to the efforts of these brave pioneers, the nation – along with the majority of the world – remains free of polio even today.
The Friday Five: A new blood test to detect Alzheimer's
The Friday Five covers five stories in research that you may have missed this week. There are plenty of controversies and troubling ethical issues in science – and we get into many of them in our online magazine – but this news roundup focuses on scientific creativity and progress to give you a therapeutic dose of inspiration headed into the weekend.
Listen on Apple | Listen on Spotify | Listen on Stitcher | Listen on Amazon | Listen on Google
Here are the promising studies covered in this week's Friday Five:
- A blood test to detect Alzheimer's
- War vets can take their psychologist wherever they go
- Does intermittent fasting affect circadian rhythms?
- A new year's resolution for living longer
- 3-D printed eyes?
Staying well in the 21st century is like playing a game of chess
This article originally appeared in One Health/One Planet, a single-issue magazine that explores how climate change and other environmental shifts are increasing vulnerabilities to infectious diseases by land and by sea. The magazine probes how scientists are making progress with leaders in other fields toward solutions that embrace diverse perspectives and the interconnectedness of all lifeforms and the planet.
On July 30, 1999, the Centers for Disease Control and Prevention published a report comparing data on the control of infectious disease at the beginning of the 20th century and at its end. The data showed that deaths from infectious diseases had declined markedly. In the early 1900s, pneumonia, tuberculosis and diarrheal diseases were the three leading killers, accounting for one-third of total deaths in the U.S. – with 40 percent of those deaths occurring in children under five.
Mass vaccinations, the discovery of antibiotics, and broad sanitation and hygiene measures eventually eradicated smallpox, beat back polio, tamed cholera, sharply curtailed tuberculosis and extended U.S. life expectancy by 25 years. By 1997, population health in the U.S. had shifted such that cancer, diabetes and heart disease were the leading causes of death.
The control of infectious diseases is considered to be one of the “10 Great Public Health Achievements.” Yet on the brink of the 21st century, new trouble was already brewing. Hospitals were seeing periodic cases of antibiotic-resistant infections. Novel viruses – those that previously didn’t afflict humans – began to emerge, causing outbreaks of West Nile, SARS, MERS and swine flu. In the years that followed, tuberculosis made a comeback, at least in certain parts of the world. What we didn’t take into account was the very concept of evolution: as we built better protections, our enemies eventually boosted their attacking prowess, and soon enough we found ourselves on the defensive once again.
At the same time, new, previously unknown or extremely rare disorders began to rise, such as autoimmune or genetic conditions. Two decades later, scientists began thinking about health differently—not as a static achievement guaranteed to last, but as something dynamic and constantly changing—and sometimes, for the worse.
What has emerged since then is a different paradigm – one that makes our interactions with the microbial world more like a biological chess match, says Victoria McGovern, a biochemist and program officer for the Burroughs Wellcome Fund’s Infectious Disease and Population Sciences Program. In this chess game, humans may make a clever strategic move, such as creating a new vaccine or a potent antibiotic, but that advantage is fleeting. At some point, the organisms we are up against respond with a move of their own – developing resistance to a medication, say, or acquiring mutations that help them attack our bodies. Simply eradicating the “opponent,” the pathogenic microbes, as efficiently as possible isn’t enough to keep humans healthy long-term.
Instead, scientists should focus on studying the complexity of interactions between humans and their pathogens. “We need to better understand the lifestyles of things that afflict us,” McGovern says. “The solutions are going to be in understanding various parts of their biology so we can influence how they behave around our systems.”
What is being proposed will require a pivot to basic biology and other disciplines that have suffered from a lack of research funding in recent years. Yet, according to McGovern, the teams behind funded proposals are tackling bigger questions. “We look for people exploring questions about hosts and pathogens, and what happens when they touch, but we’re also looking for people with big ideas,” she says. For example, if one specific infection causes a chain of pathological events in the body, can other infections cause them too? And if we find a way to break that chain for one pathogen, can we play the same trick on another? “We really want to see people thinking of not just one experiment but about big implications of their work,” McGovern says.
Jonah Cool, a cell biologist, geneticist and science officer at the Chan Zuckerberg Initiative, says that it’s necessary to define what constitutes a healthy organism and how it overcomes infections or environmental assaults, such as pollution from forest fires or toxins from industrial smokestacks. An organism that catches a disease isn’t necessarily an unhealthy one, as long as it fights it off successfully—an ability that arises from the complex interplay of its genes, the immune system, age, stress levels and other factors. Modern science allows many of these factors to be measured, recorded and compared. “We need a data-driven, deep-phenotyping approach to defining healthy biological systems and their responses to insults—which can be infectious disease or environmental exposures—and their ability to navigate their way through that space,” Cool says.
Genetics and cell biology, combined with imaging techniques that allow one to see tissues and individual cells in action, will enable scientists to define and quantify what it means to be healthy at the molecular level. “As a geneticist and cell biologist, I believe in all these molecular underpinnings and how they arise in phenotypic differences in cells, genes, proteins—and how their combinations form complex cellular states,” Cool says.
Julie Graves, a physician, public health consultant and former adjunct professor of management, policy and community health at the University of Texas Health Science Center in Houston, stresses the necessity of nutritious diets. According to the Rockefeller Food Initiative, “poor diet is the leading risk factor for disease, disability and premature death in the majority of countries around the world.” Adequate nutrition is critical for maintaining human health and life. Yet Western diets are often low in essential nutrients, high in calories and heavy on processed foods. Overconsumption of these foods has contributed to high rates of obesity and chronic disease in the U.S. In fact, according to the 2018 National Health Interview Survey, more than half of American adults have at least one chronic disease, and 27 percent have more than one – conditions that also increase vulnerability to severe COVID-19.
Further, the contamination of our food supply with various agricultural and industrial toxins—petrochemicals, pesticides, PFAS and others—has implications for morbidity, mortality, and overall quality of life. “These chemicals are insidiously in everything, including our bodies,” Graves says—and they are interfering with our normal biological functions. “We need to stop how we manufacture food,” she adds, and rid our sustenance of these contaminants.
According to the Humane Society of the United States, factory farms account for nearly 40 percent of methane emissions. Concentrated animal feeding operations, or CAFOs, may serve as breeding grounds for pandemics, scientists warn, so humans should research better ways to raise and treat livestock. Diego Rose, a professor of food and nutrition policy at Tulane University School of Public Health & Tropical Medicine, and his colleagues found that “20 percent of Americans’ diets account for about 45 percent of the environmental impacts [that come from food].” A subsequent study explored the impacts of specific foods and found that replacing beef with chicken lowers an individual’s carbon footprint by nearly 50 percent and cuts water usage by 30 percent. Eating too much red meat has also been associated with a variety of illnesses.
In some communities, the option to swap food types is limited or impossible. For example, “many populations live in relative food deserts where there’s not a local grocery store that has any fresh produce,” says Louis Muglia, the president and CEO of Burroughs Wellcome. Individuals in these communities suffer from an insufficient intake of beneficial nutrients, and they’re “probably being exposed to phenols and other toxins that are in the packaging.” An equitable, sustainable and nutritious food supply will be vital to humanity’s wellbeing in the era of climate change, unpredictable weather and spillover events.
A recent report by See Change Institute and the Climate Mental Health Network showed that people experiencing socioeconomic inequalities, including many people of color, contribute the least to climate change yet are impacted the most. For example, people in low-income communities are disproportionately exposed to vehicle emissions, Muglia says. Through its Climate Change and Human Health Seed Grants program, Burroughs Wellcome funds research that aims to understand how various factors related to climate change and environmental chemicals contribute to premature births, which are associated with health vulnerabilities over the course of a person’s life, and to map hot spots of such risk.
“It’s very complex, the combinations of socio-economic environment, race, ethnicity and environmental exposure, whether that’s heat or toxic chemicals,” Muglia explains. “Disentangling those things really requires a very sophisticated, multidisciplinary team. That’s what we’ve put together to describe where these hotspots are and see how they correlate with different toxin exposure levels.”
In addition to mapping the risks, researchers are developing novel therapeutics that will be a crucial part of our arsenal, but we will have to be smarter about designing and using them. We will need more potent, better-performing monoclonal antibodies. Instead of directly attacking a pathogen, we may have to learn to stimulate the immune system – training it to fight the disease-causing microbes on its own. And rather than indiscriminately killing all bacteria with broad-spectrum drugs, we will need more targeted medications. “Instead of wiping out the entire gut flora, we will need to come up with ways that kill harmful bacteria but not healthy ones,” Graves says. Training our immune systems to recognize and react to pathogens by way of vaccination will keep us ahead of our biological opponents, too. “Continued development of vaccines against infectious diseases is critical,” says Graves.
With all of the unpredictable events that lie ahead, it is difficult to foresee what achievements in public health will be reported at the end of the 21st century. Yet technological advances, better modeling and the pursuit of bigger questions in science, along with education and close work with communities, will help overcome the challenges. The Chan Zuckerberg Initiative displays an optimistic message on its website: “Is it possible to cure, prevent, or manage all diseases by the end of this century? We think so.” Cool shares the view of his employer – and believes that science can get us there. Just give it some time and a chance. “It’s a big, bold statement,” he says, “but the end of the century is a long way away.”

Lina Zeldovich has written about science, medicine and technology for Popular Science, Smithsonian, National Geographic, Scientific American, Reader’s Digest, the New York Times and other major national and international publications. A Columbia J-School alumna, she has won several awards for her stories, including the ASJA Crisis Coverage Award for Covid reporting, and has been a contributing editor at Nautilus Magazine. In 2021, Zeldovich released her first book, The Other Dark Matter, published by the University of Chicago Press, about the science and business of turning waste into wealth and health. You can find her at http://linazeldovich.com/ and @linazeldovich.