How a Nobel Prize Winner Fought Her Family, Nazis, and Bombs to Change Our Understanding of Cells Forever
When Rita Levi-Montalcini decided to become a scientist, she was determined that nothing would stand in her way. And from the beginning, that determination was put to the test. Before Levi-Montalcini became a Nobel Prize-winning neurobiologist, the first to discover and isolate a crucial chemical called Nerve Growth Factor (NGF), she would have to battle both the sexism within her own family and the racism and fascism that were slowly engulfing her country.
Levi-Montalcini was born to two loving parents in Turin, Italy, at the turn of the 20th century. She and her twin sister, Paola, were the youngest of the family's four children, and Levi-Montalcini described her childhood as "filled with love and reciprocal devotion." But while her parents were loving, supportive, and "highly cultured," her father refused to let his three daughters engage in any schooling beyond the basics. "He loved us and had a great respect for women," she later explained, "but he believed that a professional career would interfere with the duties of a wife and mother."
At age 20, Levi-Montalcini had finally had enough. "I realized that I could not possibly adjust to a feminine role as conceived by my father," she is quoted as saying, and she asked his permission to finish high school and pursue a career in medicine. When her father reluctantly agreed, Levi-Montalcini was ecstatic: In just under a year, she managed to catch up on her mathematics, graduate from high school, and enroll in medical school in Turin.
By 1936, Levi-Montalcini had graduated from medical school at the top of her class and decided to stay on at the University of Turin as a research assistant to the histologist and human anatomy professor Giuseppe Levi. Levi-Montalcini began studying nerve cells and nerve fibers – the tiny, slender tendrils that are threaded throughout our nerves and that determine what information each nerve can transmit. But it wasn't long before another enormous obstacle to her scientific career reared its head.
Science Under a Fascist Regime
Two years into her research assistantship, Levi-Montalcini was fired, along with every other "non-Aryan Italian" who held an academic or professional position, under a series of antisemitic laws passed by Italy's then-leader Benito Mussolini. Forced out of her academic position, Levi-Montalcini went to Belgium for a fellowship at a neurological institute in Brussels – but she was forced back to Turin when the German army invaded Belgium.
Levi-Montalcini decided to keep researching. She and Giuseppe Levi built a makeshift lab in Levi-Montalcini's apartment, obtaining fertilized chicken eggs from local farmers and using sewing needles to dissect them. By dissecting the chicken embryos in her bedroom laboratory, she was able to see how nerve fibers formed and died. The two continued this research until they were interrupted again – this time, by British air raids. Levi-Montalcini fled to a country cottage to continue her research, and two years later she was forced into hiding when the German army invaded Italy. She and her family assumed different identities and lived with non-Jewish friends in Florence to survive the Holocaust. Despite all of this, Levi-Montalcini continued her work, dissecting chicken embryos in her hiding place until the end of the war.
"The discovery of NGF really changed the world in which we live, because now we knew that cells talk to other cells, and that they use soluble factors. It was hugely important."
A Post-War Discovery
Several years after the war, when Levi-Montalcini was once again working at the University of Turin, a German embryologist named Viktor Hamburger invited her to Washington University in St. Louis. Hamburger was impressed by Levi-Montalcini's research with her chicken embryos, and secured an opportunity for her to continue her work in America. The invitation would "change the course of my life," Levi-Montalcini would later recall.
During her fellowship, Levi-Montalcini grew tumors in mice and then transplanted samples of them into chick embryos to see how they would affect the embryos' development. To her surprise, she noticed that introducing the tumor samples caused nerve fibers to grow rapidly. From this, Levi-Montalcini discovered and was able to isolate the protein responsible for this rapid growth. She later named it Nerve Growth Factor, or NGF.
From there, Levi-Montalcini and her team launched new experiments to test NGF, injecting it and blocking it to see the effect it had on a test subject's body. When the team injected crude NGF extract into newborn mice, they observed nerve growth, and the mouse pups also developed faster than the untreated group – their eyes opening earlier and their teeth coming in sooner. When the team injected purified NGF, however, the accelerated development disappeared, leading the team to believe that something else in the crude extract was influencing the growth of the newborn mice. Stanley Cohen, Levi-Montalcini's colleague, identified the culprit: another growth factor, called EGF – epidermal growth factor – that caused the mouse pups' eyes and teeth to grow so quickly.
Levi-Montalcini continued to experiment with NGF for the next several decades at Washington University, illuminating how NGF works in the body. When she injected newborn mice with an antiserum against NGF, for example, her team found that it "almost completely deprived the animals of a sympathetic nervous system." Other experiments by Levi-Montalcini and her colleagues helped show the role that NGF plays in other important biological processes, such as the regulation of the immune system and ovulation.
"The discovery of NGF really changed the world in which we live, because now we knew that cells talk to other cells, and that they use soluble factors. It was hugely important," said Bill Mobley, Chair of the Department of Neurosciences at the University of California, San Diego School of Medicine.
Her Lasting Legacy
After years of setbacks, Levi-Montalcini's groundbreaking work was recognized in 1986, when she was awarded the Nobel Prize in Physiology or Medicine for her discovery of NGF (Cohen, the colleague who discovered EGF, shared the prize). Researchers continue to study NGF to this day, and that ongoing work has deepened our understanding of diseases like HIV and Alzheimer's.
Levi-Montalcini never stopped researching either: In January 2012, at the age of 102, she published her last research paper in the journal PNAS, making her the oldest member of the National Academy of Sciences to do so. Before she died in December 2012, she encouraged other scientists facing setbacks in their careers to keep pursuing their passions. "Don't fear the difficult moments," she is quoted as saying. "The best comes from them."
Pseudoscience Is Rampant: How Not to Fall for It
Whom to believe?
The relentless and often unpredictable coronavirus (SARS-CoV-2) has, among its many quirky terrors, dredged up once again the issue that will not die: science versus pseudoscience.
How does one learn to spot the con without getting a Ph.D. and spending years in a laboratory?
The scientists, experts who would be the first to admit they are not infallible, are now in danger of being drowned out by a growing chorus of pseudoscientists, conspiracy theorists, and just plain troublemakers who seem to be as symptomatic of the virus as fever and weakness.
How is the average citizen to filter this cacophony of information and misinformation posing as science alongside real science? While all that noise makes it difficult to separate the real stuff from the fakes, there is at least one positive aspect to it all.
A famous aphorism by one Charles Caleb Colton, a popular 19th-century English cleric and writer, says that "imitation is the sincerest form of flattery."
The frauds and the paranoid conspiracy mongers who would perpetrate false science on a susceptible public at least recognize the value of science: they imitate it. They imitate the ways in which science works and make claims as if they were scientists, because even they recognize the power of a scientific approach. They are inadvertently showing us how much we value science. Unfortunately, they are just shabby counterfeits.
Separating real science from pseudoscience is not a new problem. Philosophers, politicians, scientists, and others have been worrying about this perhaps since science as we know it – a science based entirely on experiment and not opinion – arrived in the 1600s. The original charter of the Royal Society of London, the first organized scientific society, stated that at its formal meetings there would be no discussion of politics, religion, or perpetual motion machines.
The first two were banned for the obvious purpose of keeping the peace. But the third is interesting, because at that time perpetual motion machines were one of the main offerings of the imitators – the bogus scientists who were sure that you could find ways around the universal laws of energy and make a buck on it. The motto adopted by the society was, and remains, Nullius in verba, Latin for "take nobody's word for it." Kind of an early version of Missouri's venerable nickname, the "Show-Me State."
You might think that telling phony science from the real thing wouldn't be so difficult, but events, historical and current, tell a very different story – often with tragic outcomes. Just one terrible example is the estimated 350,000 additional HIV deaths in South Africa directly attributable to the now-infamous conspiracy theories of the country's own elected president, no less (sound familiar?). It's surprisingly easy to dress up phony science as the real thing by simply adopting, or appearing to adopt, the trappings of science.
Thus the anti-vaccine movement – which stems from fraudulent data published by the now-disgraced English gastroenterologist Andrew Wakefield – claims to be based on suspicion of authority, in this case medical authority. And it's true that much of science is based on suspicion of authority. Science got its start when the likes of Galileo and Copernicus claimed that the Church, the State, even Aristotle, could no longer be trusted as authoritative sources of knowledge.
But Galileo and those who followed him produced alternative explanations, and those alternatives were based on data that arose independently from many sources and generated a great deal of debate and, most importantly, could be tested by experiments that could prove them wrong. The anti-vaccine movement imitates science, still citing the discredited Wakefield report, but really offers nothing but suspicion—and that is paranoia, not science.
Similarly, there are those who try to cloak their nefarious motives in the trappings of science by claiming that they are taking the scientific posture of doubt. Science, after all, depends on doubt – every scientist doubts every finding they make. Every scientist knows that they can't possibly foresee all the instances or situations in which they could be proven wrong, no matter how strong their data. Einstein was doubted for two decades, and cosmologists are still devising experimental tests of relativity. Science indeed progresses by doubt. In science, revision is a victory.
But the imitators merely use doubt to suggest that science is not dependable and should not be used to inform policy or alter our behavior. They claim to be taking the legitimate scientific stance of doubt. Of course, they don't doubt everything, only what is problematic for their individual enterprises. They don't doubt the value of blood pressure medicine to control their hypertension. But they should, because every medicine has side effects, and we don't completely understand how blood pressure is regulated or whether there may be still better ways of controlling it.
But we use the pills we have because the science is sound, even when it is not completely settled. Ask the hypertensive oil executive who would have you believe that climate science should be ignored because there are too many uncertainties in the data whether he is willing to forgo his blood pressure medicine – because it, too, has its share of uncertainties and unwanted side effects.
The apparent success of pseudoscience is not due to gullibility on the part of the public. The problem is that science is recognized as valuable and that the imitators are unfortunately good at what they do. They take a scientific pose to gain your confidence and then distort the facts to their own purposes. How does one learn to spot the con without getting a Ph.D. and spending years in a laboratory?
"If someone claims to have the ultimate answer or that they know something for certain, the only thing for sure is that they are trying to fool you."
What can be done to make the distinction clearer? Several solutions have been tried – and seem to have failed. Radio and television shows about the latest scientific breakthroughs are a noble attempt to give the public a taste of good science, but they do nothing to help you distinguish that good science from the pseudoscience being purveyed on the neighboring channel, with its "scientific investigations" of haunted houses.
Similarly, attempts to inculcate what are called "scientific habits of mind" are of little practical help. These habits of mind are not so easy to adopt. They invariably require some amount of statistics and probability, and much of that is counterintuitive – indeed, one of the great values of science is that it helps us counter our normal biases and expectations by showing that the actual measurements may not bear them out.
Additionally, there is math – no matter how much you try to hide it, much of the language of science is math (Galileo said that). And half the audience is gone with each equation (Stephen Hawking said that). It's hard to imagine a successful program for persuading a non-scientifically trained public to adopt the rigors of scientific habits of mind. Indeed, I suspect there are some people – artists, for example – who would be rightly wary of making their thinking habitually scientific. Many scientists are frustrated by the public's inability to think like a scientist, but in fact it is neither easy nor always desirable to do so. And it is certainly not practical.
There is a simpler, more direct way to tell the difference between the real thing and the cheap knock-off. In fact, it is not so much intuitive as counterintuitive, so it takes a little bit of mental work. But the good news is that it works almost all the time, by following a simple – if, as I say, counterintuitive – rule.
True science, you see, is mostly concerned with the unknown and the uncertain. If someone claims to have the ultimate answer or that they know something for certain, the only thing for sure is that they are trying to fool you. Mystery and uncertainty may not strike you right off as desirable or strong traits, but that is precisely where one finds the creative solutions that science has historically arrived at. Yes, science accumulates factual knowledge, but it is at its best when it generates new and better questions. Uncertainty is not a place of worry, but of opportunity. Progress lives at the border of the unknown.
How much would it take to alter the public perception of science so that unknowns and uncertainties are appreciated alongside facts and conclusions? Less than you might think. In fact, we make decisions based on uncertainty every day – what to wear given a 60 percent chance of rain, say – so it should not be so difficult to extend that habit to science, in spite of what you were taught in school about all the hard facts in those giant textbooks.
You can believe science that says: there is clear evidence that takes us this far… and then we have to speculate a bit, and it could go one of two or three ways – or maybe even some way we don't see yet. But like your blood pressure medicine, the stuff we know is reliable, even if incomplete. It will lower your blood pressure, no matter that better treatments with fewer side effects may await us in the future.
Unsettled science is not unsound science. The honesty and humility of someone who is willing to tell you that they don't have all the answers, but do have some thoughtful questions to pursue, are easy to distinguish from the ready answers of charlatans – or from their claims that nothing should be done until we are an impossible 100 percent sure.
Imitation may be the sincerest form of flattery.
The problem, as we all know, is that flattery will get you nowhere.
[Editor's Note: This article was originally published on June 8th, 2020 as part of a standalone magazine called GOOD10: The Pandemic Issue. Produced as a partnership among LeapsMag, The Aspen Institute, and GOOD, the magazine is available for free online.]
Henrietta Lacks' Cells Enabled Medical Breakthroughs. Is It Time to Finally Retire Them?
For Victoria Tokarz, a third-year PhD student at the University of Toronto, experimenting with cells is just part of a day's work. Tokarz, 26, is studying to be a cell biologist and spends her time inside the lab manipulating muscle cells sourced from rodents to try to figure out how they respond to insulin. She hopes this research could someday lead to a breakthrough in our understanding of diabetes.
"People like to use HeLa cells because they're easy to use."
But in all her research, there is one cell culture that Tokarz refuses to touch. The culture is called HeLa, short for Henrietta Lacks, named after the 31-year-old tobacco farmer the cells were stolen from during a tumor biopsy she underwent in 1951.
"In my opinion, there's no question or experiment I can think of that validates stealing from and profiting off of a black woman's body," Tokarz says. "We're not talking about a reagent we created in a lab, a mixture of some chemicals. We're talking about a human being who suffered indescribably so we could profit off of her misfortune."
Lacks' suffering is something that, until recently, was not widely known. Born to a poor family in Roanoke, VA, Lacks was sent to live with her grandfather on the family tobacco farm at age four, shortly after the death of her mother. She gave birth to her first child at just fourteen, and two years later had another child with profound developmental disabilities. Lacks married her first cousin, David, in 1941 and the family moved to Maryland where they had three additional children.
But the real misfortune came in 1951, when Lacks told her cousins that she felt a hard "knot" in her womb. When Lacks went to Johns Hopkins hospital to have the knot examined, doctors discovered that the hard lump Henrietta felt was a rapidly-growing cervical tumor.
Before the doctors treated the tumor – inserting radium tubes into her vagina in the hope that they could kill the cancer – Lacks' doctors clipped two tissue samples from her cervix, without Lacks' knowledge or consent. While it's widely considered unethical today, taking tissue samples from patients without consent was commonplace at the time. The samples were sent to a cancer researcher at Johns Hopkins, and Lacks continued treatment, unsuccessfully, until she died a few months later of metastatic cancer.
Lacks' story was not over, however: When her tissue sample arrived at the lab of George Otto Gey, the Johns Hopkins cancer researcher, he noticed that the cancerous cells grew at a shocking pace. Unlike other cell cultures that would die within a day or two of arriving at the lab, Lacks' cells kept multiplying. They doubled every 24 hours, and to this day, have never stopped.
Scientists would later find out that this growth was due to an infection with human papillomavirus, or HPV, which is known for causing aggressive cancers. Lacks' cells became the world's first-ever "immortalized" human cell line, meaning that as long as certain environmental conditions are met, the cells can replicate indefinitely. Although scientists have cultivated other immortalized cell lines since then, HeLa cells remain a favorite among scientists due to their resilience, Tokarz says.
"People like to use HeLa cells because they're easy to use," Tokarz says. "They're easy to manipulate, because they're very hardy, and they allow for transection, which means expressing a protein in a cell that's not normally there. Other cells, like endothelial cells, don't handle those manipulations well."
Once the doctors at Johns Hopkins discovered that Lacks' cells could replicate indefinitely, they started shipping them to labs around the world to promote medical research. As the only immortalized human cell line available at the time, HeLa cells were used in thousands of experiments – some of which resulted in life-saving treatments. Jonas Salk's polio vaccine, for example, was developed and tested using HeLa cells. HeLa cell research was also used to develop a vaccine for HPV, and contributed to the development of in vitro fertilization and gene mapping. Between 1951 and 2018, HeLa cells were cited in over 110,000 publications, according to a review from the National Institutes of Health.
But while some scientists like Tokarz are thankful for the advances brought about by HeLa cells, they still believe it's well past time to stop using them in research.
"Am I thankful we have a polio vaccine? Absolutely. Do I resent the way we came to have that vaccine? Absolutely," Tokarz says. "We could have still arrived at those same advances by treating her as the human being she is, not just a specimen."
Ethical considerations aside, HeLa is no longer the world's only available cell line – nor, Tokarz argues, are her cells the most suitable for every type of research. "The closer you can get to the physiology of the thing you're studying, the better," she says. "Now we have the ability to use primary cells, which are isolated from a person and put right into the culture dish, and those don't have the same mutations as cells that have been growing for 20 years. We didn't have the expertise to do that initially, but now we do."
Raphael Valdivia, a professor of molecular genetics and microbiology at Duke University School of Medicine, agrees that HeLa cells are no longer optimal for most research. "A lot of scientists are moving away from HeLa cells because they're so unstable," he says. "They mutate, they rearrange chromosomes to become adaptive, and different batches of cells evolve separately from each other. The HeLa cells in my lab are very different than the ones down the hall, and that means sometimes we can't replicate our results. We have to go back to an earlier batch of cells in the freezer and re-test."
Still, the idea of retiring the cells completely doesn't make sense, Valdivia says: "To some extent, you're beholden to previous research. You need to be able to confirm findings that happen in earlier studies, and to do that you need to use the same cell line that other researchers have used."
"Ethics is not black and white, and sometimes there's no such thing as a straightforward ethical or unethical choice."
"The way in which the cells were taken – without patient consent – is completely inappropriate," says Yann Joly, associate professor at the Faculty of Medicine in Toronto and Research Director at the Centre of Genomics and Policy. "The question now becomes, what can we do about it now? What are our options?"
While scientists are not able to erase what was done to Henrietta Lacks, Joly argues that retiring her cells would also be non-consensual, since it assumes – maybe incorrectly – what Henrietta would have wanted, without her input. Additionally, Joly points out that other immortalized human cell lines are fraught with what some people consider ethical concerns as well, such as the human embryonic kidney cell line, commonly referred to as HEK-293, which was derived from an aborted female fetus. "Just because you're using another kind of cell doesn't mean it's devoid of ethical issues," he says.
Seemingly, the one thing scientists can agree on is that Henrietta Lacks was mistreated by the medical community. But even so, retiring her cells from medical research is not an obvious solution. Scientists are now using HeLa cells to better understand how the novel coronavirus affects humans, and this knowledge will inform how researchers develop a COVID-19 vaccine.
"Ethics is not black and white, and sometimes there's no such thing as a straightforward ethical or unethical choice," Joly says. "If [ethics] were that easy, nobody would need to teach it."