Beyond Henrietta Lacks: How the Law Has Denied Every American Ownership Rights to Their Own Cells
The common perception is that Henrietta Lacks was a victim of poverty and racism when in 1951 doctors took samples of her cervical cancer without her knowledge or permission and turned them into the world's first immortalized cell line, which they called HeLa. The cell line became a workhorse of biomedical research and facilitated the creation of medical treatments and cures worth untold billions of dollars. Neither Lacks nor her family ever received a penny of those riches.
But racism and poverty are not solely to blame for Lacks' exploitation; the reality is even worse. In fact, all patients, then and now, regardless of social or economic status, have absolutely no right to cells that are taken from their bodies. Some have called this biological slavery.
How We Got Here
The case that established this legal precedent is Moore v. Regents of the University of California.
John Moore was diagnosed with hairy-cell leukemia in 1976, and his spleen was removed as part of standard treatment at the UCLA Medical Center. On initial examination his physician, David W. Golde, had discovered some unusual qualities in Moore's cells and made plans before the surgery to have the tissue saved for research rather than discarded as waste. That research began almost immediately.
"On both sides of the case, legal experts and cultural observers cautioned that ownership of a human body was the first step on the slippery slope to 'bioslavery.'"
Even after Moore moved to Seattle, Golde kept bringing him back to Los Angeles to collect additional samples of blood and tissue, saying it was part of his treatment. When Moore asked whether the work could be done in Seattle, he was told no. Golde's charade went so far that he claimed to have found a low-income subsidy to cover Moore's flights and his stays at a ritzy hotel, when in fact Golde was paying those expenses out of his own pocket to keep Moore returning to Los Angeles.
Moore became suspicious when he was asked to sign new consent forms giving up all rights to his biological samples, and he hired an attorney to look into the matter. It turned out that Golde had been lying to his patient all along: he had been collecting samples unnecessary to Moore's treatment and had turned them into a cell line that he and UCLA patented, one that had already earned them millions of dollars. The potential market for products derived from the cell line was estimated at $3 billion by 1990.
Moore felt he had been taken advantage of and filed suit to claim a share of the money that had been made off of his body. "On both sides of the case, legal experts and cultural observers cautioned that ownership of a human body was the first step on the slippery slope to 'bioslavery,'" wrote Priscilla Wald, a professor at Duke University whose career has focused on issues of medicine and culture. "Moore could be viewed as asking to commodify his own body part or be seen as the victim of the theft of his most private and inalienable information."
The case bounced around different levels of the court system with conflicting verdicts for nearly six years until the California Supreme Court ruled on July 9, 1990, that Moore had no legal rights to cells and tissue once they were removed from his body.
The court made a utilitarian argument: the cells had no value until scientists manipulated them in the lab, and it would be too burdensome for researchers to track individual donations and the cell lines derived from them to ensure that they had been ethically gathered and used. Doing so, the court reasoned, would impinge on the free sharing of materials between scientists, slow research, and harm the public good that arose from such research.
"In effect, what Moore is asking us to do is impose a tort duty on scientists to investigate the consensual pedigree of each human cell sample used in research," the majority wrote. In other words, researchers don't need to ask any questions about the materials they are using.
One member of the court did not see it that way. In his dissent, Stanley Mosk raised the specter of slavery that "arises wherever scientists or industrialists claim, as defendants have here, the right to appropriate and exploit a patient's tissue for their sole economic benefit—the right, in other words, to freely mine or harvest valuable physical properties of the patient's body. … This is particularly true when, as here, the parties are not in equal bargaining positions."
Mosk also cited the appeals court decision that the majority overturned: "If this science has become for profit, then we fail to see any justification for excluding the patient from participation in those profits."
But the majority bought the arguments that Golde, UCLA, and California's nascent biotechnology industry had made in amicus briefs filed throughout the legal proceedings. The road was now clear for them to develop products worth billions without having to share with, or even answer to, the people who provided the raw materials on which their research was based.
Critical Views
Biomedical research requires a continuous and ever-growing supply of human materials for the foundation of its ongoing work. If an increasing number of patients come to feel as John Moore did, that the system is ripping them off, then they become much less likely to consent to use of their materials in future research.
Some legal and ethical scholars say that donors should be able to limit the types of research allowed for their tissues, and that researchers should be monitored to ensure compliance with those agreements. Today, after all, it is commonplace for companies to certify that their clothing is not made with child labor, that their coffee is grown under fair-trade conditions, and that food labeled kosher is properly handled. Should we ask any less of our pharmaceuticals: that the donors whose cells made such products possible be treated honestly and fairly, and that they share in the financial bounty that comes from such drugs?
Protection of individual rights is a hallmark of the American legal system, says Lisa Ikemoto, a law professor at the University of California Davis. "Putting the needs of a generalized public over the interests of a few often rests on devaluation of the humanity of the few," she writes in a reimagined version of the Moore decision that upholds Moore's property claims to his excised cells. The commentary is in a chapter of a forthcoming book in the Feminist Judgment series, where authors may only use legal precedent in effect at the time of the original decision.
"Why is the law willing to confer property rights upon some while denying the same rights to others?" asks Radhika Rao, a professor at the University of California, Hastings College of the Law. "The researchers who invest intellectual capital and the companies and universities that invest financial capital are permitted to reap profits from human research, so why not those who provide the human capital in the form of their own bodies?" It might be seen as a kind of sweat equity where cash strapped patients make a valuable in kind contribution to the enterprise.
The Moore court also made a big deal about the risk of inhibiting the free exchange of samples between scientists. That concern has become far less relevant in the more than three decades since the decision was handed down. Ironically, the decision, along with other laws and regulations, has since strengthened the power of patents in biomedicine, and in doing so has increased secrecy and limited sharing.
"Although the research community theoretically endorses the sharing of research, in reality sharing is commonly compromised by the aggressive pursuit and defense of patents and by the use of licensing fees that hinder collaboration and development," Robert D. Truog, Harvard Medical School ethicist and colleagues wrote in 2012 in the journal Science. "We believe that measures are required to ensure that patients not bear all of the altruistic burden of promoting medical research."
Additionally, the increased complexity of research and the need for exacting standardization of materials has given rise to an industry that supplies certified chemical reagents, cell lines, and whole animals bred to have specific genetic traits to meet research needs. This has been more efficient for research and has helped to ensure that results from one lab can be reproduced in another.
The court's rationale of fostering collaboration and free exchange of materials between researchers has also been undercut by the changing structure of that research. Big pharma has shrunk its own research labs and over the last decade has worked out cooperative agreements with major research universities in which the companies contribute to the research budget and in return get first dibs on any findings (and sometimes a share of patent rights) that come out of those university labs. These arrangements have had a chilling effect on the exchange of materials between universities.
Perhaps tracking cell line donors and the use restrictions on their donations would have been burdensome to researchers when Moore was being litigated. Some labs probably still kept their cell line records on 3x5 index cards; computers were expensive and limited in capacity, the internet barely existed, and there was no cloud storage.
But that was the dawn of a new technological age, and standards have changed. Now cell lines are kept in state-of-the-art sub-zero storage units, tagged with the source, type of tissue, date gathered, and often other information. Adding a few more data fields, and contacting the donor if and when appropriate, does not seem likely to disrupt the research process as the court asserted it would.
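To make concrete how modest such additions might be, here is a minimal sketch in Python, using hypothetical field names rather than any biobank's actual schema, of a cell line record that carries a couple of consent fields alongside the metadata labs already track.

```python
# A minimal sketch, not any repository's actual schema: hypothetical fields
# showing how a cell-line record could carry donor-consent metadata
# alongside the information labs already record.
from dataclasses import dataclass
from datetime import date
from typing import Optional, Tuple


@dataclass
class CellLineRecord:
    line_id: str                  # internal identifier for the cell line
    source: str                   # donor or repository the sample came from
    tissue_type: str              # e.g. "spleen", "cervical tumor"
    date_collected: date
    # Hypothetical additions of the kind the Moore court deemed too burdensome:
    allowed_uses: Tuple[str, ...] = ("any",)   # research uses the donor consented to
    donor_contact: Optional[str] = None        # how to recontact the donor, if permitted

    def permits(self, proposed_use: str) -> bool:
        """Return True if the proposed use falls within the donor's consent."""
        return "any" in self.allowed_uses or proposed_use in self.allowed_uses


# Illustrative record with made-up values.
record = CellLineRecord(
    line_id="CL-0001",
    source="anonymized donor",
    tissue_type="spleen",
    date_collected=date(2020, 1, 15),
    allowed_uses=("leukemia research",),
    donor_contact="via institutional review board",
)
print(record.permits("leukemia research"))     # True
print(record.permits("commercial licensing"))  # False
```

In a scheme like this, checking a proposed use against a donor's consent becomes a one-line lookup rather than the open-ended burden the court imagined.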
Forging the Future
"U.S. universities are awarded almost 3,000 patents each year. They earn more than $2 billion each year from patent royalties. Sharing a modest portion of these profits is a novel method for creating a greater sense of fairness in research relationships that we think is worth exploring," wrote Mark Yarborough, a bioethicist at the University of California Davis Medical School, and colleagues. That was penned nearly a decade ago and those numbers have only grown.
The Michigan BioTrust for Health might serve as a useful model for tackling some of these issues. In Michigan, dried blood spots have been collected from all newborns for half a century to test for certain genetic diseases, but controversy arose when the huge archive of dried spots was used for other research projects. In response, the state created a nonprofit organization to serve, in essence, as a biobank, managing access to the spots for specific research purposes and sharing any revenue that might arise from that research.
"If there can be no property in a whole living person, does it stand to reason that there can be no property in any part of a living person? If there were, can it be said that this could equate to some sort of 'biological slavery'?" Irish ethicist Asim A. Sheikh wrote several years ago. "Any amount of effort spent pondering the issue of 'ownership' in human biological materials with existing law leaves more questions than answers."
Perhaps the biggest question will arise when -- not if but when -- it becomes possible to clone a human being. Would a human clone be a legal person or the property of those who created it? Current legal precedent points to it being the latter.
Today, October 4, is the 70th anniversary of Henrietta Lacks' death from cancer. Over those decades her immortalized cells have helped make possible miraculous advances in medicine and have had a role in generating billions of dollars in profits. Surviving family members have spoken many times about seeking a share of those profits in the name of social justice; they intend to file lawsuits today. Such cases will succeed or fail on their own merits. But regardless of their specific outcomes, one can hope that they spark a larger public discussion of the role of patients in the biomedical research enterprise and lead to establishing a legal and financial claim for their contributions toward the next generation of biomedical research.
Kira Peikoff was the editor-in-chief of Leaps.org from 2017 to 2021. As a journalist, her work has appeared in The New York Times, Newsweek, Nautilus, Popular Mechanics, The New York Academy of Sciences, and other outlets. She is also the author of four suspense novels that explore controversial issues arising from scientific innovation: Living Proof, No Time to Die, Die Again Tomorrow, and Mother Knows Best. Peikoff holds a B.A. in Journalism from New York University and an M.S. in Bioethics from Columbia University. She lives in New Jersey with her husband and two young sons. Follow her on Twitter @KiraPeikoff.
With a deadly pandemic sweeping the planet, many are questioning the comfort and security we have taken for granted in the modern world.
More than a century after the germ theory of disease was established, we are still at the mercy of a microbe we can neither treat, nor control, nor immunize against. Even more discouraging is that technology has in some ways exacerbated the problem: cars and air travel allow a new disease to quickly encompass the globe.
Some say we have grown complacent, that we falsely assume the triumphs of the past ensure a happy and prosperous future, that we are oblivious to the possibility of unpredictable "black swan" events that could cause our destruction. Some have begun to lose confidence in progress itself, and despair of the future.
But the new coronavirus should not defeat our spirit—if anything, it should spur us to redouble our efforts, both in the science and technology of medicine, and more broadly in the advance of industry. Because the best way to protect ourselves against future disasters is more progress, faster.
Science and technology have overall made us much better able to deal with disease. In the developed world, we have already tamed most categories of infectious disease. Most bacterial infections, such as tuberculosis or bacterial pneumonia, are cured with antibiotics. Waterborne diseases such as cholera are eliminated through sanitation; insect-borne ones such as malaria through pest control. Those that are not contagious until symptoms appear, such as SARS, can be handled through case isolation and contact tracing. For the rest, such as smallpox, polio, and measles, we develop vaccines, given enough time. COVID-19 could start a pandemic only because it fits a narrow category: a new, viral disease that is highly contagious via pre-symptomatic droplet/aerosol transmission, and that has a high mortality rate compared to seasonal influenza.
A century ago, when an influenza pandemic struck, we barely knew what viruses were; no one had ever seen one. Today we know the virus behind COVID-19 down to its exact genome; in fact, we have sequenced thousands of SARS-CoV-2 genomes, and can track the virus's history and spread through their mutations. We can create vaccines faster today, too: where we once developed them in live animals, we now use cell cultures; where we once had to weaken or inactivate the virus itself, we can now produce vaccines based on the virus's proteins. And even though we don't yet have a treatment, the last century-plus of pharmaceutical research has given us a vast catalog of candidate drugs, already proven safe. Even now, over 50 candidate vaccines and almost 100 candidate treatments are in the research pipeline.
It's not just our knowledge that has advanced, but our methods. When smallpox raged in the 1700s, even the idea of calculating a case-fatality rate was an innovation. When the polio vaccine was trialed in the 1950s, the use of placebo-controlled trials was still controversial. The crucial measure of contagiousness, R0, was not developed in epidemiology until the 1980s. And today, all of these methods are made orders of magnitude faster and more powerful by statistical and data visualization software.
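To give a sense of how simple these measures are once the data and the software exist, here is a minimal sketch with made-up numbers: the case-fatality rate is just deaths divided by confirmed cases, and a naive reading of R0 shows how cases multiply from one generation of infection to the next.

```python
# A minimal sketch with illustrative, made-up numbers (not real epidemiological data).
confirmed_cases = 10_000
deaths = 150

# Case-fatality rate: deaths among confirmed cases.
case_fatality_rate = deaths / confirmed_cases
print(f"Case-fatality rate: {case_fatality_rate:.1%}")  # 1.5%

# Naive generation-by-generation growth: each case infects R0 others on average.
r0 = 2.5
cases = 1.0
for generation in range(1, 6):
    cases *= r0
    print(f"Generation {generation}: ~{cases:.0f} cases descended from one initial case")
```

Real epidemiological models are far more sophisticated than this back-of-the-envelope arithmetic, but for most of medical history even arithmetic like this was out of reach as a routine tool.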
If you're seeking to avoid COVID-19, the hand sanitizer gel you carry in a pocket or purse did not exist until the 1960s. If you start to show symptoms, the pulse oximeter that tests your blood oxygenation was not developed until the 1970s. If your case worsens, the mechanical ventilator that keeps you alive was invented in the 1950s—in fact, no form of artificial respiration was widely available until the "iron lung" used to treat polio patients in the 1930s. Even the modern emergency medical system did not exist until recently: if during the 1918 flu pandemic you became seriously ill, there was no 911 hotline to call, and any ambulance that showed up would likely have been a modified van or hearse, with no equipment or trained staff.
As many of us "shelter in place", we are far more able to communicate and collaborate, to maintain some semblance of normal life, than we ever would have been. To compare again to 1918: long-distance telephone service barely existed at that time, and only about a third of homes in the US even had electricity; now we can videoconference over Zoom and Skype. And the enormous selection and availability provided by online retail and food delivery have kept us stocked and fed, even when we don't want to venture out to the store.
"Black swan" calamities can strike without warning at any time. Indeed, humanity has always been subject to them—drought and frost, fire and flood, war and plague. But we are better equipped now to deal with them than ever before. And the more progress we make, the better prepared we'll be for the next one. The accumulation of knowledge, technology, industrial infrastructure, and surplus wealth is the best buffer against any shock—whether a viral pandemic, a nuclear war, or an asteroid impact. In fact, the more worried we are about future crises, the more energetically we should accelerate science, technology and industry.
In this sense, we have grown complacent. We take the modern world for granted, so much so that some question whether further progress is even still needed. The new virus proves how much we do need it, and how far we still have to go. Imagine how different things would be if we had broad-spectrum antiviral drugs, or a way to enhance the immune system to react faster to infection, or a way to detect infection even before symptoms appear. These technologies may seem to belong to a Star Trek future—but so, at one time, did cell phones.
The virus reminds us that nature is indifferent to us, leaving us to fend entirely for ourselves. As we go to war against it, let us not take the need for such a war as reason for despair. Instead, let it push us to redouble our efforts to make scientific, technological, and industrial progress on all fronts. No matter the odds, applied intelligence is our best weapon against disaster.