Researchers Behaving Badly: Known Frauds Are "the Tip of the Iceberg"
Last week, the whistleblowers in the Paolo Macchiarini affair at Sweden's Karolinska Institutet went on the record here to detail the retaliation they suffered for trying to expose a star surgeon's appalling research misconduct.
The whistleblowers had discovered that in six published papers, Macchiarini falsified data, lied about the condition of patients and circumvented ethical approvals. As a result, multiple patients suffered and died. But Karolinska turned a blind eye for years.
Scientific fraud of the type committed by Macchiarini is rare, but studies suggest that it's on the rise. Just this week, for example, Retraction Watch and STAT together broke the news that a Harvard Medical School cardiologist and stem cell researcher, Piero Anversa, falsified data in a whopping 31 papers, which now have to be retracted. Anversa had claimed that he could regenerate heart muscle by injecting bone marrow cells into damaged hearts, a result that no one has been able to duplicate.
A 2009 study published in the journal PLOS ONE found that about two percent of scientists admitted to committing fabrication, falsification or plagiarism in their work. That's a small number, but up to one third of scientists admit to committing "questionable research practices" that fall into a gray area between rigorous accuracy and outright fraud.
These dubious practices may include misrepresentations, research bias, and inaccurate interpretations of data. One common questionable research practice entails formulating a hypothesis after the research is done in order to claim a successful premise. Another highly questionable practice that can shape research is ghost-authoring by representatives of the pharmaceutical industry and other for-profit fields. Still another is gifting co-authorship to unqualified but powerful individuals who can advance one's career. Such practices can unfairly bolster a scientist's reputation and increase the likelihood of getting the work published.
The above percentages represent what scientists admit to doing themselves; when they evaluate the practices of their colleagues, the numbers jump dramatically. In a 2012 study published in the Journal of Research in Medical Sciences, researchers estimated that 14 percent of other scientists commit serious misconduct, while up to 72 percent engage in questionable practices. While these are only estimates, the problem is clearly not one of just a few bad apples.
In the PLOS ONE study, Daniele Fanelli says that increasing evidence suggests the known frauds are "just the 'tip of the iceberg,' and that many cases are never discovered" because fraud is extremely hard to detect.
In addition, it's likely that most cases of scientific misconduct go unreported because of the high price of whistleblowing. Those in the Macchiarini case showed extraordinary persistence in their multi-year campaign to stop his deadly trachea implants, while suffering serious damage to their careers. Such heroic efforts to unmask fraud are probably rare.
To make matters worse, there are numerous players in the scientific world who may be complicit in either committing misconduct or covering it up. These include not only primary researchers but co-authors, institutional executives, journal editors, and industry leaders. Essentially everyone wants to be associated with big breakthroughs, and they may overlook scientifically shaky foundations when a major advance is claimed.
Another part of the problem is that it's rare for students in science and medicine to receive an education in ethics. And studies have shown that older, more experienced and possibly jaded researchers are more likely to fudge results than their younger, more idealistic colleagues.
So, given the steep price that individuals and institutions pay for scientific misconduct, what compels them to go down that road in the first place? According to the JRMS study, individuals face intense pressures to publish and to attract grant money in order to secure teaching positions at universities. Once they have acquired positions, the pressure is on to keep the grants and publishing credits coming in order to obtain tenure, be appointed to positions on boards, and recruit flocks of graduate students to assist in research. And not to be underestimated is the human ego.
Paolo Macchiarini is an especially vivid example of a scientist seeking not only fortune, but fame. He liberally (and falsely) claimed powerful politicians and celebrities, even the Pope, as patients or admirers. He may be an extreme example, but we live in an age of celebrity scientists who bring huge amounts of grant money and high prestige to the institutions that employ them.
The media plays a significant role in both glorifying stars and unmasking frauds. In the Macchiarini scandal, the media first lifted him up, as in NBC's laudatory documentary, "A Leap of Faith," which painted him as a kind of miracle-worker, and then brought him down, as in the January 2016 documentary, "The Experiments," which chronicled the agonizing death of one of his patients.
Institutions can also play a crucial role in scientific fraud by putting more emphasis on the number and frequency of papers published than on their quality. The whole course of a scientist's career is profoundly affected by something called the h-index, a metric that combines productivity with impact: a researcher has an h-index of h if h of their papers have each been cited at least h times by other researchers. Raising one's h-index becomes an overriding goal, sometimes eclipsing the kind of patient, time-consuming research that leads to true breakthroughs based on reliable results.
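For readers curious how the metric works, here is a minimal sketch of the standard calculation in Python; the citation counts are made up purely for illustration.

```python
def h_index(citations):
    """Return the h-index: the largest h such that h papers
    have each been cited at least h times."""
    counts = sorted(citations, reverse=True)
    h = 0
    for rank, cites in enumerate(counts, start=1):
        if cites >= rank:
            h = rank
        else:
            break
    return h

# Hypothetical researcher with five papers cited 25, 8, 5, 3 and 3 times:
print(h_index([25, 8, 5, 3, 3]))  # -> 3 (three papers have at least 3 citations each)
```

Because the number rises only when a researcher accumulates both more papers and more citations, chasing it rewards steady, citable output over slower, more careful work.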
Universities also create a high-pressured environment that encourages scientists to cut corners. They, too, place a heavy emphasis on attracting large monetary grants and accruing fame and prestige. This can lead them, just as it led Karolinska, to protect a star scientist's sloppy or questionable research. According to Dr. Andrew Rosenberg, who is director of the Center for Science and Democracy at the U.S.-based Union of Concerned Scientists, "Karolinska defended its investment in an individual as opposed to the long-term health of the institution. People were dying, and they should have outsourced the investigation from the very beginning."
Having institutions investigate their own practices is a conflict of interest from the get-go, says Rosenberg.
Scientists, universities, and research institutions are also not immune to fads. "Hot" subjects attract grant money and confer prestige, incentivizing scientists to shift their research priorities in a direction that garners more grants. This can mean neglecting the scientist's true area of expertise and interests in favor of a subject that's more likely to attract grant money. In Macchiarini's case, he was allegedly at the forefront of the currently sexy field of regenerative medicine -- a field in which Karolinska was making a huge investment.
The relative scarcity of resources intensifies the already significant pressure on scientists. They may want to publish results rapidly, since they face many competitors for limited grant money, academic positions, students, and influence. The scarcity means that a great many researchers will fail while only a few succeed. Once again, the temptation may be to rush research and to show it in the most positive light possible, even if it means fudging or exaggerating results.
Intense competition can have a perverse effect on researchers, according to a 2007 study in the journal Science and Engineering Ethics. Not only does it place undue pressure on scientists to succeed, it frequently leads to the withholding of information from colleagues, which undermines a system in which new discoveries build on the previous work of others. Researchers may feel compelled to withhold their results because of the pressure to be the first to publish. The study's authors propose that more investment in basic research from governments could alleviate some of these competitive pressures.
Scientific journals, although they play a part in publishing flawed science, can't be expected to investigate cases of suspected fraud, says the German science blogger Leonid Schneider. Schneider's writings helped to expose the Macchiarini affair.
"They just basically wait for someone to retract problematic papers," he says.
He also notes that, while American scientists can go to the Office of Research Integrity to report misconduct, whistleblowers in Europe have no external authority to whom they can appeal to investigate cases of fraud.
"They have to go to their employer, who has a vested interest in covering up cases of misconduct," he says.
Science is increasingly international, and major studies can include collaborators from several different countries. Schneider suggests there should be an international body, accessible to all researchers, that will investigate suspected fraud.
Ultimately, says Rosenberg, the scientific system must incorporate trust. "You trust co-authors when you write a paper, and peer reviewers at journals trust that scientists at research institutions like Karolinska are acting with integrity."
Without trust, the whole system falls apart. It's the trust of the public, an elusive asset once it has been betrayed, that science depends upon for its very existence. Scientific research is overwhelmingly financed by tax dollars, and the need for the goodwill of the public is more than an abstraction.
The Macchiarini affair raises a profound question of trust and responsibility: Should multiple co-authors be held responsible for a lead author's misconduct?
Karolinska apparently believes so. When the institution at last owned up to the scandal, it vindictively found Karl-Henrik Grinnemo, one of the whistleblowers, guilty of scientific misconduct as well. It also designated two other whistleblowers as "blameworthy" for their roles as co-authors of the papers on which Macchiarini was the lead author.
As a result, the whistleblowers' reputations and employment prospects have become collateral damage. Accusations of research misconduct can be a career killer. Research grants dry up, employment opportunities evaporate, publishing becomes next to impossible, and collaborators vanish into thin air.
Grinnemo contends that co-authors should only be responsible for their discrete contributions, not for the data supplied by others.
"Different aspects of a paper are highly specialized," he says, "and that's why you have multiple authors. You cannot go through every single bit of data because you don't understand all the parts of the article."
This is especially true in multidisciplinary, translational research, where there are sometimes 20 or more authors. "You have to trust co-authors, and if you find something wrong you have to notify all co-authors. But you couldn't go through everything or it would take years to publish an article," says Grinnemo.
Though the pressures facing scientists are very real, the problem of misconduct is not inevitable. Along with increased support from governments and industry, a change in academic culture that emphasizes quality over quantity of published studies could help encourage meritorious research.
But beyond that, trust will always play a role when numerous specialists unite to achieve a common goal: the accumulation of knowledge that will promote human health, wealth, and well-being.
[Correction: An earlier version of this story mistakenly credited The New York Times with breaking the news of the Anversa retractions, rather than Retraction Watch and STAT, which jointly published the exclusive on October 14th. The piece in the Times ran on October 15th. We regret the error.]
Pseudoscience Is Rampant: How Not to Fall for It
Whom to believe?
The relentless and often unpredictable coronavirus (SARS-CoV-2) has, among its many quirky terrors, dredged up once again the issue that will not die: science versus pseudoscience.
The scientists, experts who would be the first to admit they are not infallible, are now in danger of being drowned out by the growing chorus of pseudoscientists, conspiracy theorists, and just plain troublemakers that seem to be as symptomatic of the virus as fever and weakness.
How is the average citizen to filter this cacophony of information and misinformation posing as science alongside real science? While all that noise makes it difficult to separate the real stuff from the fakes, there is at least one positive aspect to it all.
A famous aphorism by one Charles Caleb Colton, a popular 19th-century English cleric and writer, says that "imitation is the sincerest form of flattery."
The frauds and the paranoid conspiracy mongers who would perpetrate false science on a susceptible public are at least recognizing the value of science—they imitate it. They imitate the ways in which science works and make claims as if they were scientists, because even they recognize the power of a scientific approach. They are inadvertently showing us how much we value science. Unfortunately they are just shabby counterfeits.
Separating real science from pseudoscience is not a new problem. Philosophers, politicians, scientists, and others have been worrying about this perhaps since science as we know it, a science based entirely on experiment and not opinion, arrived in the 1600s. The original charter of the British Royal Society, the first organized scientific society, stated that at their formal meetings there would be no discussion of politics, religion, or perpetual motion machines.
The first two were banned for the obvious purpose of keeping the peace. But the third is interesting because at that time perpetual motion machines were one of the main offerings of the imitators, the bogus scientists who were sure that you could find ways around the universal laws of energy and make a buck on it. The motto adopted by the society was, and remains, Nullius in verba, Latin for "take nobody's word for it." Kind of an early version of Missouri's venerable unofficial motto: "Show me."
You might think that telling phony science from the real thing wouldn't be so difficult, but events, historical and current, tell a very different story—often with tragic outcomes. Just one terrible example is the estimated 350,000 additional HIV deaths in South Africa directly caused by the now-infamous conspiracy theories of its own elected president, no less (sound familiar?). It's surprisingly easy to dress up phony science as the real thing by simply adopting, or appearing to adopt, the trappings of science.
Thus, the anti-vaccine movement claims to be based on suspicion of authority, in this case medical authority, a suspicion that took root with the fraudulent data published by the now-disgraced English gastroenterologist Andrew Wakefield. And it's true that much of science is based on suspicion of authority. Science got its start when the likes of Galileo and Copernicus claimed that the Church, the State, even Aristotle, could no longer be trusted as authoritative sources of knowledge.
But Galileo and those who followed him produced alternative explanations, and those alternatives were based on data that arose independently from many sources and generated a great deal of debate and, most importantly, could be tested by experiments that could prove them wrong. The anti-vaccine movement imitates science, still citing the discredited Wakefield report, but really offers nothing but suspicion—and that is paranoia, not science.
Similarly, there are those who try to cloak their nefarious motives in the trappings of science by claiming that they are taking the scientific posture of doubt. Science after all depends on doubt—every scientist doubts every finding they make. Every scientist knows that they can't possibly foresee all possible instances or situations in which they could be proven wrong, no matter how strong their data. Einstein was doubted for two decades, and cosmologists are still searching for experimental proofs of relativity. Science indeed progresses by doubt. In science revision is a victory.
But the imitators merely use doubt to suggest that science is not dependable and should not be used for informing policy or altering our behavior. They claim to be taking the legitimate scientific stance of doubt. Of course, they don't doubt everything, only what is problematic for their individual enterprises. They don't doubt the value of blood pressure medicine to control their hypertension. But they should, because every medicine has side effects, and we don't completely understand how blood pressure is regulated or whether there might be still better ways of controlling it.
But we use the pills we have because the science is sound even when it is not completely settled. Ask a hypertensive oil executive, one who would like you to believe that climate science should be ignored because there are too many uncertainties in the data, whether he is willing to forgo his blood pressure medicine, which also has its share of uncertainties and unwanted side effects.
The apparent success of pseudoscience is not due to gullibility on the part of the public. The problem is that science is recognized as valuable and that the imitators are unfortunately good at what they do. They take a scientific pose to gain your confidence and then distort the facts to their own purposes. How does one learn to spot the con without getting a Ph.D. and spending years in a laboratory?
"If someone claims to have the ultimate answer or that they know something for certain, the only thing for sure is that they are trying to fool you."
What can be done to make the distinction clearer? Several solutions have been tried—and seem to have failed. Radio and television shows about the latest scientific breakthroughs are a noble attempt to give the public a taste of good science, but they do nothing to help you distinguish between them and the pseudoscience being purveyed on the neighboring channel and its "scientific investigations" of haunted houses.
Similarly, attempts to inculcate what are called "scientific habits of mind" are of little practical help. These habits of mind are not so easy to adopt. They invariably require some amount of statistics and probability and much of that is counterintuitive—one of the great values of science is to help us to counter our normal biases and expectations by showing that the actual measurements may not bear them out.
Additionally, there is math—no matter how much you try to hide it, much of the language of science is math (Galileo said that). And half the audience is gone with each equation (Stephen Hawking said that). It's hard to imagine a successful program of making a non-scientifically trained public interested in adopting the rigors of scientific habits of mind. Indeed, I suspect there are some people, artists for example, who would be rightfully suspicious of changing their thinking to being habitually scientific. Many scientists are frustrated by the public's inability to think like a scientist, but in fact it is neither easy nor always desirable to do so. And it is certainly not practical.
There is a more intuitive and simpler way to tell the difference between the real thing and the cheap knock-off. In fact, it is not so much intuitive as it is counterintuitive, so it takes a little bit of mental work. But the good thing is it works almost all the time by following a simple, if as I say, counterintuitive, rule.
True science, you see, is mostly concerned with the unknown and the uncertain. If someone claims to have the ultimate answer or that they know something for certain, the only thing for sure is that they are trying to fool you. Mystery and uncertainty may not strike you right off as desirable or strong traits, but that is precisely where one finds the creative solutions that science has historically arrived at. Yes, science accumulates factual knowledge, but it is at its best when it generates new and better questions. Uncertainty is not a place of worry, but of opportunity. Progress lives at the border of the unknown.
How much would it take to alter the public perception of science to appreciate unknowns and uncertainties along with facts and conclusions? Less than you might think. In fact, we make decisions based on uncertainty every day—what to wear when there's a 60 percent chance of rain—so it should not be so difficult to extend that to science, in spite of what you were taught in school about all the hard facts in those giant textbooks.
You can believe science that says there is clear evidence that takes us this far… and then we have to speculate a bit and it could go one of two or three ways—or maybe even some way we don't see yet. But like your blood pressure medicine, the stuff we know is reliable even if incomplete. It will lower your blood pressure, no matter that better treatments with fewer side effects may await us in the future.
Unsettled science is not unsound science. Someone who is honest and humble enough to tell you that they don't have all the answers, but do have some thoughtful questions to pursue, is easy to distinguish from the charlatans who have ready answers or claim that nothing should be done until we are an impossible 100-percent sure.
Imitation may be the sincerest form of flattery.
The problem, as we all know, is that flattery will get you nowhere.
[Editor's Note: This article was originally published on June 8th, 2020 as part of a standalone magazine called GOOD10: The Pandemic Issue. Produced as a partnership among LeapsMag, The Aspen Institute, and GOOD, the magazine is available for free online.]
Henrietta Lacks' Cells Enabled Medical Breakthroughs. Is It Time to Finally Retire Them?
For Victoria Tokarz, a third-year PhD student at the University of Toronto, experimenting with cells is just part of a day's work. Tokarz, 26, is studying to be a cell biologist and spends her time inside the lab manipulating muscle cells sourced from rodents to try to figure out how they respond to insulin. She hopes this research could someday lead to a breakthrough in our understanding of diabetes.
"People like to use HeLa cells because they're easy to use."
But in all her research, there is one cell culture that Tokarz refuses to touch. The culture is called HeLa, short for Henrietta Lacks, named after the 31-year-old tobacco farmer the cells were stolen from during a tumor biopsy she underwent in 1951.
"In my opinion, there's no question or experiment I can think of that validates stealing from and profiting off of a black woman's body," Tokarz says. "We're not talking about a reagent we created in a lab, a mixture of some chemicals. We're talking about a human being who suffered indescribably so we could profit off of her misfortune."
Lacks' suffering is something that, until recently, was not widely known. Born to a poor family in Roanoke, VA, Lacks was sent to live with her grandfather on the family tobacco farm at age four, shortly after the death of her mother. She gave birth to her first child at just fourteen, and two years later had another child with profound developmental disabilities. Lacks married her first cousin, David, in 1941 and the family moved to Maryland where they had three additional children.
But the real misfortune came in 1951, when Lacks told her cousins that she felt a hard "knot" in her womb. When Lacks went to Johns Hopkins hospital to have the knot examined, doctors discovered that the hard lump Henrietta felt was a rapidly-growing cervical tumor.
Before the doctors treated the tumor by inserting radium tubes into her vagina, in the hope of killing the cancer, they clipped two tissue samples from her cervix without Lacks' knowledge or consent. While it's widely considered unethical today, taking tissue samples from patients without their consent was commonplace at the time. The samples were sent to a cancer researcher at Johns Hopkins, and Lacks continued treatment, unsuccessfully, until she died a few months later of metastatic cancer.
Lacks' story was not over, however: When her tissue sample arrived at the lab of George Otto Gey, the Johns Hopkins cancer researcher, he noticed that the cancerous cells grew at a shocking pace. Unlike other cell cultures that would die within a day or two of arriving at the lab, Lacks' cells kept multiplying. They doubled every 24 hours, and to this day, have never stopped.
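To get a rough sense of what a 24-hour doubling time implies, here is a back-of-the-envelope sketch in Python; the starting count of a single cell is purely illustrative, and real cultures are of course limited by nutrients, space, and routine splitting.

```python
# Idealized growth with one doubling per day: cells = starting_cells * 2 ** days
starting_cells = 1
for days in (7, 30, 60):
    print(f"After {days:2d} days: {starting_cells * 2 ** days:,} cells")

# After  7 days: 128 cells
# After 30 days: 1,073,741,824 cells
# After 60 days: 1,152,921,504,606,846,976 cells
```

Even starting from a single cell, unchecked daily doubling reaches astronomical numbers within weeks, which is why an immortalized line like HeLa can keep supplying laboratories indefinitely.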
Scientists would later find out that this growth was due to infection with human papillomavirus, or HPV, which is known for causing aggressive cancers. Lacks' cells became the world's first-ever "immortalized" human cell line, meaning that as long as certain environmental conditions are met, the cells can replicate indefinitely. Although scientists have cultivated other immortalized cell lines since then, HeLa cells remain a favorite among scientists due to their resilience, Tokarz says.
"People like to use HeLa cells because they're easy to use," Tokarz says. "They're easy to manipulate, because they're very hardy, and they allow for transection, which means expressing a protein in a cell that's not normally there. Other cells, like endothelial cells, don't handle those manipulations well."
Once the doctors at Johns Hopkins discovered that Lacks' cells could replicate indefinitely, they started shipping them to labs around the world to promote medical research. As they were the only immortalized cell line available at the time, researchers used them for thousands of experiments — some of which resulted in life-saving treatments. Jonas Salk's polio vaccine, for example, was manufactured using HeLa cells. HeLa cells were also used in the development of an HPV vaccine, in vitro fertilization, and gene mapping. Between 1951 and 2018, HeLa cells were cited in over 110,000 publications, according to a review from the National Institutes of Health.
But while some scientists like Tokarz are thankful for the advances brought about by HeLa cells, they still believe it's well past time to stop using them in research.
"Am I thankful we have a polio vaccine? Absolutely. Do I resent the way we came to have that vaccine? Absolutely," Tokarz says. "We could have still arrived at those same advances by treating her as the human being she is, not just a specimen."
Ethical considerations aside, HeLa is no longer the world's only available cell line – nor, Tokarz argues, are her cells the most suitable for every type of research. "The closer you can get to the physiology of the thing you're studying, the better," she says. "Now we have the ability to use primary cells, which are isolated from a person and put right into the culture dish, and those don't have the same mutations as cells that have been growing for 20 years. We didn't have the expertise to do that initially, but now we do."
Raphael Valdivia, a professor of molecular genetics and microbiology at Duke University School of Medicine, agrees that HeLa cells are no longer optimal for most research. "A lot of scientists are moving away from HeLa cells because they're so unstable," he says. "They mutate, they rearrange chromosomes to become adaptive, and different batches of cells evolve separately from each other. The HeLa cells in my lab are very different than the ones down the hall, and that means sometimes we can't replicate our results. We have to go back to an earlier batch of cells in the freezer and re-test."
Still, the idea of retiring the cells completely doesn't make sense, Valdivia says: "To some extent, you're beholden to previous research. You need to be able to confirm findings that happen in earlier studies, and to do that you need to use the same cell line that other researchers have used."
"Ethics is not black and white, and sometimes there's no such thing as a straightforward ethical or unethical choice."
"The way in which the cells were taken – without patient consent – is completely inappropriate," says Yann Joly, associate professor at the Faculty of Medicine in Toronto and Research Director at the Centre of Genomics and Policy. "The question now becomes, what can we do about it now? What are our options?"
While scientists are not able to erase what was done to Henrietta Lacks, Joly argues that retiring her cells is also non-consensual, assuming – maybe incorrectly – what Henrietta would have wanted, without her input. Additionally, Joly points out that other immortalized human cell lines are fraught with what some people consider to be ethical concerns as well, such as the human embryonic kidney cell line, commonly referred to as HEK-293, that was derived from an aborted female fetus. "Just because you're using another kind of cell doesn't mean it's devoid of ethical issue," he says.
Seemingly, the one thing scientists can agree on is that Henrietta Lacks was mistreated by the medical community. But even so, retiring her cells from medical research is not an obvious solution. Scientists are now using HeLa cells to better understand how the novel coronavirus affects humans, and this knowledge will inform how researchers develop a COVID-19 vaccine.
"Ethics is not black and white, and sometimes there's no such thing as a straightforward ethical or unethical choice," Joly says. "If [ethics] were that easy, nobody would need to teach it."