Researchers Behaving Badly: Known Frauds Are "the Tip of the Iceberg"
Last week, the whistleblowers in the Paolo Macchiarini affair at Sweden's Karolinska Institutet went on the record here to detail the retaliation they suffered for trying to expose a star surgeon's appalling research misconduct.
The whistleblowers had discovered that in six published papers, Macchiarini falsified data, lied about the condition of patients and circumvented ethical approvals. As a result, multiple patients suffered and died. But Karolinska turned a blind eye for years.
Scientific fraud of the type committed by Macchiarini is rare, but studies suggest that it's on the rise. Just this week, for example, Retraction Watch and STAT together broke the news that a Harvard Medical School cardiologist and stem cell researcher, Piero Anversa, falsified data in a whopping 31 papers, which now have to be retracted. Anversa had claimed that he could regenerate heart muscle by injecting bone marrow cells into damaged hearts, a result that no one has been able to duplicate.
A 2009 study published in the journal PLOS ONE found that about two percent of scientists admitted to committing fabrication, falsification or plagiarism in their work. That's a small number, but up to one third of scientists admit to committing "questionable research practices" that fall into a gray area between rigorous accuracy and outright fraud.
These dubious practices may include misrepresentations, research bias, and inaccurate interpretations of data. One common questionable research practice entails formulating a hypothesis after the research is done in order to claim a successful premise. Another highly questionable practice that can shape research is ghost-authoring by representatives of the pharmaceutical industry and other for-profit fields. Still another is gifting co-authorship to unqualified but powerful individuals who can advance one's career. Such practices can unfairly bolster a scientist's reputation and increase the likelihood of getting the work published.
The above percentages represent what scientists admit to doing themselves; when they evaluate the practices of their colleagues, the numbers jump dramatically. In a 2012 study published in the Journal of Research in Medical Sciences, researchers estimated that 14 percent of other scientists commit serious misconduct, while up to 72 percent engage in questionable practices. While these are only estimates, the problem is clearly not one of just a few bad apples.
In the PLOS study, Daniele Fanelli says that increasing evidence suggests the known frauds are "just the 'tip of the iceberg,' and that many cases are never discovered" because fraud is extremely hard to detect.
In addition, it's likely that most cases of scientific misconduct go unreported because of the high price of whistleblowing. Those in the Macchiarini case showed extraordinary persistence in their multi-year campaign to stop his deadly trachea implants, while suffering serious damage to their careers. Such heroic efforts to unmask fraud are probably rare.
To make matters worse, there are numerous players in the scientific world who may be complicit in either committing misconduct or covering it up. These include not only primary researchers but co-authors, institutional executives, journal editors, and industry leaders. Essentially everyone wants to be associated with big breakthroughs, and they may overlook scientifically shaky foundations when a major advance is claimed.
Another part of the problem is that it's rare for students in science and medicine to receive an education in ethics. And studies have shown that older, more experienced and possibly jaded researchers are more likely to fudge results than their younger, more idealistic colleagues.
So, given the steep price that individuals and institutions pay for scientific misconduct, what compels them to go down that road in the first place? According to the JRMS study, individuals face intense pressures to publish and to attract grant money in order to secure teaching positions at universities. Once they have acquired positions, the pressure is on to keep the grants and publishing credits coming in order to obtain tenure, be appointed to positions on boards, and recruit flocks of graduate students to assist in research. And not to be underestimated is the human ego.
Paolo Macchiarini is an especially vivid example of a scientist seeking not only fortune, but fame. He liberally (and falsely) claimed powerful politicians and celebrities, even the Pope, as patients or admirers. He may be an extreme example, but we live in an age of celebrity scientists who bring huge amounts of grant money and high prestige to the institutions that employ them.
The media plays a significant role in both glorifying stars and unmasking frauds. In the Macchiarini scandal, the media first lifted him up, as in NBC's laudatory documentary, "A Leap of Faith," which painted him as a kind of miracle-worker, and then brought him down, as in the January 2016 documentary, "The Experiments," which chronicled the agonizing death of one of his patients.
Institutions can also play a crucial role in scientific fraud by putting more emphasis on the number and frequency of papers published than on their quality. The whole course of a scientist's career is profoundly affected by a metric called the h-index: a researcher has an h-index of h if h of his or her papers have each been cited at least h times by other researchers. Raising one's h-index becomes an overriding goal, sometimes eclipsing the kind of patient, time-consuming research that leads to true breakthroughs based on reliable results.
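To make the metric concrete, here is a minimal sketch, in Python, of how an h-index can be computed from a list of citation counts (the function name and example numbers are illustrative, not taken from the article):

```python
def h_index(citations):
    """Return the largest h such that h papers have at least h citations each."""
    # Rank papers from most to least cited, then walk down the ranking
    # until a paper's citation count falls below its rank.
    ranked = sorted(citations, reverse=True)
    h = 0
    for rank, cites in enumerate(ranked, start=1):
        if cites >= rank:
            h = rank
        else:
            break
    return h

# Example: five papers cited 10, 8, 5, 4, and 3 times give an h-index of 4,
# because four papers have at least four citations each, but not five with five.
print(h_index([10, 8, 5, 4, 3]))  # prints 4
```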
Universities also create a high-pressure environment that encourages scientists to cut corners. They, too, place a heavy emphasis on attracting large monetary grants and accruing fame and prestige. This can lead them, just as it led Karolinska, to protect a star scientist's sloppy or questionable research. According to Dr. Andrew Rosenberg, who is director of the Center for Science and Democracy at the U.S.-based Union of Concerned Scientists, "Karolinska defended its investment in an individual as opposed to the long-term health of the institution. People were dying, and they should have outsourced the investigation from the very beginning."
Having institutions investigate their own practices is a conflict of interest from the get-go, says Rosenberg.
Scientists, universities, and research institutions are also not immune to fads. "Hot" subjects attract grant money and confer prestige, incentivizing scientists to shift their research priorities in a direction that garners more grants. This can mean neglecting the scientist's true area of expertise and interests in favor of a subject that's more likely to attract grant money. In Macchiarini's case, he was allegedly at the forefront of the currently sexy field of regenerative medicine -- a field in which Karolinska was making a huge investment.
The relative scarcity of resources intensifies the already significant pressure on scientists. They may want to publish results rapidly, since they face many competitors for limited grant money, academic positions, students, and influence. The scarcity means that a great many researchers will fail while only a few succeed. Once again, the temptation may be to rush research and to show it in the most positive light possible, even if it means fudging or exaggerating results.
Intense competition can have a perverse effect on researchers, according to a 2007 study in the journal Science and Engineering Ethics. Not only does it place undue pressure on scientists to succeed, it frequently leads them to withhold information from colleagues, which undermines a system in which new discoveries build on the previous work of others. Researchers may feel compelled to withhold their results because of the pressure to be the first to publish. The study's authors propose that more investment in basic research from governments could alleviate some of these competitive pressures.
Scientific journals, although they play a part in publishing flawed science, can't be expected to investigate cases of suspected fraud, says Leonid Schneider, a German science blogger whose writings helped to expose the Macchiarini affair.
"They just basically wait for someone to retract problematic papers," he says.
He also notes that, while American scientists can go to the Office of Research Integrity to report misconduct, whistleblowers in Europe have no external authority to whom they can appeal to investigate cases of fraud.
"They have to go to their employer, who has a vested interest in covering up cases of misconduct," he says.
Science is increasingly international, Schneider points out, and major studies can include collaborators from several different countries. He suggests there should be an international body, accessible to all researchers, that will investigate suspected fraud.
Ultimately, says Rosenberg, the scientific system must incorporate trust. "You trust co-authors when you write a paper, and peer reviewers at journals trust that scientists at research institutions like Karolinska are acting with integrity."
Without trust, the whole system falls apart. Science depends for its very existence on the trust of the public, an asset that is elusive once it has been betrayed. Scientific research is overwhelmingly financed by tax dollars, and the need for the public's goodwill is more than an abstraction.
The Macchiarini affair raises a profound question of trust and responsibility: Should multiple co-authors be held responsible for a lead author's misconduct?
Karolinska apparently believes so. When the institution at last owned up to the scandal, it vindictively found Karl-Henrik Grinnemo, one of the whistleblowers, guilty of scientific misconduct as well. It also designated two other whistleblowers as "blameworthy" for their roles as co-authors of the papers on which Macchiarini was the lead author.
As a result, the whistleblowers' reputations and employment prospects have become collateral damage. Accusations of research misconduct can be a career killer. Research grants dry up, employment opportunities evaporate, publishing becomes next to impossible, and collaborators vanish into thin air.
Grinnemo contends that co-authors should only be responsible for their discrete contributions, not for the data supplied by others.
"Different aspects of a paper are highly specialized," he says, "and that's why you have multiple authors. You cannot go through every single bit of data because you don't understand all the parts of the article."
This is especially true in multidisciplinary, translational research, where there are sometimes 20 or more authors. "You have to trust co-authors, and if you find something wrong you have to notify all co-authors. But you couldn't go through everything or it would take years to publish an article," says Grinnemo.
Though the pressures facing scientists are very real, the problem of misconduct is not inevitable. Along with increased support from governments and industry, a change in academic culture that emphasizes quality over quantity of published studies could help encourage meritorious research.
But beyond that, trust will always play a role when numerous specialists unite to achieve a common goal: the accumulation of knowledge that will promote human health, wealth, and well-being.
[Correction: An earlier version of this story mistakenly credited The New York Times with breaking the news of the Anversa retractions, rather than Retraction Watch and STAT, which jointly published the exclusive on October 14th. The piece in the Times ran on October 15th. We regret the error.]
As countries around the world combat the coronavirus outbreak, governments that already operated sophisticated surveillance programs are ramping up the tracking of their citizens.
"The potential for invasions of privacy, abuse, and stigmatization is enormous."
Countries like China, South Korea, Israel, Singapore and others are closely monitoring citizens to track the spread of the virus and prevent further infections, and policymakers in the United States have proposed similar steps. These shifts in policy have civil liberties defenders alarmed, as history has shown that expansions of surveillance tend to stick around after an emergency is over.
In China, where the virus originated and surveillance is already ubiquitous, the government has taken measures like having people scan a QR code and answer questions about their health and travel history to enter their apartment building. The country has also increased the tracking of cell phones, encouraged citizens to report people who appear to be sick, utilized surveillance drones, and developed facial recognition that can identify someone even if they're wearing a mask.
In Israel, the government has begun tracking people's cell phones without a court order under a program that was initially meant to counter terrorism. Singapore has also been closely tracking people's movements using cell phone data. In South Korea, the government has been monitoring citizens' credit card and cell phone data and has heavily utilized facial recognition to combat the spread of the coronavirus.
Here at home, the United States government and state governments have been using cell phone data to determine where people are congregating. White House senior adviser Jared Kushner's task force to combat the coronavirus outbreak has proposed using cell phone data to track coronavirus patients. Cities around the nation are also using surveillance drones to enforce social distancing orders. Companies like Apple and Google, which work closely with the federal government, are currently developing systems to track Americans' cell phones.
All of this might sound acceptable if you're worried about containing the outbreak and getting back to normal life, but as we saw when the Patriot Act was passed in 2001 in the wake of the 9/11 terrorist attacks, expansions of the surveillance state can persist long after the emergency that seemed to justify them.
Jay Stanley, senior policy analyst with the ACLU Speech, Privacy, and Technology Project, says that this public health emergency requires bold action, but he worries that actions may be taken that will infringe on our privacy rights.
"This is an extraordinary crisis that justifies things that would not be justified in ordinary times, but we, of course, worry that any such things would be made permanent," Stanley says.
Stanley notes that 9/11 was different from the current situation because we still face the threat of terrorism today, and we always will. The Patriot Act was a response to that threat, even if it was an extreme one. With this pandemic, it's quite possible we won't face something like this again for some time.
"We know that for the last seven or eight decades, we haven't seen a microbe this dangerous become a pandemic, and it's reasonable to expect it's not going to be happening for a while afterward," Stanley says. "We do know that when a vaccine is produced and is produced widely enough, the COVID crisis will be over. This does, unlike 9/11, have a definitive ending."
The ACLU released a white paper last week outlining the problems with using location data from cell phones and how policymakers should proceed when they discuss the usage of surveillance to combat the outbreak.
"Location data contains an enormously invasive and personal set of information about each of us, with the potential to reveal such things as people's social, sexual, religious, and political associations," they wrote. "The potential for invasions of privacy, abuse, and stigmatization is enormous. Any uses of such data should be temporary, restricted to public health agencies and purposes, and should make the greatest possible use of available techniques that allow for privacy and anonymity to be protected, even as the data is used."
"The first thing you need to combat pervasive surveillance is to know that it's occurring."
Sara Collins, policy counsel at the digital rights organization Public Knowledge, says that one of the problems with the current administration is that there's not much transparency, so she worries surveillance could be increased without the public realizing it.
"You'll often see the White House come out with something—that they're going to take this action or an agency just says they're going to take this action—and there's no congressional authorization," Collins says. "There's no regulation. There's nothing there for the public discourse."
Collins says it's almost impossible to protect against infringements on people's privacy rights if you don't actually know what kind of surveillance is being done and at what scale.
"I think that's very concerning when there's no accountability and no way to understand what's actually happening," Collins says. "The first thing you need to combat pervasive surveillance is to know that it's occurring."
We should also be worried about corporate surveillance, Collins says, because the tech companies that keep track of our data work closely with the government and do not have a good track record when it comes to protecting people's privacy. She suspects these companies could use the coronavirus outbreak to defend the kind of data collection they've been engaging in for years.
Collins stresses that any increase in surveillance should be transparent and short-lived, and that there should be a limit on how long people's data can be kept. Otherwise, she says, we're risking an indefinite infringement on privacy rights. Her organization will be keeping tabs as the crisis progresses.
It's not that we shouldn't avail ourselves of modern technology to fight the pandemic. Indeed, once lockdown restrictions are gradually lifted, public health officials must increase their ability to isolate new cases and trace, test, and quarantine contacts.
But tracking the entire populace "Big Brother"-style is not the ideal way out of the crisis. Last week, for instance, a group of policy experts -- including former FDA Commissioner Scott Gottlieb -- published recommendations for how to achieve containment. They emphasized the need for widespread diagnostic and serologic testing as well as rapid case-based interventions, among other measures -- and they, too, were wary of pervasive measures to follow citizens.
The group wrote: "Improved capacity [for timely contact tracing] will be most effective if coordinated with health care providers, health systems, and health plans and supported by timely electronic data sharing. Cell phone-based apps recording proximity events between individuals are unlikely to have adequate discriminating ability or adoption to achieve public health utility, while introducing serious privacy, security, and logistical concerns."
The bottom line: Any broad increases in surveillance should be carefully considered before we go along with them out of fear. The Founders knew that privacy is integral to freedom; that's why they wrote the Fourth Amendment to protect it, and that right shouldn't be thrown away because we're in an emergency. Once you lose a right, you don't tend to get it back.
Kira Peikoff was the editor-in-chief of Leaps.org from 2017 to 2021. As a journalist, her work has appeared in The New York Times, Newsweek, Nautilus, Popular Mechanics, The New York Academy of Sciences, and other outlets. She is also the author of four suspense novels that explore controversial issues arising from scientific innovation: Living Proof, No Time to Die, Die Again Tomorrow, and Mother Knows Best. Peikoff holds a B.A. in Journalism from New York University and an M.S. in Bioethics from Columbia University. She lives in New Jersey with her husband and two young sons. Follow her on Twitter @KiraPeikoff.