Researchers Behaving Badly: Known Frauds Are "the Tip of the Iceberg"
Last week, the whistleblowers in the Paolo Macchiarini affair at Sweden's Karolinska Institutet went on the record here to detail the retaliation they suffered for trying to expose a star surgeon's appalling research misconduct.
The whistleblowers had discovered that in six published papers, Macchiarini falsified data, lied about the condition of patients and circumvented ethical approvals. As a result, multiple patients suffered and died. But Karolinska turned a blind eye for years.
Scientific fraud of the type committed by Macchiarini is rare, but studies suggest that it's on the rise. Just this week, for example, Retraction Watch and STAT together broke the news that a Harvard Medical School cardiologist and stem cell researcher, Piero Anversa, falsified data in a whopping 31 papers, which now have to be retracted. Anversa had claimed that he could regenerate heart muscle by injecting bone marrow cells into damaged hearts, a result that no one has been able to duplicate.
A 2009 study published in the Public Library of Science (PLOS) found that about two percent of scientists admitted to committing fabrication, falsification or plagiarism in their work. That's a small number, but up to one third of scientists admit to committing "questionable research practices" that fall into a gray area between rigorous accuracy and outright fraud.
These dubious practices may include misrepresentations, research bias, and inaccurate interpretations of data. One common questionable research practice entails formulating a hypothesis after the research is done in order to claim a successful premise. Another highly questionable practice that can shape research is ghost-authoring by representatives of the pharmaceutical industry and other for-profit fields. Still another is gifting co-authorship to unqualified but powerful individuals who can advance one's career. Such practices can unfairly bolster a scientist's reputation and increase the likelihood of getting the work published.
The above percentages represent what scientists admit to doing themselves; when they evaluate the practices of their colleagues, the numbers jump dramatically. In a 2012 study published in the Journal of Research in Medical Sciences, researchers estimated that 14 percent of other scientists commit serious misconduct, while up to 72 percent engage in questionable practices. While these are only estimates, the problem is clearly not one of just a few bad apples.
In the PLOS study, Daniele Fanelli says that increasing evidence suggests the known frauds are "just the 'tip of the iceberg,' and that many cases are never discovered" because fraud is extremely hard to detect.
In addition, it's likely that most cases of scientific misconduct go unreported because of the high price of whistleblowing. Those in the Macchiarini case showed extraordinary persistence in their multi-year campaign to stop his deadly trachea implants, while suffering serious damage to their careers. Such heroic efforts to unmask fraud are probably rare.
To make matters worse, there are numerous players in the scientific world who may be complicit in either committing misconduct or covering it up. These include not only primary researchers but co-authors, institutional executives, journal editors, and industry leaders. Essentially everyone wants to be associated with big breakthroughs, and they may overlook scientifically shaky foundations when a major advance is claimed.
Another part of the problem is that it's rare for students in science and medicine to receive an education in ethics. And studies have shown that older, more experienced and possibly jaded researchers are more likely to fudge results than their younger, more idealistic colleagues.
So, given the steep price that individuals and institutions pay for scientific misconduct, what compels them to go down that road in the first place? According to the JRMS study, individuals face intense pressures to publish and to attract grant money in order to secure teaching positions at universities. Once they have acquired positions, the pressure is on to keep the grants and publishing credits coming in order to obtain tenure, be appointed to positions on boards, and recruit flocks of graduate students to assist in research. And not to be underestimated is the human ego.
Paolo Macchiarini is an especially vivid example of a scientist seeking not only fortune, but fame. He liberally (and falsely) claimed powerful politicians and celebrities, even the Pope, as patients or admirers. He may be an extreme example, but we live in an age of celebrity scientists who bring huge amounts of grant money and high prestige to the institutions that employ them.
The media plays a significant role in both glorifying stars and unmasking frauds. In the Macchiarini scandal, the media first lifted him up, as in NBC's laudatory documentary, "A Leap of Faith," which painted him as a kind of miracle-worker, and then brought him down, as in the January 2016 documentary, "The Experiments," which chronicled the agonizing death of one of his patients.
Institutions can also play a crucial role in scientific fraud by putting more emphasis on the number and frequency of papers published than on their quality. The whole course of a scientist's career is profoundly affected by something called the h-index. This number captures both productivity and impact: an h-index of 20 means a researcher has published 20 papers that have each been cited at least 20 times. Raising one's h-index becomes an overriding goal, sometimes eclipsing the kind of patient, time-consuming research that leads to true breakthroughs based on reliable results.
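For readers curious about the arithmetic behind the metric, here is a minimal sketch of the h-index calculation (an illustration added for clarity, not part of the original reporting):

```python
def h_index(citations):
    """Return the h-index: the largest h such that the researcher
    has h papers, each cited at least h times."""
    # Sort citation counts from most-cited to least-cited.
    cites = sorted(citations, reverse=True)
    h = 0
    for rank, count in enumerate(cites, start=1):
        # The paper at position `rank` must have at least `rank` citations.
        if count >= rank:
            h = rank
        else:
            break
    return h

# A researcher whose five papers are cited [10, 8, 5, 4, 3] times
# has four papers with at least four citations each:
print(h_index([10, 8, 5, 4, 3]))  # 4
```

The sketch makes the incentive problem concrete: many modestly cited papers raise the index faster than one landmark paper, which is exactly the quantity-over-quality pressure the article describes.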
Universities also create a high-pressured environment that encourages scientists to cut corners. They, too, place a heavy emphasis on attracting large monetary grants and accruing fame and prestige. This can lead them, just as it led Karolinska, to protect a star scientist's sloppy or questionable research. According to Dr. Andrew Rosenberg, who is director of the Center for Science and Democracy at the U.S.-based Union of Concerned Scientists, "Karolinska defended its investment in an individual as opposed to the long-term health of the institution. People were dying, and they should have outsourced the investigation from the very beginning."
Having institutions investigate their own practices is a conflict of interest from the get-go, says Rosenberg.
Scientists, universities, and research institutions are also not immune to fads. "Hot" subjects attract grant money and confer prestige, incentivizing scientists to shift their research priorities in a direction that garners more grants. This can mean neglecting the scientist's true area of expertise and interests in favor of a subject that's more likely to attract grant money. In Macchiarini's case, he was allegedly at the forefront of the currently sexy field of regenerative medicine -- a field in which Karolinska was making a huge investment.
The relative scarcity of resources intensifies the already significant pressure on scientists. They may want to publish results rapidly, since they face many competitors for limited grant money, academic positions, students, and influence. The scarcity means that a great many researchers will fail while only a few succeed. Once again, the temptation may be to rush research and to show it in the most positive light possible, even if it means fudging or exaggerating results.
Intense competition can have a perverse effect on researchers, according to a 2007 study in the journal Science and Engineering Ethics. Not only does it place undue pressure on scientists to succeed, it frequently leads to the withholding of information from colleagues, which undermines a system in which new discoveries build on the previous work of others. Researchers may feel compelled to withhold their results because of the pressure to be the first to publish. The study's authors propose that more investment in basic research from governments could alleviate some of these competitive pressures.
Scientific journals, although they play a part in publishing flawed science, can't be expected to investigate cases of suspected fraud, says the German science blogger Leonid Schneider. Schneider's writings helped to expose the Macchiarini affair.
"They just basically wait for someone to retract problematic papers," he says.
He also notes that, while American scientists can go to the Office of Research Integrity to report misconduct, whistleblowers in Europe have no external authority to whom they can appeal to investigate cases of fraud.
"They have to go to their employer, who has a vested interest in covering up cases of misconduct," he says.
Science is increasingly international, Schneider notes, and major studies can include collaborators from several different countries. He suggests there should be an international body, accessible to all researchers, to investigate suspected fraud.
Ultimately, says Rosenberg, the scientific system must incorporate trust. "You trust co-authors when you write a paper, and peer reviewers at journals trust that scientists at research institutions like Karolinska are acting with integrity."
Without trust, the whole system falls apart. It's the trust of the public, an elusive asset once it has been betrayed, that science depends upon for its very existence. Scientific research is overwhelmingly financed by tax dollars, and the need for the goodwill of the public is more than an abstraction.
The Macchiarini affair raises a profound question of trust and responsibility: Should multiple co-authors be held responsible for a lead author's misconduct?
Karolinska apparently believes so. When the institution at last owned up to the scandal, it vindictively found Karl-Henrik Grinnemo, one of the whistleblowers, guilty of scientific misconduct as well. It also designated two other whistleblowers as "blameworthy" for their roles as co-authors of the papers on which Macchiarini was the lead author.
As a result, the whistleblowers' reputations and employment prospects have become collateral damage. Accusations of research misconduct can be a career killer. Research grants dry up, employment opportunities evaporate, publishing becomes next to impossible, and collaborators vanish into thin air.
Grinnemo contends that co-authors should only be responsible for their discrete contributions, not for the data supplied by others.
"Different aspects of a paper are highly specialized," he says, "and that's why you have multiple authors. You cannot go through every single bit of data because you don't understand all the parts of the article."
This is especially true in multidisciplinary, translational research, where there are sometimes 20 or more authors. "You have to trust co-authors, and if you find something wrong you have to notify all co-authors. But you couldn't go through everything or it would take years to publish an article," says Grinnemo.
Though the pressures facing scientists are very real, the problem of misconduct is not inevitable. Along with increased support from governments and industry, a change in academic culture that emphasizes quality over quantity of published studies could help encourage meritorious research.
But beyond that, trust will always play a role when numerous specialists unite to achieve a common goal: the accumulation of knowledge that will promote human health, wealth, and well-being.
[Correction: An earlier version of this story mistakenly credited The New York Times with breaking the news of the Anversa retractions, rather than Retraction Watch and STAT, which jointly published the exclusive on October 14th. The piece in the Times ran on October 15th. We regret the error.]
Taboo topics occupy a difficult place in the history of medicine. Society has long been reticent about confronting stigmatized conditions, forcing many patients to suffer in silence and isolation, often with poorer care.
AIDS activists recognized this in the 1980s when they coined the phrase Silence = Death to generate public debate and action over a growing epidemic that until then had existed largely in the shadows. The slogan and the activists behind it were remarkably successful at changing the public discourse.
It is not a lone example. Post-World War II medicine is better because it came to deal more forthrightly with a broad range of medical conditions from conception/abortion, to cancer, to sexually transmitted infections. The most recent issue to face such scrutiny is physician-assisted dying (PAD).
"Classically, doctors don't purposely kill people…that is really the core of the resistance" to PAD from the provider perspective, says Neil Wenger, an internist and ethicist at the University of California Los Angeles who focuses on end-of-life issues.
But from the patient perspective, the option of PAD "provides important psychological benefits ... because it gives the terminally ill autonomy, control, and choice," argued the American Public Health Association in support of Oregon's death with dignity legislation.
Jack Kevorkian, "Dr. Death," was one of the first to broach the subject when few in polite society were willing to do so. The modern era truly began twenty years ago when the citizens of Oregon embraced the option of death with dignity in a public referendum, over the objections of their political leaders.
Expansion of the legal option in North America was incremental until 2016 when the Supreme Court in Canada and legislators in California decided that control over one's body extended to death, at least under certain explicit conditions.
An estimated 18 percent of Americans now live in jurisdictions that provide the legal option of assisted death, but exercising that right can be difficult. Only a fraction of one percent of deaths are by PAD, even in Oregon.
Stakeholder Roles
Few organizations of healthcare professionals in the U.S. support PAD; some actively oppose it, while others have shifted to a position of neutrality as they study the issue.
But once a jurisdiction makes the political/legal decision that patients have a right to physician-assisted death, what are the roles and responsibilities of medical stakeholders? Can they simply opt out in a vow of silence? Or do organizations bear some sort of obligation to ensure access to that right, no matter their own position, particularly when they are both regulated by and receive operating funds from public sources?
The law in California and other U.S. jurisdictions reflects ambivalence about PAD by treating it differently from other medical practices, says David Magnus, an ethicist at Stanford University School of Medicine. It is allowed but "it's intentionally a very, very burdensome process."
Medical decisions, including withdrawing life support or a do not resuscitate [DNR] order, are between a physician and the patient or guardian. But PAD requires outside consultation and documentation that is quite rigorous, even burdensome, Magnus explains. He recalls one phone consult with a physician who had to repeat a conversation with a patient at home in order to meet the regulatory requirements for a request for assistance in dying. "So it is not surprising that it is utilized so infrequently."
The federal government has erected its own series of barriers. Roused by the experience in Oregon, opponents tried to ban PAD at the national level. They failed but did the next best thing; they prohibited use of federal funds to pay for or even discuss PAD. That includes Medicare, Medicaid, and the large health delivery systems run by the Pentagon and Veterans Affairs. The restrictions parallel those on federal funding for access to abortion and medical marijuana.
Even physicians who support and perform PAD are reluctant to talk about it. They are unwilling to initiate the discussion with patients, says Mara Buchbinder, a bioethicist at the University of North Carolina at Chapel Hill who has interviewed physicians, patients, and families about their experience with assisted dying in Vermont.
"There is a stigma for health care workers to talk about this; they feel that they are not supported," says Buchbinder. She relates how one doctor wanted to organize a discussion of PAD at his hospital, but administrators forbade it. And when the drug used to carry out the procedure became prohibitively expensive, other physicians were not aware of alternatives.
"This just points to large inadequacies in medical preparation around end-of-life conversations," says Buchbinder, a view endorsed by many experts interviewed for this article.
These inadequacies are reinforced when groups like the Coalition to Transform Advanced Care (C-TAC), a 140-member organizational alliance that champions improved end-of-life care, dodges the issue. A spokesman said simply, PAD "is not within the scope of our work."
The American Medical Association has had a policy in place opposing PAD since 1993. Two years ago, its House of Delegates voted to reevaluate the association's position in light of evolving circumstances. Earlier this year the Council on Ethical and Judicial Affairs recommended continued opposition, but in June, the House of Delegates rejected that recommendation (56 to 44 percent) and directed the Council to keep studying the issue.
Kaiser Permanente has provided assisted dying to its members in multiple states beginning with Oregon and has done "a wonderful job" according to supporters of PAD. But it has declined to discuss those activities publicly despite a strenuous effort to get them to do so.
Rather than drawing upon formal structures for leadership and guidance, doctors who are interested in learning more about PAD are turning to the ad hoc wisdom of providers from Oregon and Washington who have prior experience. Magnus compares it with what usually happens when a new intervention or technology comes down the pike: "People who have done it, have mastered it, pass that knowledge on to other people so they know how to do it."
Buchbinder says it becomes an issue of social justice when providers are not adequately trained, and when patients are not ordinarily offered the option of a medical service in jurisdictions where it is their right.
Legalization of PAD "does not guarantee practical access, and well-intentioned policies designed to protect vulnerable groups may at times reinforce or exacerbate health care inequalities," she says. Only those with the economic and social capital and network of advocates will succeed in exercising this option.
O Canada
Canada provides a case study of how one might address PAD. Canadians largely settled on the term medical aid in dying -- often shortened to MAID -- as the more neutral phrase for their law and civil discourse.
The Canadian Medical Association (CMA) decided early on to thread the needle; to not take a position on the core issue of morality but to proactively foster public discussion of those issues as the legal challenge to the ban on assisted dying headed to that country's Supreme Court.
"We just felt that it was too important for the profession to sit on the sidelines and not be part of the discussion," says Jeff Blackmer, CMA's vice president for medical professionalism.
It began by shifting the focus of discussion from a yes/no on the morality of MAID to the questions of, "If the court rules that the current laws are unconstitutional, and they allow assisted dying, how should the profession react and how should we respond? And how does the public think that the profession should respond?"
The CMA teamed up with Maclean's magazine to host a series of five town hall meetings throughout the country. Assisted dying was discussed in a context of palliative care, advanced care planning, and other end-of-life issues.
There was fear that MAID might raise the passions, and even violence, that have been seen in recent controversies over abortion. "I had to wear a flak jacket, a bulletproof vest, and there were plainclothes police officers with guns in the audience because it is really, really very controversial," Blackmer recalls. Thankfully there were no major incidents.
The CMA also passed a resolution at its annual meeting supporting the right of its members to opt out of participating in MAID, within the confines of whatever law might emerge.
Once legislation and regulations began taking shape, the CMA created training materials on the ethical, legal, and practical considerations that doctors and patients might face. It ordinarily does not get involved with clinical education and training.
Stefanie Green is president of the Canadian Association of MAID Assessors & Providers, a professional medical association that supports those working in the area of assisted dying, educates the public and health care community, and provides leadership on setting medical standards. Green acknowledges the internal pressures the CMA faced, and says, "I do understand their stance is as positive as it gets for medical associations."
Back in the USofA
Prohibitionism – the just say no approach – does not work when a substantial number of people want something, as demonstrated with alcohol, marijuana, opioids for pain relief, and reproductive control. Reason suggests a harm reduction strategy is the more viable approach.
"Right now we're stuck in the worst of all worlds because we've made [PAD] sort of part of medicine, but sort of illicit and sort of shameful. And we sort of allow it, but we sort of don't, we make it hard," says Stanford's Magnus. "And that's a no man's land where we are stuck."
Imagine this scenario: A couple is involved in a heated custody dispute over their only child. As part of the effort to make the case of being a better guardian, one parent goes on a "genetic fishing expedition": this parent obtains a DNA sample from the other parent with the hope that such data will identify some genetic predisposition to a psychiatric condition (e.g., schizophrenia) and tilt the judge's custody decision in his or her favor.
This is an example of how "behavioral genetic evidence" -- an umbrella term for information gathered from family history and genetic testing about pathological behaviors, including psychiatric conditions -- may in the future be brought by litigants in court proceedings. Such evidence has been discussed primarily when criminal defendants sought to introduce it to make the claim that they are not responsible for their behavior or to justify their request for reduced sentencing and more lenient punishment.
However, civil cases are an emerging frontier for behavioral genetic evidence. It has already been introduced in tort litigation, such as personal injury claims, and as knowledge of psychiatric genetics is growing, it is further likely to be introduced in other civil cases, such as child custody disputes and education-related cases. But the introduction of such evidence raises a tangle of ethical and legal questions that civil courts will need to address. For example: how should such data be obtained? Who should get to present it and under what circumstances? And does the use of such evidence fit with the purposes of administering justice?
How Did We Get Here?
That behavioral genetic evidence is entering courts is unsurprising. Scientific evidence is a common feature of judicial proceedings, and genetic information may reveal relevant findings. For example, genetic evidence may elucidate whether a child's medical condition is due to genetic causes or medical malpractice, and it has been routinely used to identify alleged offenders or putative fathers. But behavioral genetic evidence is different from such other genetic data – it is shades of gray, instead of black and white.
Although efforts to understand the nature and origins of human behavior are ongoing, existing and likely future knowledge about behavioral genetics is limited. Behavioral disorders are highly complex and diverse. They commonly involve not one but multiple genes, each with a relatively small effect. They are shaped by many as-yet-unknown interactions among genes and familial and environmental factors such as poverty and childhood adversity.
And a specific gene variant may be associated with more than one behavioral disorder and be manifested with significantly different symptoms. Thus, biomarkers about "predispositions" for behavioral disorders cannot generally provide a diagnosis or an accurate estimate of whether, when, and at what severity a behavioral disorder will occur. And, unlike genetic testing that can confirm litigants' identity with 99.99% probability, behavioral genetic evidence is far more speculative.
Whether judges, jurors, and other experts understand the nuances of behavioral genetics is unclear. Many people overestimate the deterministic nature of genetics and underestimate the role of environments, especially with regard to mental health status. The U.S. individualistic culture of self-reliance and independence may further tilt the judicial scales because litigants in civil courts may be unjustly blamed for their "bad genes" while structural and societal determinants that lead to poor behavioral outcomes are ignored.
These concerns were recently captured in the Netflix series "13 Reasons Why," which depicts a negligence lawsuit brought against a school by the parents of Hannah, a high-school student there who committed suicide. The legal tides shifted from the school's negligence in tolerating a culture of bullying to parental responsibility once cross-examination of Hannah's mother revealed a family history of anxiety, and the possibility that Hannah had a predisposition for mental illness, which (arguably) required therapy even in the absence of clear symptoms.
Where Is This Going?
The concerns are exacerbated given the ways in which behavioral genetic evidence may come to court in the future. One way is through "genetic theft," where genetic evidence is obtained from discarded property, such as soft-drink cans. This method is often used for identification purposes in criminal and paternity proceedings, and it will likely expand to behavioral genetic data once such data is available through the "home kits" offered by direct-to-consumer companies.
Genetic theft raises questions about whose behavioral data are being obtained, by whom, and with what authority. In the scenario of child-custody dispute, for example, the sequencing of the other parent's DNA will necessarily intrude on the privacy of that parent, even as the scientific value of such information is limited. A parent on a "genetic fishing expedition" can also secretly sequence their child for psychiatric genetic predispositions, arguably, in order to take preventative measures to reduce the child's risk for developing a behavioral disorder. But should a parent be allowed to sequence the child without the other parent's consent, or regardless of whether the results will provide medical benefits to the child?
Similarly, although schools are required to identify children with behavioral disabilities and to evaluate their educational needs -- and may be held accountable for failing to do so -- some parents may decline their child's evaluation by mental health professionals. Should schools secretly obtain a sample and sequence children for behavioral disorders, regardless of parental consent? My study of parents found that the overwhelming majority opposed imposed genetic testing by school authorities. But should parental preference or the child's best interests be the determinative factor? Alternatively, could schools use secretly obtained genetic data as a defense that they are fulfilling the child-find requirement under the law?
In general, samples obtained through genetic theft may not meet the legal requirements for admissible evidence, and as these examples suggest, they also involve privacy infringement that may be unjustified in civil litigation. But their introduction in courts may influence judicial proceedings. It is hard to disregard such evidence even if decision-makers are told to ignore it.
The costs associated with genetic testing may further intensify power differences among litigants. Because not everyone can pay for DNA sequencing, there is a risk that those with more resources will be "better off" in court proceedings. Simultaneously, the stigma associated with behavioral disorders may intimidate some people enough that they back down from just claims. For example, a good parent may give up a custody claim to avoid disclosure of his or her genetic predispositions for psychiatric conditions. Regulating this area of law is necessary to prevent misuses of scientific technologies and to ensure that powerful actors do not have an unfair advantage over weaker litigants.
Behavioral genetic evidence may also enter the courts through subpoena of data obtained in clinical, research or other commercial genomic settings such as ancestry testing (similar to the genealogy database recently used to identify the Golden State Killer). Although court orders to testify or present evidence are common, their use for obtaining behavioral genetic evidence raises concerns.
One worry is that it may be over-intrusive. Because genetic predispositions are heritable, such data may reveal information not only about the individual litigant but also about other family members, who may subsequently be stigmatized as well. And, even if we assume that many people may be willing for their data in genomic databases to be used to identify relatives who committed crimes (e.g., a rapist or a murderer), we can't assume the same for civil litigation, where the public interest in disclosure is far weaker.
Another worry is that it may deter people from participating in activities that society has an interest in advancing, including medical treatment involving genetic testing and genomic research. To address this concern, existing policy provides expanded privacy protections for NIH-funded genomic research by automatically issuing a Certificate of Confidentiality that prohibits disclosure of identifiable information in any Federal, State, or local civil, criminal, and other legal proceedings.
But this policy has limitations. It applies only to specific research settings and does not cover non-NIH funded research or clinical testing. The Certificate's protections can also be waived under certain circumstances. People who volunteer to participate in non-NIH-funded genomic research for the public good may thus find themselves worse-off if embroiled in legal proceedings.
Consider the following: if a parent in a child custody dispute had participated in a genetic study on schizophrenia years earlier, should the genetic results be subpoenaed by the court – and weaponized by the other parent? Public policy should aim to reduce the risks for such individuals. The end of obtaining behavioral genetic evidence cannot, and should not, always justify the means.