Researchers Behaving Badly: Known Frauds Are "the Tip of the Iceberg"
Last week, the whistleblowers in the Paolo Macchiarini affair at Sweden's Karolinska Institutet went on the record here to detail the retaliation they suffered for trying to expose a star surgeon's appalling research misconduct.
The whistleblowers had discovered that in six published papers, Macchiarini falsified data, lied about the condition of patients and circumvented ethical approvals. As a result, multiple patients suffered and died. But Karolinska turned a blind eye for years.
Scientific fraud of the type committed by Macchiarini is rare, but studies suggest that it's on the rise. Just this week, for example, Retraction Watch and STAT together broke the news that a Harvard Medical School cardiologist and stem cell researcher, Piero Anversa, falsified data in a whopping 31 papers, which now have to be retracted. Anversa had claimed that he could regenerate heart muscle by injecting bone marrow cells into damaged hearts, a result that no one has been able to duplicate.
A 2009 study published in PLOS ONE, a journal of the Public Library of Science (PLOS), found that about two percent of scientists admitted to committing fabrication, falsification or plagiarism in their work. That's a small number, but up to one third of scientists admit to engaging in "questionable research practices" that fall into a gray area between rigorous accuracy and outright fraud.
These dubious practices may include misrepresentations, research bias, and inaccurate interpretations of data. One common questionable practice is formulating a hypothesis after the research is done, so that the results appear to confirm a prediction made in advance. Another highly questionable practice that can shape research is ghost-authoring by representatives of the pharmaceutical industry and other for-profit fields. Still another is gifting co-authorship to unqualified but powerful individuals who can advance one's career. Such practices can unfairly bolster a scientist's reputation and increase the likelihood of getting the work published.
The above percentages represent what scientists admit to doing themselves; when they evaluate the practices of their colleagues, the numbers jump dramatically. In a 2012 study published in the Journal of Research in Medical Sciences, researchers estimated that 14 percent of other scientists commit serious misconduct, while up to 72 percent engage in questionable practices. While these are only estimates, the problem is clearly not one of just a few bad apples.
In the PLOS study, Daniele Fanelli says that increasing evidence suggests the known frauds are "just the 'tip of the iceberg,' and that many cases are never discovered" because fraud is extremely hard to detect.
In addition, it's likely that most cases of scientific misconduct go unreported because of the high price of whistleblowing. Those in the Macchiarini case showed extraordinary persistence in their multi-year campaign to stop his deadly trachea implants, while suffering serious damage to their careers. Such heroic efforts to unmask fraud are probably rare.
To make matters worse, there are numerous players in the scientific world who may be complicit in either committing misconduct or covering it up. These include not only primary researchers but co-authors, institutional executives, journal editors, and industry leaders. Essentially everyone wants to be associated with big breakthroughs, and they may overlook scientifically shaky foundations when a major advance is claimed.
Another part of the problem is that it's rare for students in science and medicine to receive an education in ethics. And studies have shown that older, more experienced and possibly jaded researchers are more likely to fudge results than their younger, more idealistic colleagues.
So, given the steep price that individuals and institutions pay for scientific misconduct, what compels them to go down that road in the first place? According to the JRMS study, individuals face intense pressures to publish and to attract grant money in order to secure teaching positions at universities. Once they have acquired positions, the pressure is on to keep the grants and publishing credits coming in order to obtain tenure, be appointed to positions on boards, and recruit flocks of graduate students to assist in research. And not to be underestimated is the human ego.
Paolo Macchiarini is an especially vivid example of a scientist seeking not only fortune, but fame. He liberally (and falsely) claimed powerful politicians and celebrities, even the Pope, as patients or admirers. He may be an extreme example, but we live in an age of celebrity scientists who bring huge amounts of grant money and high prestige to the institutions that employ them.
The media plays a significant role in both glorifying stars and unmasking frauds. In the Macchiarini scandal, the media first lifted him up, as in NBC's laudatory documentary, "A Leap of Faith," which painted him as a kind of miracle-worker, and then brought him down, as in the January 2016 documentary, "The Experiments," which chronicled the agonizing death of one of his patients.
Institutions can also play a crucial role in scientific fraud by putting more emphasis on the number and frequency of papers published than on their quality. The whole course of a scientist's career is profoundly affected by something called the h-index, a metric that rewards both productivity and citations: a researcher has an h-index of h if h of his or her papers have each been cited at least h times by other researchers. Raising one's h-index becomes an overriding goal, sometimes eclipsing the kind of patient, time-consuming research that leads to true breakthroughs based on reliable results.
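For readers curious about the mechanics, the calculation is simple. Here is a minimal sketch in Python, using made-up citation counts purely for illustration:

```python
def h_index(citation_counts):
    """Return the h-index: the largest h such that h papers
    have been cited at least h times each."""
    counts = sorted(citation_counts, reverse=True)
    h = 0
    for rank, citations in enumerate(counts, start=1):
        if citations >= rank:
            h = rank
        else:
            break
    return h

# Hypothetical record: six papers cited 10, 8, 5, 4, 3, and 0 times
print(h_index([10, 8, 5, 4, 3, 0]))  # -> 4 (four papers each have at least 4 citations)
```

A single integer thus stands in for both productivity and impact, which helps explain why raising it can become an end in itself.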
Universities also create a high-pressured environment that encourages scientists to cut corners. They, too, place a heavy emphasis on attracting large monetary grants and accruing fame and prestige. This can lead them, just as it led Karolinska, to protect a star scientist's sloppy or questionable research. According to Dr. Andrew Rosenberg, who is director of the Center for Science and Democracy at the U.S.-based Union of Concerned Scientists, "Karolinska defended its investment in an individual as opposed to the long-term health of the institution. People were dying, and they should have outsourced the investigation from the very beginning."
Having institutions investigate their own practices is a conflict of interest from the get-go, says Rosenberg.
Scientists, universities, and research institutions are also not immune to fads. "Hot" subjects attract grant money and confer prestige, incentivizing scientists to shift their research priorities toward whatever is most fundable, sometimes at the expense of their true areas of expertise and interest. In Macchiarini's case, he was allegedly at the forefront of the currently sexy field of regenerative medicine -- a field in which Karolinska was making a huge investment.
The relative scarcity of resources intensifies the already significant pressure on scientists. They may want to publish results rapidly, since they face many competitors for limited grant money, academic positions, students, and influence. The scarcity means that a great many researchers will fail while only a few succeed. Once again, the temptation may be to rush research and to show it in the most positive light possible, even if it means fudging or exaggerating results.
Intense competition can have a perverse effect on researchers, according to a 2007 study in the journal Science and Engineering Ethics. Not only does it place undue pressure on scientists to succeed, it frequently leads them to withhold information from colleagues, which undermines a system in which new discoveries build on the previous work of others. Researchers may feel compelled to sit on their results because of the pressure to be the first to publish. The study's authors propose that greater government investment in basic research could alleviate some of these competitive pressures.
Scientific journals, although they play a part in publishing flawed science, can't be expected to investigate cases of suspected fraud, says the German science blogger Leonid Schneider. Schneider's writings helped to expose the Macchiarini affair.
"They just basically wait for someone to retract problematic papers," he says.
He also notes that, while American scientists can go to the Office of Research Integrity to report misconduct, whistleblowers in Europe have no external authority to whom they can appeal to investigate cases of fraud.
"They have to go to their employer, who has a vested interest in covering up cases of misconduct," he says.
Science is increasingly international, and major studies can include collaborators from several different countries. Schneider suggests there should be an international body, accessible to all researchers, that can investigate suspected fraud.
Ultimately, says Rosenberg, the scientific system must incorporate trust. "You trust co-authors when you write a paper, and peer reviewers at journals trust that scientists at research institutions like Karolinska are acting with integrity."
Without trust, the whole system falls apart. And it is the trust of the public -- an asset that is hard to regain once betrayed -- that science depends upon for its very existence. Scientific research is overwhelmingly financed by tax dollars, and the need for the public's goodwill is more than an abstraction.
The Macchiarini affair raises a profound question of trust and responsibility: Should multiple co-authors be held responsible for a lead author's misconduct?
Karolinska apparently believes so. When the institution at last owned up to the scandal, it vindictively found Karl-Henrik Grinnemo, one of the whistleblowers, guilty of scientific misconduct as well. It also designated two other whistleblowers as "blameworthy" for their roles as co-authors of the papers on which Macchiarini was the lead author.
As a result, the whistleblowers' reputations and employment prospects have become collateral damage. Accusations of research misconduct can be a career killer. Research grants dry up, employment opportunities evaporate, publishing becomes next to impossible, and collaborators vanish into thin air.
Grinnemo contends that co-authors should only be responsible for their discrete contributions, not for the data supplied by others.
"Different aspects of a paper are highly specialized," he says, "and that's why you have multiple authors. You cannot go through every single bit of data because you don't understand all the parts of the article."
This is especially true in multidisciplinary, translational research, where there are sometimes 20 or more authors. "You have to trust co-authors, and if you find something wrong you have to notify all co-authors. But you couldn't go through everything or it would take years to publish an article," says Grinnemo.
Though the pressures facing scientists are very real, the problem of misconduct is not inevitable. Along with increased support from governments and industry, a change in academic culture that emphasizes quality over quantity of published studies could help encourage meritorious research.
But beyond that, trust will always play a role when numerous specialists unite to achieve a common goal: the accumulation of knowledge that will promote human health, wealth, and well-being.
[Correction: An earlier version of this story mistakenly credited The New York Times with breaking the news of the Anversa retractions, rather than Retraction Watch and STAT, which jointly published the exclusive on October 14th. The piece in the Times ran on October 15th. We regret the error.]
Here's something to chew on. Can a gulp of water help save the planet? If you're drinking *and* eating your water at the same time, the answer may be yes.
The Lowdown
A start-up company called Skipping Rocks Lab has created a "water bubble" encased in an edible sachet that you can pop in your mouth whole. Or if you're not into swallowing it, you can tear off the edge, drink up, and toss the rest in a composter. The tasteless packaging is made from brown seaweed that biodegrades naturally in four to six weeks, whereas plastic water bottles can linger for hundreds of years.
The founders of the London-based company are determined to "make plastic packaging disappear." They had two foodie inspirations: molecular gastronomy and fruit. They set out to emulate the way chefs use edible membranes to encase bubbles of liquid in creations like fake caviar and fake egg yolks, and they also considered the peel of an orange or banana, which protects the tasty insides but can be composted.
The sachets can also contain other liquids that come in single-serve plastic containers -- think packets of condiments with takeout meals, specialty cocktails at parties, and especially single servings of water for sporting events. The London Marathon last month gave out the water bubble pods at a station along the route, using them to replace 200,000 plastic bottles that would have likely ended up first in the street, and ultimately in the ocean.
Next Up
The engineers and chemists at Skipping Rocks intend to lease their machines to others who can then manufacture their own sachets on-site to fill with whatever they desire. The new material, which is dubbed "Notpla" (not plastic), also has other applications beyond holding liquids. It can be used to replace the plastic lining in cardboard takeout boxes, for example. And the startup is working on additional materials to replace other types of ubiquitous plastic packaging, like the netting that encases garlic and onions, and the sachets that hold nails and screws.
Open Questions
One hurdle is that the pods are not very hardy, so while they work fine to hand out along a marathon route, they wouldn't really be viable for a hiker to throw in her backpack. Another issue concerns the retail market: to be stable on a shelf, they'd have to be protected from all that handling, which brings us back to the problem the engineers tried to solve in the first place -- disposable packaging.
So while Skipping Rocks may not achieve their ultimate goal of ridding the world of plastic waste, a little progress can still go a long way. If edible water bubbles are the future of drinks at sporting events and festivals, the environment will certainly benefit from their presence -- and absence.
The Internet has made it easier than ever to misguide people. The anti-vaxx movement, climate change denial, protests against stem cell research, and other movements like these are rooted in the spread of misinformation and a distrust of science.
"I had been taught intelligent design and young-earth creationism instead of evolution, geology, and biology."
Science illiteracy is pervasive in the communities responsible for these movements. For the mainstream, the challenge lies not in sharing the facts, but in combating the spread of misinformation and facilitating an open dialogue between experts and nonexperts.
I grew up in a household that was deeply skeptical of science and medicine. My parents are evangelical Christians who believe the word of the Bible is law. To protect my four siblings and me from secular influence, they homeschooled some of us and put the others in private Christian schools. When my oldest brother left for a Christian college and the tuition began to add up, I was placed in a public charter school to offset the costs.
There, I became acutely aware of my ignorant upbringing. I had been taught intelligent design and young-earth creationism instead of evolution, geology, and biology. My mother skipped over world religions, and much of my history curriculum was more biblical-based than factual. She warned me that stem cell research, vaccines, genetic modification of crops, and other areas of research in biological science were examples of humans trying to be like God. At the time, biologist Richard Dawkins' The God Delusion was a bestseller and science seemed like an excuse to not believe in God, so she and my father discouraged me from studying it.
The gaps in my knowledge left me feeling frustrated and embarrassed. The solution was to learn about the things that had been censored from my education, but several obstacles stood in the way.
"When I first learned about fundamentalism, my parents' behavior finally made sense."
I lacked a good foundation in basic mathematics after being taught by my mother, who never graduated college. My father, who holds a graduate degree in computer science, repeatedly told me that I inherited my mother's "bad math genes" and was therefore ill-equipped for science. While my brothers excelled at math under his supervision and were even encouraged toward careers in engineering and psychology, I was expected to do well in other subjects, such as literature. When I tried to change this by enrolling in honors math and science classes, they scolded me -- so reluctantly, I dropped math. By the time I graduated high school, I was convinced that math and science were beyond me.
When I look back at my high school transcripts, I can see that the sense of failure was unfounded: my grades were mostly A's and B's, and I excelled in honors biology. Even my elementary standardized test scores don't reflect a student disinclined toward STEM; I consistently scored in the top percentile for sciences. Teachers often encouraged me to consider studying science in college. Why then, I wondered, did my parents reject that idea? Why did they work so hard to sway me from that path? It wasn't until I moved away from my parents' home and started working to put myself through community college that I discovered my passion for both biology and science writing.
As a young adult venturing into the field of science communication, I've become fascinated with understanding communities that foster antagonistic views toward science. When I first learned about fundamentalism, my parents' behavior finally made sense. It is the foundation of the Religious Right, a right-wing Christian group which heavily influences the Republican party in the United States. The Religious Right crusades against secular education, stem cell research, abortion, evolution, and other controversial issues in science and medicine on the basis that they contradict Christian beliefs. They are quietly overturning the separation of church and state in order to enforce their religion as policy -- at the expense of science and progress.
Growing up in this community, I learned that strong feelings about these issues arise from both a lack of science literacy and a distrust of experts. Those who are against genetic modification of crops don't understand that GMO research aims to produce more, and longer-lasting, food for a growing planet. The anti-vaxx movement is still relying on a deeply flawed study that was ultimately retracted. Those who are against stem cell research don't understand how it works or the important benefits it provides the field of medicine, such as discovering new treatment methods.
In fact, at one point the famous Christian radio show Focus on the Family spread anti-vaxx mentality when they discussed vaccines that, long ago, were derived from aborted fetal cells. Although Focus on the Family now endorses vaccines, at the time it was enough to convince my own mother, who listened to the show every morning, not to vaccinate us unless the law required it.
"In everyday interactions with skeptics, science communicators need to shift their focus from convincing to discussing."
We can help clear up misunderstandings by sharing the facts, but the real challenge lies in willful ignorance. It was hard for me to accept, but I've come to understand that I'm not going to change anyone's mind. It's up to an individual to evaluate the facts, consider the arguments for and against, and make his or her own decision.
As my parents grew older and my siblings and I introduced them to basic concepts in science, they came around to trusting the experts a little more. They now see real doctors instead of homeopathic practitioners. They acknowledge our world's changing climate instead of denying it. And they even applaud two of their children for pursuing careers in science. Although they have held on to their fundamentalism and we still disagree on many issues, these basic changes give me hope that people in deeply skeptical communities are not entirely out of reach.
In everyday interactions with skeptics, science communicators need to shift their focus from convincing to discussing. This means creating an open dialogue with the intention of being understanding and helpful, not persuasive. This approach can be beneficial in both personal and online interactions. There are people within these movements who have doubts, and their doubts will grow as we continue to feed them through discussion.
People will only change their minds when it is the right time for them to do so. We need to be there ready to hold their hand and lead them toward truth when they reach out. Until then, all we can do is keep the channels of communication open, keep sharing the facts, and fight the spread of misinformation. Science is the pursuit of truth, and as scientists and science communicators, sometimes we need to let the truth speak for itself. We're just there to hold the megaphone.