Researchers Behaving Badly: Known Frauds Are "the Tip of the Iceberg"
Last week, the whistleblowers in the Paolo Macchiarini affair at Sweden's Karolinska Institutet went on the record here to detail the retaliation they suffered for trying to expose a star surgeon's appalling research misconduct.
The whistleblowers had discovered that in six published papers, Macchiarini falsified data, lied about the condition of patients and circumvented ethical approvals. As a result, multiple patients suffered and died. But Karolinska turned a blind eye for years.
Scientific fraud of the type committed by Macchiarini is rare, but studies suggest that it's on the rise. Just this week, for example, Retraction Watch and STAT together broke the news that a Harvard Medical School cardiologist and stem cell researcher, Piero Anversa, falsified data in a whopping 31 papers, which now have to be retracted. Anversa had claimed that he could regenerate heart muscle by injecting bone marrow cells into damaged hearts, a result that no one has been able to duplicate.
A 2009 study published in the Public Library of Science (PLOS) found that about two percent of scientists admitted to committing fabrication, falsification or plagiarism in their work. That's a small number, but up to one third of scientists admit to committing "questionable research practices" that fall into a gray area between rigorous accuracy and outright fraud.
These dubious practices may include misrepresentations, research bias, and inaccurate interpretations of data. One common questionable research practice entails formulating a hypothesis after the research is done in order to claim a successful premise. Another highly questionable practice that can shape research is ghost-authoring by representatives of the pharmaceutical industry and other for-profit fields. Still another is gifting co-authorship to unqualified but powerful individuals who can advance one's career. Such practices can unfairly bolster a scientist's reputation and increase the likelihood of getting the work published.
The above percentages represent what scientists admit to doing themselves; when they evaluate the practices of their colleagues, the numbers jump dramatically. In a 2012 study published in the Journal of Research in Medical Sciences, researchers estimated that 14 percent of other scientists commit serious misconduct, while up to 72 percent engage in questionable practices. While these are only estimates, the problem is clearly not one of just a few bad apples.
In the PLOS study, Daniele Fanelli says that increasing evidence suggests the known frauds are "just the 'tip of the iceberg,' and that many cases are never discovered" because fraud is extremely hard to detect.
In addition, it's likely that most cases of scientific misconduct go unreported because of the high price of whistleblowing. Those in the Macchiarini case showed extraordinary persistence in their multi-year campaign to stop his deadly trachea implants, while suffering serious damage to their careers. Such heroic efforts to unmask fraud are probably rare.
To make matters worse, there are numerous players in the scientific world who may be complicit in either committing misconduct or covering it up. These include not only primary researchers but co-authors, institutional executives, journal editors, and industry leaders. Essentially everyone wants to be associated with big breakthroughs, and they may overlook scientifically shaky foundations when a major advance is claimed.
Another part of the problem is that it's rare for students in science and medicine to receive an education in ethics. And studies have shown that older, more experienced and possibly jaded researchers are more likely to fudge results than their younger, more idealistic colleagues.
So, given the steep price that individuals and institutions pay for scientific misconduct, what compels them to go down that road in the first place? According to the JRMS study, individuals face intense pressures to publish and to attract grant money in order to secure teaching positions at universities. Once they have acquired positions, the pressure is on to keep the grants and publishing credits coming in order to obtain tenure, be appointed to positions on boards, and recruit flocks of graduate students to assist in research. And not to be underestimated is the human ego.
Paolo Macchiarini is an especially vivid example of a scientist seeking not only fortune, but fame. He liberally (and falsely) claimed powerful politicians and celebrities, even the Pope, as patients or admirers. He may be an extreme example, but we live in an age of celebrity scientists who bring huge amounts of grant money and high prestige to the institutions that employ them.
The media plays a significant role in both glorifying stars and unmasking frauds. In the Macchiarini scandal, the media first lifted him up, as in NBC's laudatory documentary, "A Leap of Faith," which painted him as a kind of miracle-worker, and then brought him down, as in the January 2016 documentary, "The Experiments," which chronicled the agonizing death of one of his patients.
Institutions can also play a crucial role in scientific fraud by putting more emphasis on the number and frequency of papers published than on their quality. The whole course of a scientist's career is profoundly affected by something called the h-index. This is a number based on both how many papers a scientist has published and how often those papers have been cited by other researchers. Raising one's h-index becomes an overriding goal, sometimes eclipsing the kind of patient, time-consuming research that leads to true breakthroughs based on reliable results.
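The metric itself is simple to compute: an author's h-index is the largest number h such that h of their papers have each been cited at least h times. A minimal sketch (illustrative only; the `h_index` helper is hypothetical, and real indexing services such as Scopus or Google Scholar apply their own data-cleaning rules):

```python
def h_index(citation_counts):
    """Return the largest h such that h papers each have at least h citations."""
    counts = sorted(citation_counts, reverse=True)  # most-cited papers first
    h = 0
    for rank, citations in enumerate(counts, start=1):
        if citations >= rank:  # the rank-th paper still has >= rank citations
            h = rank
        else:
            break
    return h

# Example: five papers cited 10, 8, 5, 4, and 3 times yield an h-index of 4,
# because four papers have at least four citations each, but not five with five.
print(h_index([10, 8, 5, 4, 3]))  # prints 4
```

Note how the metric rewards a sustained stream of well-cited papers: a single landmark paper with a thousand citations still contributes only one point toward h.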
Universities also create a high-pressure environment that encourages scientists to cut corners. They, too, place a heavy emphasis on attracting large monetary grants and accruing fame and prestige. This can lead them, just as it led Karolinska, to protect a star scientist's sloppy or questionable research. According to Dr. Andrew Rosenberg, who is director of the Center for Science and Democracy at the U.S.-based Union of Concerned Scientists, "Karolinska defended its investment in an individual as opposed to the long-term health of the institution. People were dying, and they should have outsourced the investigation from the very beginning."
Having institutions investigate their own practices is a conflict of interest from the get-go, says Rosenberg.
Scientists, universities, and research institutions are also not immune to fads. "Hot" subjects attract grant money and confer prestige, incentivizing scientists to shift their research priorities in a direction that garners more grants. This can mean neglecting the scientist's true area of expertise and interests in favor of a subject that's more likely to attract grant money. In Macchiarini's case, he was allegedly at the forefront of the currently sexy field of regenerative medicine -- a field in which Karolinska was making a huge investment.
The relative scarcity of resources intensifies the already significant pressure on scientists. They may want to publish results rapidly, since they face many competitors for limited grant money, academic positions, students, and influence. The scarcity means that a great many researchers will fail while only a few succeed. Once again, the temptation may be to rush research and to show it in the most positive light possible, even if it means fudging or exaggerating results.
Intense competition can have a perverse effect on researchers, according to a 2007 study in the journal Science and Engineering Ethics. Not only does it place undue pressure on scientists to succeed, it frequently leads to the withholding of information from colleagues, which undermines a system in which new discoveries build on the previous work of others. Researchers may feel compelled to withhold their results because of the pressure to be the first to publish. The study's authors propose that more investment in basic research from governments could alleviate some of these competitive pressures.
Scientific journals, although they play a part in publishing flawed science, can't be expected to investigate cases of suspected fraud, says the German science blogger Leonid Schneider. Schneider's writings helped to expose the Macchiarini affair.
"They just basically wait for someone to retract problematic papers," he says.
He also notes that, while American scientists can go to the Office of Research Integrity to report misconduct, whistleblowers in Europe have no external authority to whom they can appeal to investigate cases of fraud.
"They have to go to their employer, who has a vested interest in covering up cases of misconduct," he says.
Science is increasingly international, and major studies can include collaborators from several different countries. Schneider suggests there should be an international body, accessible to all researchers, that will investigate suspected fraud.
Ultimately, says Rosenberg, the scientific system must incorporate trust. "You trust co-authors when you write a paper, and peer reviewers at journals trust that scientists at research institutions like Karolinska are acting with integrity."
Without trust, the whole system falls apart. It's the trust of the public, an elusive asset once it has been betrayed, that science depends upon for its very existence. Scientific research is overwhelmingly financed by tax dollars, and the need for the goodwill of the public is more than an abstraction.
The Macchiarini affair raises a profound question of trust and responsibility: Should multiple co-authors be held responsible for a lead author's misconduct?
Karolinska apparently believes so. When the institution at last owned up to the scandal, it vindictively found Karl-Henrik Grinnemo, one of the whistleblowers, guilty of scientific misconduct as well. It also designated two other whistleblowers as "blameworthy" for their roles as co-authors of the papers on which Macchiarini was the lead author.
As a result, the whistleblowers' reputations and employment prospects have become collateral damage. Accusations of research misconduct can be a career killer. Research grants dry up, employment opportunities evaporate, publishing becomes next to impossible, and collaborators vanish into thin air.
Grinnemo contends that co-authors should only be responsible for their discrete contributions, not for the data supplied by others.
"Different aspects of a paper are highly specialized," he says, "and that's why you have multiple authors. You cannot go through every single bit of data because you don't understand all the parts of the article."
This is especially true in multidisciplinary, translational research, where there are sometimes 20 or more authors. "You have to trust co-authors, and if you find something wrong you have to notify all co-authors. But you couldn't go through everything or it would take years to publish an article," says Grinnemo.
Though the pressures facing scientists are very real, the problem of misconduct is not inevitable. Along with increased support from governments and industry, a change in academic culture that emphasizes quality over quantity of published studies could help encourage meritorious research.
But beyond that, trust will always play a role when numerous specialists unite to achieve a common goal: the accumulation of knowledge that will promote human health, wealth, and well-being.
[Correction: An earlier version of this story mistakenly credited The New York Times with breaking the news of the Anversa retractions, rather than Retraction Watch and STAT, which jointly published the exclusive on October 14th. The piece in the Times ran on October 15th. We regret the error.]
Isaac Asimov on the History of Infectious Disease—and How Humanity Learned to Fight Back
[EDITOR'S FOREWORD: Humanity has always faced existential threats from dangerous microbes, and though this is the first pandemic in our lifetimes, it won't be the last our species will ever face. This newly relevant work by beloved sci-fi writer Isaac Asimov, an excerpt from his 1979 book, A Choice of Catastrophes, establishes that reality in its historical context and makes clear how far we have come since ancient times. But by some measures, we are still in the earliest stages of figuring out how to effectively neutralize such threats. Advancing progress as fast as we can—by leveraging all the insights of modern science—offers our best hope for containing this pandemic and those that will inevitably follow.]
Infectious Disease
An even greater danger to humanity than the effect of small, fecund pests on human beings, their food, and their possessions, is their tendency to spread some forms of infectious disease.
Every living organism is subject to disease of various sorts, where disease is defined in its broadest sense as "dis-ease," that is, as any malfunction or alteration of the physiology or biochemistry that interferes with the smooth workings of the organism. In the end, the cumulative effect of malfunctions, misfunctions, nonfunctions, even though much of it is corrected or patched up, produces irreversible damage—we call it old age—and, even with the best care in the world, brings on inevitable death.
There are some individual trees that may live five thousand years, some cold-blooded animals that may live two hundred years, some warm-blooded animals that may live one hundred years, but for each multicellular individual death comes as the end.
This is an essential part of the successful functioning of life. New individuals constantly come into being with new combinations of chromosomes and genes, and with mutated genes, too. These represent new attempts, so to speak, at fitting the organism to the environment. Without the continuing arrival of new organisms that are not mere copies of the old, evolution would come to a halt. Naturally, the new organisms cannot perform their role properly unless the old ones are removed from the scene after they have performed their function of producing the new. In short, the death of the individual is essential to the life of the species.
It is essential, however, that the individual not die before the new generation has been produced; at least, not in so many cases as to ensure the population dwindling to extinction.
The human species cannot have the relative immunity to harm from individual death possessed by the small and fecund species. Human beings are comparatively large, long-lived, and slow to reproduce, so that too rapid individual death holds within it the specter of catastrophe. The rapid death of unusually high numbers of human beings through disease can seriously dent the human population. Carried to an extreme, it is not too hard to imagine it wiping out the human species.
Most dangerous in this respect is that class of malfunction referred to as "infectious disease." There are many disorders that affect a particular human being for one reason or another and may kill him or her, too, but which will not, in themselves, offer a danger to the species, because they are strictly confined to the suffering individual. Where, however, a disease can, in some way, travel from one human being to another, and where its occurrence in a single individual may lead to the death not of that one alone but of millions of others as well, then there is the possibility of catastrophe.
And indeed, infectious disease has come closer to destroying the human species in historic times than have the depredations of any animals. Although infectious disease, even at its worst, has never yet actually put an end to human beings as a living species (obviously), it can seriously damage a civilization and change the course of history. It has, in fact, done so not once, but many times.
What's more, the situation has perhaps grown worse with the coming of civilization. Civilization has meant the development and growth of cities and the crowding of people into close quarters. Just as fire can spread much more rapidly from tree to tree in a dense forest than in isolated stands, so can infectious disease spread more quickly in crowded quarters than in sparse settlements.
To mention a few notorious cases in history:
In 431 B.C., Athens and its allies went to war with Sparta and its allies. It was a twenty-seven-year war that ruined Athens and, to a considerable extent, all of Greece. Since Sparta controlled the land, the entire Athenian population crowded into the walled city of Athens. There they were safe and could be provisioned by sea, which was controlled by the Athenian navy. Athens would very likely have won a war of attrition before long and Greece might have avoided ruin, but for disease.
In 430 B.C., an infectious plague struck the crowded Athenian population and killed 20 percent of them, including the charismatic leader, Pericles. Athens kept on fighting but it never recovered its population or its strength and in the end it lost.
Plagues very frequently started in eastern and southern Asia, where population was densest, and spread westward. In A.D. 166, when the Roman Empire was at its peak of strength and civilization under the hard-working philosopher-emperor Marcus Aurelius, the Roman armies, fighting on the eastern borders in Asia Minor, began to suffer from an epidemic disease (possibly smallpox). They brought it back with them to other provinces and to Rome itself. At its height, 2,000 people were dying in the city of Rome each day. The population began to decline and did not reach its preplague figure again until the twentieth century. There are a great many reasons advanced for the long, slow decline of Rome that followed the reign of Marcus Aurelius, but the weakening effect of the plague of 166 surely played a part.
Even after the western provinces of the empire were torn away by invasions of the German tribes, and Rome itself was lost, the eastern half of the Roman Empire continued to exist, with its capital at Constantinople. Under the capable emperor Justinian I, who came to the throne in 527, Africa, Italy, and parts of Spain were taken and, for a while, it looked as though the empire might be reunited. In 541, however, the bubonic plague struck. It was a disease that attacked rats primarily, but one that fleas could spread to human beings by biting first a sick rat and then a healthy human being. Bubonic disease was fast-acting and often quickly fatal. It may even have been accompanied by a more deadly variant, pneumonic plague, which can leap directly from one person to another.
For two years the plague raged, and between one-third and one-half of the population of the city of Constantinople died, together with many people in the countryside outside the city. There was no hope of uniting the empire thereafter, and the eastern portion, which came to be known as the Byzantine Empire, continued to decline (with occasional rallies).
The very worst epidemic in the history of the human species came in the fourteenth century. Sometime in the 1330s, a new variety of bubonic plague, a particularly deadly one, appeared in central Asia. People began to die and the plague spread outward, inexorably, from its original focus.
Eventually, it reached the Black Sea. There on the Crimean peninsula, jutting into the north-central coast of that sea, was a seaport called Kaffa where the Italian city of Genoa had established a trading post. In October, 1347, a Genoese ship just managed to make it back to Genoa from Kaffa. The few men on board who were not dead of the plague were dying. They were carried ashore and thus the plague entered Europe and began to spread rapidly.
Sometimes one caught a mild version of the disease, but often it struck violently. In the latter case, the patient was almost always dead within one to three days after the onset of the first symptoms. Because the extreme cases were marked by hemorrhagic spots that turned dark, the disease was called the "Black Death."
The Black Death spread unchecked. It is estimated to have killed some 25 million people in Europe before it died down and many more than that in Africa and Asia. It may have killed a third of all the human population of the planet, perhaps 60 million people altogether or even more. Never before or after do we know of anything that killed so large a percentage of the population as did the Black Death.
It is no wonder that it inspired abject terror among the populace. Everyone walked in fear. A sudden attack of shivering or giddiness, a mere headache, might mean that death had marked one for its own and that no more than a couple of dozen hours were left in which to die. Whole towns were depopulated, with the first to die lying unburied while the survivors fled to spread the disease. Farms lay untended; domestic animals wandered uncared for. Whole nations—Aragon, for instance, in what is now eastern Spain—were afflicted so badly that they never truly recovered.
Distilled liquors had been first developed in Italy about 1100. Now, two centuries later they grew popular. The theory was that strong drink acted as a preventive against contagion. It didn't, but it made the drinker less concerned which, under the circumstances, was something. Drunkenness set in over Europe and it stayed even after the plague was gone; indeed, it has never left. The plague also upset the feudal economy by cutting down on the labor supply very drastically. This did as much to destroy feudalism as did the invention of gunpowder. (Perhaps the most distressing sidelight of the Black Death is the horrible insight into human nature that it offers. England and France were in the early decades of the Hundred Years War at the time. Although the Black Death afflicted both nations and nearly destroyed each, the war continued right on. There was no thought of peace in this greatest of all crises faced by the human species.)
There have been other great plagues since, though none to match the Black Death in unrivaled terror and destruction. In 1664 and 1665, the bubonic plague struck London and killed 75,000.
Cholera, which always simmered just below the surface in India (where it is "endemic") would occasionally explode and spread outward into an "epidemic." Europe was visited by deadly cholera epidemics in 1831 and again in 1848 and 1853. Yellow fever, a tropical disease, would be spread by sailors to more northern seaports, and periodically American cities would be decimated by it. Even as late as 1905, there was a bad yellow fever epidemic in New Orleans.
The most serious epidemic since the Black Death was one of "Spanish influenza," which struck the world in 1918 and in one year killed 30 million people the world over, about 600,000 of them in the United States. In comparison, four years of World War I, just preceding 1918, had killed 8 million. However, the influenza epidemic killed less than 2 percent of the world's population, so that the Black Death remains unrivaled.
[…] Infectious disease is clearly more dangerous to human existence than any animal possibly could be, and we might be right to wonder whether it might not produce a final catastrophe before the glaciers ever have a chance to invade again and certainly before the sun begins to inch its way toward red gianthood.
What stands between such a catastrophe and us is the new knowledge we have gained in the last century and a half concerning the causes of infectious disease and methods for fighting it.
Microorganisms
People, throughout most of history, had no defense whatever against infectious disease. Indeed, the very fact of infection was not recognized in ancient and medieval times. When people began dying in droves, the usual theory was that an angry god was taking vengeance for some reason or other. Apollo's arrows were flying, so that one death was not responsible for another; Apollo was responsible for all, equally.
The Bible tells of a number of epidemics and in each case it is the anger of God kindled against sinners, as in 2 Samuel 24. In New Testament times, the theory of demonic possession as an explanation of disease was popular, and both Jesus and others cast out devils. The biblical authority for this has caused the theory to persist to this day, as witnessed by the popularity of such movies as The Exorcist.
As long as disease was blamed on divine or demonic influences, something as mundane as contagion was overlooked. Fortunately, the Bible also contains instructions for isolating those with leprosy (a name given not only to leprosy itself, but to other, less serious skin conditions). The biblical practice of isolation was for religious rather than hygienic reasons, for leprosy has a very low infectivity. On biblical authority, lepers were isolated in the Middle Ages, while those with really infectious disease were not. The practice of isolation, however, caused some physicians to think of it in connection with disease generally. In particular, the ultimate terror of the Black Death helped spread the notion of quarantine, a name which referred originally to isolation for forty (quarante in French) days.
The fact that isolation did slow the spread of a disease made it look as though contagion was a factor. The first to deal with this possibility in detail was an Italian physician, Girolamo Fracastoro (1478–1553). In 1546, he suggested that disease could be spread by direct contact of a well person with an ill one or by indirect contact of a well person with infected articles or even through transmission over a distance. He suggested that minute bodies, too small to be seen, passed from an ill person to a well one and that the minute bodies had the power of self-multiplication.
It was a remarkable bit of insight, but Fracastoro had no firm evidence to support his theory. If one is going to accept minute unseen bodies leaping from one body to another and do it on nothing more than faith, one might as well accept unseen demons.
Minute bodies did not, however, remain unseen. Already in Fracastoro's time, the use of lenses to aid vision was well established. By 1608, combinations of lenses were used to magnify distant objects and the telescope came into existence. It didn't take much of a modification to have lenses magnify tiny objects. The Italian physiologist Marcello Malpighi (1628–94) was the first to use a microscope for important work, reporting his observations in the 1650s.
The Dutch microscopist Anton van Leeuwenhoek (1632–1723) laboriously ground small but excellent lenses, which gave him a better view of the world of tiny objects than anyone else in his time had had. In 1677, he placed ditch water at the focus of one of his small lenses and found living organisms too small to see with the naked eye but each one as indisputably alive as a whale or an elephant—or as a human being. These were the one-celled animals we now call "protozoa."
In 1683, van Leeuwenhoek discovered structures still tinier than protozoa. They were at the limit of visibility with even his best lenses, but from his sketches of what he saw, it is clear that he had discovered bacteria, the smallest cellular creatures that exist.
To do any better than van Leeuwenhoek, one had to have distinctly better microscopes and these were slow to be developed. The next microscopist to describe bacteria was the Danish biologist Otto Friedrich Müller (1730–84) who described them in a book on the subject, published posthumously, in 1786.
In hindsight, it seems that one might have guessed that bacteria represented Fracastoro's infectious agents, but there was no evidence of that and even Müller's observations were so borderline that there was no general agreement that bacteria even existed, or that they were alive if they did.
The English optician Joseph Jackson Lister (1786–1869) developed an achromatic microscope in 1830. Until then, the lenses used had refracted light into rainbows so that tiny objects were rimmed in color and could not be seen clearly. Lister combined lenses of different kinds of glass in such a way as to remove the colors.
With the colors gone, tiny objects stood out sharply and in the 1860s, the German botanist Ferdinand Julius Cohn (1828–98) saw and described bacteria with the first really convincing success. It was only with Cohn's work that the science of bacteriology was founded and that there came to be general agreement that bacteria existed.
Meanwhile, even without a clear indication of the existence of Fracastoro's agents, some physicians were discovering methods of reducing infection.
The Hungarian physician Ignaz Philipp Semmelweis (1818–65) insisted that childbed fever, which killed so many mothers in childbirth, was spread by the doctors themselves, since they went from autopsies straight to women in labor. He fought to get the doctors to wash their hands before attending the women, and when he managed to enforce this, in 1847, the incidence of childbed fever dropped precipitously. The insulted doctors, proud of their professional filth, revolted at this, however, and finally managed to do their work with dirty hands again. The incidence of childbed fever climbed as rapidly as it had fallen—but that didn't bother the doctors.
The crucial moment came with the work of the French chemist Louis Pasteur (1822–95). Although he was a chemist his work had turned him more and more toward microscopes and microorganisms, and in 1865 he set to work studying a silkworm disease that was destroying France's silk industry. Using his microscope, he discovered a tiny parasite infesting the silkworms and the mulberry leaves that were fed to them. Pasteur's solution was drastic but rational. All infested worms and infested food must be destroyed. A new beginning must be made with healthy worms and the disease would be wiped out. His advice was followed and it worked. The silk industry was saved.
This turned Pasteur's interest to contagious diseases. It seemed to him that if the silkworm disease was the product of microscopic parasites other diseases might be, and thus was born the "germ theory of disease." Fracastoro's invisible infectious agents were microorganisms, often the bacteria that Cohn was just bringing clearly into the light of day.
It now became possible to attack infectious disease rationally, making use of a technique that had been introduced to medicine over half a century before. In 1798, the English physician Edward Jenner (1749–1823) had shown that people inoculated with the mild disease, cowpox, or vaccinia in Latin, acquired immunity not only to cowpox itself but also to the related but very virulent and dreaded disease, smallpox. The technique of "vaccination" virtually ended most of the devastation of smallpox.
Unfortunately, no other diseases were found to occur in such convenient pairs, with the mild one conferring immunity from the serious one. Nevertheless, with the notion of the germ theory the technique could be extended in another way.
Pasteur located specific germs associated with specific diseases, then weakened those germs by heating them or in other ways, and used the weakened germs for inoculation. Only a very mild disease was produced but immunity was conferred against the dangerous one. The first disease treated in this way was the deadly anthrax that ravaged herds of domestic animals.
Similar work was pursued even more successfully by the German bacteriologist Robert Koch (1843–1910). Antitoxins designed to neutralize bacterial poisons were also developed.
Meanwhile, the English surgeon Joseph Lister (1827–1912), the son of the inventor of the achromatic microscope, had followed up Semmelweis's work. Once he learned of Pasteur's research he had a convincing rationale, and he began to insist that, before operating, surgeons wash their hands in solutions of chemicals known to kill bacteria. From 1867 on, the practice of "antiseptic surgery" spread quickly.
The germ theory also sped the adoption of rational preventive measures—personal hygiene, such as washing and bathing; careful disposal of wastes; the guarding of the cleanliness of food and water. Leaders in this were the German scientist Max Joseph von Pettenkofer (1818–1901) and Rudolph Virchow (1821–1902). They themselves did not accept the germ theory of disease but their recommendations would not have been followed as readily were it not that others did.
In addition, it was discovered that diseases such as yellow fever and malaria were transmitted by mosquitoes, typhus fever by lice, Rocky Mountain spotted fever by ticks, bubonic plague by fleas and so on. Measures against these small germ-transferring organisms acted to reduce the incidence of the diseases. Men such as the Americans Walter Reed (1851–1902) and Howard Taylor Ricketts (1871–1910) and the Frenchman Charles J. Nicolle (1866–1936) were involved in such discoveries.
The German bacteriologist Paul Ehrlich (1854–1915) pioneered the use of specific chemicals that would kill particular bacteria without killing the human being in which they existed. His most successful discovery came in 1910, when he found an arsenic compound that was active against the bacterium that causes syphilis.
This sort of work culminated in the discovery of the antibacterial effect of sulfanilamide and related compounds, beginning with the work of the German biochemist Gerhard Domagk (1895–1964) in 1935 and of antibiotics, beginning with the work of the French-American microbiologist René Jules Dubos (1901–[1982]) in 1939.
As late as 1955 came a victory over poliomyelitis, thanks to a vaccine prepared by the American microbiologist Jonas Edward Salk (1914–[1995]).
And yet victory is not total. Right now, the once ravaging disease of smallpox seems to be wiped out. Not one case exists, as far as we know, in the entire world. There are however infectious diseases such as a few found in Africa that are very contagious, virtually 100 percent fatal, and for which no cure exists. Careful hygienic measures have made it possible for such diseases to be studied without their spreading, and no doubt effective countermeasures will be worked out.
New Disease
It would seem, then, that as long as our civilization survives and our medical technology is not shattered there is no longer any danger that infectious disease will produce catastrophe or even anything like the disasters of the Black Death and the Spanish influenza. Yet, old familiar diseases have, within them, the potentiality of arising in new forms.
The human body (like all living organisms) has natural defenses against the invasion of foreign organisms. Antibodies are developed in the bloodstream that neutralize toxins or the microorganisms themselves. White cells in the bloodstream physically attack bacteria.
Evolutionary processes generally make the fight an even one. Those organisms more efficient at self-protection against microorganisms tend to survive and pass on their efficiency to their offspring. Nevertheless, microorganisms are far smaller even than insects and far more fecund. They evolve much more quickly, with individual microorganisms almost totally unimportant in the scheme of things.
Considering the uncounted numbers of microorganisms of any particular species that are continually multiplying by cell fission, large numbers of mutations must be produced just as continually. Every once in a while such a mutation may act to make a particular disease far more infectious and deadly. Furthermore, it may sufficiently alter the chemical nature of the microorganism so that the antibodies which the host organism is capable of manufacturing are no longer usable. The result is the sudden onslaught of an epidemic. The Black Death was undoubtedly brought about by a mutant strain of the microorganism causing it.
Eventually, though, those human beings who are most susceptible die, and the relatively resistant survive, so that the virulence of the diseases dies down. In that case, is the human victory over the pathogenic microorganism permanent? Might not new strains of germs arise? They might and they do. Every few years a new strain of flu rises to pester us. It is possible, however, to produce vaccines against such a new strain once it makes an appearance. Thus, when a single case of "swine flu" appeared in 1976, a full scale mass-vaccination was set in action. It turned out not to be needed, but it showed what could be done.
Copyright © 1979 by Isaac Asimov, A Choice of Catastrophes: The Disasters That Threaten Our World, originally published by Simon & Schuster. Reprinted with permission from the Asimov estate.
[This article was originally published on June 8th, 2020 as part of a standalone magazine called GOOD10: The Pandemic Issue. Produced as a partnership among LeapsMag, The Aspen Institute, and GOOD, the magazine is available for free online.]
“Disinfection Tunnels” Are Popping Up Around the World, Fueled By Misinformation and Fear
In an incident that sparked widespread outrage across India in late March, officials in the north Indian state of Uttar Pradesh sprayed hundreds of migrant workers, including women and children, with a chemical solution to sanitize them, in a misguided attempt to contain the spread of the novel coronavirus.
Health officials reportedly doused the group with a diluted mixture of sodium hypochlorite – a bleaching agent harmful to humans – which led to complaints of skin rashes and eye irritation. The opposition termed the incident 'inhuman', compelling the state government to order an investigation into the mass 'chemical bath.'
"I don't think the officials thought this through," says Thomas Abraham, a professor with The University of Hong Kong and a former consultant for the World Health Organization (WHO) on risk communication. "Spraying people with bleach can prove to be harmful, and there is no guideline … that recommends it. This was some sort of a kneejerk reaction."
Although spraying individuals with chemicals led to a furor in the South Asian nation owing to its potential dangers, so-called "disinfection tunnels" have sprung up in crowded public places around the world, including malls, offices, airports, railway stations and markets. Touted as mass disinfectants, these tunnels spray individuals with chemical disinfectant liquids, mists or fumes through nozzles for a few seconds, purportedly to sanitize them -- though experts strongly condemn their use. The tunnels have appeared in at least 16 countries: India, Malaysia, Scotland, Albania, Argentina, Colombia, Singapore, China, Pakistan, France, Vietnam, Bosnia and Herzegovina, Chile, Mexico, Sri Lanka and Indonesia. Russian President Vladimir Putin even reportedly has his own tunnel at his residence.
While U.S. visitors to Mexico are "disinfected" through these sanitizing tunnels, there is no evidence that the mechanism is currently in use within the United States. However, the situation could rapidly change, with international innovators like RD Pack, an Israeli start-up, pushing for their deployment. Many American and multinational companies like Stretch Structures, Giulio Barbieri and Inflatable Design Works are also producing these systems. As countries gradually ease lockdown restrictions, demand for the tunnels is on the rise -- despite a stringent WHO warning about their potential health hazards.
"Spraying individuals with disinfectants (such as in a tunnel, cabinet, or chamber) is not recommended under any circumstances," the WHO warned in a report on May 15. "This could be physically and psychologically harmful and would not reduce an infected person's ability to spread the virus through droplets or contact. Moreover, spraying individuals with chlorine and other toxic chemicals could result in eye and skin irritation, bronchospasm due to inhalation, and gastrointestinal effects such as nausea and vomiting."
Disinfection tunnels largely spray a diluted mixture of sodium hypochlorite, a chlorine compound commonly known as bleach, often used to disinfect inanimate surfaces. Given the chemical's hazardous properties, the WHO, in a separate advisory on COVID-19, warns that spraying bleach or any other disinfectant on individuals can prove to be poisonous if ingested, and that such substances should be used only to disinfect surfaces.
Considering the effect of sodium hypochlorite on mucous membranes, the European Centre for Disease Prevention and Control, an EU agency focused on infectious diseases, recommends limited use of the chemical compound even when disinfecting surfaces – only 0.05 percent for cleaning surfaces, and 0.1 percent for toilets and bathroom sinks. The Indian health ministry also recently cautioned against spraying sodium hypochlorite, stating that its inhalation can lead to irritation of mucous membranes of the nose, throat, and respiratory tract.
In addition to the health hazards that such sterilizing systems pose, they have little utility, argues Indian virologist T. Jacob John. Since COVID-19 is a respiratory disorder, disinfecting a person's body or clothes cannot protect them from contracting the novel coronavirus, or help in containing the pathogen's spread.
"It's a respiratory infection, which means that you have the virus in your respiratory tract, and of course, that shows in your throat, therefore saliva, etc.," says John. "The virus does not survive outside the body for a long time, unless it is in freezing temperatures. Disinfecting a person's clothes or their body makes no sense."
Disinfection tunnels have limited, if any, impact on the main modes of coronavirus transmission, adds Craig Janes, director, School of Public Health and Health Systems at Canada's University of Waterloo. He explains that the nature of COVID-19 transmission is primarily from person-to-person, either directly, or via an object that is shared between two individuals. Measures like physical distancing and handwashing take care of these transmission risks.
"My view of these kinds of actions is that they are principally symbolic, indicating to a concerned population that 'something is being done,' to marshal support for government or health system efforts," says Janes. "So perhaps a psychological benefit, but I'm not sure that this benefit would outweigh the risks."
A recent report by Health Care Without Harm (HCWH), an international not-for-profit organization focused on sustainable health care around the world, states that disinfection tunnels have little evidence to demonstrate their efficacy or safety.
"If the goal is to reduce the spread of the virus by decontaminating the exterior clothing, shoes, and skin of the general public, there is no evidence that clothes are an important vector for transmission. If the goal is to attack the virus in the airways, what is the evidence that a 20-30 second external application is efficacious and safe?" the report questions. "The World Health Organization recommends more direct and effective ways to address hand hygiene, with interventions known to be effective."
If an infected person walks through a disinfection tunnel, he would still be infectious, as the chemicals will only disinfect the surfaces, says Gerald Keusch, a professor of medicine and international health at Boston University's Schools of Medicine and Public Health.
"While we know that viruses can be 'disinfected' from surfaces and hands, disinfectants can be harmful to health if ingested or inhaled. The underlying principle of medicine is to do no harm, and we always measure benefit against risk when approving interventions. I don't know if this has been followed and assessed with respect to these devices," says Keusch. "It's a really bad idea."
Experts warn that such tunnels may also create a false sense of security, discouraging people from adopting best practice methods like handwashing, social distancing, avoiding crowded places, and using masks to combat the spread of COVID-19.
"They may make people feel that their risk of infection has been reduced, and also that they do not have to worry about infecting others," says Janes. "These are false assumptions, and may lead to increasing rather than reducing transmission."