SCOOP: Largest Cryobank in the U.S. to Offer Ancestry Testing
Sharon Kochlany and Vanessa Colimorio's four-year-old twin girls had a classic school assignment recently: make a family tree. They drew themselves and their one-year-old brother branching off from their moms, with aunts, uncles, and grandparents forking off to the sides.
What you don't see in the invisible space between Kochlany and Colimorio, however, is the sperm donor they used to conceive all three children.
To look at a family tree like this is to see in its purest form that kinship can supersede biology—the boundaries of where this family starts and stops are clear to everyone in it, in spite of a third party's genetic involvement. This kind of self-definition has always been synonymous with LGBTQ families, especially those that rely on donor gametes (sperm or eggs) to exist.
But the world around them has changed quite suddenly: The recent consumer DNA testing boom has made it more complicated than ever for families built through reproductive technology—openly, not secretively—to maintain the strong sense of autonomy and privacy that can be crucial for their emotional security. Prospective parents and cryobanks are now mulling how best to bring a new generation of donor-conceived people into this world in a way that leaves open the choice to know more about their ancestry without obliterating an equally important choice: the right not to know about biological relatives.
For queer parents who have long fought for social acceptance, having a biological relationship to their children has been revolutionary, and using an unknown donor as a means to this end especially so. Getting help from a friend often comes with the expectation that the friend will also have social involvement in the family, which some people are comfortable with, but being able to access sperm from an unknown donor—which queer parents have only been able to openly do since the early 1980s—grants them the reproductive autonomy to create families seemingly on their own. That recently gained sovereignty stands to be lost if a consumer DNA test brings a stranger's identity out of the woodwork.
At the same time, it's natural for donor-conceived people to want to know more about where they come from ethnically, even if they don't want to know the identity of their donor. As a donor-conceived person myself, I know my donor's self-reported ethnicity, but have often wondered how accurate it is.
Opening the Pandora's box of a consumer DNA test as a way to find out has always felt profoundly unappealing to me, however. Many people have accidentally learned they're donor-conceived by unwittingly using these tools, but I already know that about myself going in, and subsequently know I'll be connected to a large web of people whose existence I'm not interested in learning about. In addition to possibly identifying my anonymous donor, his family could also show up, along with any donor-siblings—other people with whom I share a donor. My single lesbian mom is enough for me, and the trade-off to learn more about my ethnic ancestry has never seemed worth it.
In 1992, when I was born, no one was planning for how consumer DNA tests might upend or illuminate one's sense of self. But the donor community has always had to stay nimble in balancing privacy concerns and psychological well-being, so it should come as no surprise that figuring out how to do so in 2020 includes finding a way to offer ancestry insight while circumventing consumer DNA tests.
A New Paradigm
This is the rationale behind unprecedented industry news that LeapsMag can exclusively break: Within the next few weeks, California Cryobank, the largest cryobank in the country, will begin offering genetically verified ancestry information on the free public part of every donor's anonymous profile in its database, something no other cryobank yet offers (an exact launch date was not available at the time of publication). Currently, California Cryobank's donor profiles include a short self-reported list that might merely say, "Ancestry: German, Lebanese, Scottish."
The new information will be a report in pie chart form that details exactly what percentages of a donor's DNA come from up to 26 ethnicities—it's analogous to, but on a smaller scale than, the format offered by consumer DNA testing companies, and uses the same base technology that looks for single nucleotide polymorphisms in DNA that are associated with specific ethnicities. But crucially, because the donor takes the DNA test through California Cryobank, not a consumer-facing service, the information is not connected in a network to anyone else's DNA test. It's also taken before any offspring exist so there's no chance of revealing a donor-conceived person's identity this way.
Later, when a donor-conceived person is born, grows up, and wants information about their ethnicity from the donor side, all they need is their donor's anonymous ID number to look it up. The donor-conceived person never takes a genetic test, and therefore also can't accidentally find donor siblings this way. People who want to be connected to donor siblings can use a sibling registry where other people who want to be found share donor ID numbers and look for matches (this is something that's been available for decades, and remains so).
California Cryobank will require all new donors to consent to this extra level of genetic testing, setting a new standard for what information prospective parents and donor-conceived people can expect to have. In the near term, this information will be most useful for prospective parents looking for donors with specific backgrounds, possibly ones similar to their own.
It's a solution that was actually hiding in plain sight. Two years ago, California Cryobank's partner Sema4, the company handling the genetic carrier testing that's used to screen for heritable diseases, started analyzing ethnic data in its samples. That extra information was being collected because it can help calculate a more accurate assessment of genetic risks that run in certain populations—like Ashkenazi Jews and Tay-Sachs disease—than relying on oral family histories. Shortly after the companies began collecting these extra data, Jamie Shamonki, chief medical officer of California Cryobank, realized they were sitting on a goldmine for a different reason.
"I didn't want to use one of these genetic testing companies like Ancestry to accomplish this," says Shamonki. "The whole thing we're trying to accomplish is also privacy."
Consumer-facing DNA testing companies are not HIPAA compliant (whereas Sema4, which isn't direct-to-consumer, is HIPAA compliant), which means there are no legal privacy protections covering people who add their DNA to these databases. Although some companies, like 23andMe, allow users to opt out of being connected with genetic relatives, the language can be confusing to navigate, requires a high level of knowledge and self-advocacy on the user's part, and, as an opt-out system, is not set up to protect the user from unwanted information by default; many unwittingly walk right into such information as a result.
Additionally, because consumer-facing DNA testing companies operate outside the legal purview that applies to other health care entities, like hospitals, even a person who does opt out of being linked to genetic relatives is not protected in perpetuity from being re-identified in the future by a change in company policy. The safest option for people with privacy concerns is to stay out of these databases altogether.
For California Cryobank, the new information about donor heritage won't retroactively be added to older profiles in the system, so donor-conceived people who already exist won't benefit from the ancestry tool, but it'll be the new standard going forward. The company has about 500 available donors right now, many of whom have been in its registry for a while; about 100 of those donors, all new, will have this ancestry data on their profiles.
Shamonki says it has taken about two years to get to the point of publicly including ancestry information on a donor's profile because it takes about nine months of medical and psychological screening for a donor to go from walking through the door to being added to their registry. The company wanted to wait to launch until it could offer this information for a significant number of donors. As more new donors come online under the new protocol, the number with ancestry information on their profiles will go up.
For Parents: An Unexpected Complication
While this change will no doubt be welcome progress for LGBTQ families contemplating parenthood, it'll never be possible to put this entire new order back in the box. What are such families who already have donor-conceived children losing in today's world of widespread consumer genetic testing?
Kochlany and Colimorio's twins are barely older than the at-home DNA testing boom itself. They were born in 2015, and two years later the industry saw its most significant spike. By now, more than 26 million people's DNA is in databases like 23andMe and Ancestry; as a result, it's estimated that within a year, 90 percent of Americans of European descent will be identifiable through these consumer databases, by way of genetic third cousins, even if they didn't want to be found and never took the test themselves. This was the principle behind solving the Golden State Killer cold case.
The waning of privacy through consumer DNA testing fundamentally clashes with the priorities of the cryobank industry, which has long sought to protect the privacy of donor-conceived people, even as open identification became standard. Since the 1980s, donors have been able to allow their identity to be released to any offspring who is at least 18 and wants the information. Lesbian moms pushed for this option early on so their children—who would obviously know they couldn't possibly be the biological product of both parents—would never feel cut off from the chance to know more about themselves. But importantly, the openness is not a two-way street: the donors can't ever ask for the identities of their offspring. It's the latter that consumer DNA testing really puts at stake.
"23andMe basically created the possibility that there will be donors who will have contact with their donor-conceived children, and that's not something that I think the donor community is comfortable with," says I. Glenn Cohen, director of Harvard Law School's Center for Health Law Policy, Biotechnology & Bioethics. "That's about the donor's autonomy, not the rearing parents' autonomy, or the donor-conceived child's autonomy."
Kochlany and Colimorio have an open identification donor and fully support their children reaching out to California Cryobank to get more information about him if they want to when they're 18, but having a singular name revealed isn't the same thing as having contact, nor is it the same thing as revealing a web of dozens of extended genetic relations. Their concern now is that if their kids participate in genetic testing, a stranger—someone they're careful to refer to as only "the donor" and never "dad"—will reach out to the children to begin some kind of relationship. They know other people who are contemplating giving their children DNA tests, and feel staunchly that it wouldn't be right for their family.
"With genetic testing, you have no control over who reaches out to you, and at what point in your life," Kochlany says. "[People] reaching out and trying to say, 'Hey I know who your dad is' throws a curveball. It's like, 'Wait, I never thought I had a dad.' It might put insecurities in their minds."
"We want them to have the opportunity to choose whether or not they want to reach out," Colimorio adds.
Kochlany says that when their twins are old enough to start asking questions, she and Colimorio plan to frame it like this: "The donor was kind of like a technology that helped us make you a person, and make sure that you exist," she says, role playing a conversation with their kids. "But it's not necessarily that you're looking to this person [for] support or love, or because you're missing a piece."
It's a line in the sand that's present even for couples still far off from conceiving. When Mallory Schwartz, a film and TV producer in Los Angeles, and Lauren Pietra, a marriage and family therapy associate (and Shamonki's stepdaughter), talk about getting married someday, it's a package deal with talking about how they'll approach having kids. They feel there are too many variables and choices to make around family planning as a same-sex couple these days to not have those conversations simultaneously. Consumer DNA databases are already on their minds.
"It frustrates me that the DNA databases are just totally unregulated," says Schwartz. "I hope they are by the time we do this. I think everyone deserves a right to privacy when making your family [using a sperm donor]."
On the prospect of having a donor relation pop up non-consensually for a future child, Pietra says, "I don't like it. It would be really disappointing if the child didn't want [contact], and unfortunately they're on the receiving end."
You can see how important preserving the right to keep this door closed is when you look at what's going on at The Sperm Bank of California. This pioneering cryobank, which opened in 1982, was the first in the world to openly serve LGBTQ people and single women, and the first to offer the open identification option. Yet not as many people are asking for their donor's identity as expected.
"We're finding a third of young people are coming forward for their donor's identity," says Alice Ruby, executive director. "We thought it would be a higher number." Viewed the other way, two-thirds of the donor-conceived people who could ethically get their donor's identity through The Sperm Bank of California are not asking the cryobank for it.
Ruby says that part of what historically made an open identification program appealing, rather than invasive or nerve-wracking, is how rigidly it's always been formatted around mutual consent, and protects against surprises for all parties. Those [donor-conceived people] who wanted more information were never barred from it, while those who wanted to remain in the dark could. No one group's wish eclipsed the other's. The potential breakdown of a system built around consent, expectations, and respect for privacy is why unregulated consumer DNA testing is most concerning to her as a path for connecting with genetic relatives.
For the last few decades in cryobanks around the world, the largest cohort of people seeking out donor sperm has been lesbian couples, followed by single women. For infertile heterosexual couples, the smallest client demographic, Ruby says donor sperm offers a solution to a medical problem, but in contrast, it historically "provided the ability for [lesbian] couples and single moms to have some reproductive autonomy." Yes, it was still a solution to a biological problem, but it was also a solution to a social one.
The Sperm Bank of California updated its registration forms to include language urging parents, donor-conceived people, and donors not to use consumer DNA tests, and to go through the cryobank if they, understandably, want to learn more about who they're connected to. But truthfully, there's not much else cryobanks can do to protect clients on any side of the donor transaction from surprise contact right now—especially not from relatives of the donor who may not even know someone in their family has donated sperm.
A Tricky Position
Personally, I've known I was donor-conceived from day one. It has never been a source of confusion, angst, or curiosity, and in fact has never loomed particularly large for me in any way. I see it merely as a type of reproductive technology—on par with in vitro fertilization—that enabled me to exist, and, now that I do exist, is irrelevant. Being confronted with my donor's identity or any donor siblings would make this fact of my conception bigger than I need it to be, as an adult with a full-blown identity derived from all of my other life experiences. But I still wonder about the minutiae of my ethnicity in much the same way as anyone else who wonders, and feel there's no safe way for me to find out without relinquishing some of my existential independence.
The author and her mom in spring of 1998.
"People obviously want to participate in 23andMe and Ancestry because they're interested in knowing more about themselves," says Shamonki. "I wouldn't want to create a world where people who are donor-conceived feel like they can't participate in this technology because they're trying to shut out [other] information."
After all, it was the allure of that exact conceit—knowing more about oneself—that seemed to magnetically draw in millions of people to these tools in the first place. It's an experience that clearly taps into a population-wide psychic need, even—perhaps especially—if one's origins are a mystery.
[Editor's Note: On June 6, 2017, Anne Shabason, an artist, hospice educator, and mother of two from Bolton, Ontario, a small town about 30 miles outside of Toronto, underwent Deep Brain Stimulation (DBS) to treat her Parkinson's disease. The FDA approved DBS for Parkinson's disease in 2002. Although it's shown to be safe and effective, agreeing to invasive brain surgery is no easy decision, even when you have your family and one of North America's premier neurosurgeons at your side.
Here, with support from Stan, her husband of the past 40 years, Anne talks about her life before Parkinson's, what the disease took away, and what she got back because of DBS. As told to writer Heather R. Johnson.]
I was an artist.
I worked in mixed media, papier-mâché, and collage, inspired by dreams, birds, mystery. I had gallery shows and participated in studio tours.
Educated in thanatology, I worked in hospice care as a volunteer and education director for Hospice Caledon, an organization that supports people facing life-limiting illness and grief.
I trained volunteers who helped people through their transition.
Parkinson's disease changed all that.
My hands and my head were not coordinating, so it was impossible to do my art.
It started as a twitch in my leg. During a hospice workshop, my right leg started vibrating in a way I hadn't experienced before. I told a friend, "This can't be good."
Over the next year, my right foot vibrated more and more. I could not sleep well. In my dreams people lurked in corners, in dark places, and behind castle doors. I knew they were there and couldn't avoid the ambush. I shrieked and woke everyone in the house.
An anxiety attack—something I had also never experienced before—came next.
During a class I was teaching, my mouth got so dry, I couldn't speak. I stood in front of the class for three or four minutes, unable to continue. I pushed through and finished the class. That's when I realized this was more than jiggling legs.
That's when I went to see a doctor.
A Diagnosis
My first doctor, when I suggested it might be Parkinson's, didn't believe me. She sent me to a neurologist who told me I had to meditate more and calm myself.
A friend from hospice told me to phone the Toronto Western Hospital Movement Disorders Clinic. In January 2010, I was diagnosed with Parkinson's disease.
The doctor, a fellow, got all my stats and asked a lot of questions. He was so excited he knew what it was, he exclaimed, "You've got Parkinson's!" like it was the best thing ever. I must say, that wasn't the best news, but at least I finally had a diagnosis.
I could choose whether to take medication or not. The doctor said, "If Parkinson's is compromising your lifestyle, you should consider taking levodopa."
"Well I can't run my classes, I can't do my art, so it's compromising me," I said. And my health was going downhill. The shaking—my whole body moved—sleeping was horrible. Two to four hours max a night was usual. I had terrible anxiety and panic attacks and had to quit work.
So I started taking levodopa. It's taken in a four-hour cycle, but the medication didn't last the full time. I developed dyskinesia, a side effect of the medication that made me experience uncontrolled, involuntary movements. I was edgy, irritable, and focused on my watch like a drug addict. I'd lie on the couch, feel crummy and tired, and wait.
The medication cycle restricted where I could go. Fearing the "off" period, I avoided interaction with lifelong friends, which increased my feeling of social isolation. They would come over and cook with me and read to me sometimes, and that was fine, as long as it was during an "on" period.
There was incontinence, constipation, and fatigue.
I lost fine motor skills, like writing. And painting. My hands and my head were not coordinating, so it was impossible to do my art.
It was a terrible time.
The worst symptoms—what pushed me to consider DBS—were the symptoms no one could see. The anxiety and depression were so bad, the sleeplessness, not eating.
I projected a lot of my discomforts onto Stan. I reacted so badly to him. I actually separated from him briefly on two separate occasions and lived in a separate space—a self-imposed isolation. There wasn't anything he could do to help me really except sit back and watch.
I tried alternative therapies—a naturopath, an osteopath, a reflexologist and a Chinese medicine practitioner—but nothing seemed to help.
I felt like I was dying. Certain parts of my life were being taken away from me. I was a perfectionist, and I felt imperfect. It was a horrible feeling, to not be in control of myself.
The DBS Decision
I was familiar with DBS, a procedure that involves a neurosurgeon drilling small holes into your skull and implanting electrical leads deep in your brain to modify neural activity, reducing involuntary movements.
But I was convinced I'd never do it. I was brought up in a family that believed 'doctors make you sick and hospitals kill you.'
I worried the room wouldn't be sterile. Someone's cutting into your brain, you don't know what's going to happen. They're putting things in your body. I didn't want to risk possible infection.
And my doctor said he couldn't promise he would perform the operation himself. It might be a fellow, with my doctor in the background in case anything went wrong. I wasn't comfortable with that arrangement.
When filmmakers Taryn Southern and Elena Gaby decided to make a documentary about people whose lives were changed by cutting-edge brain implants—and I agreed to participate—my doctor said he would for sure do the operation. They couldn't risk anything happening on the operating table on camera, so most of my fears went away.
My family supported the decision. My mother had trigeminal neuralgia, which is a very painful facial condition. She also had a stroke and what we now believe to be Parkinson's. My father, a retired dentist, managed her care and didn't give her the opportunity to see a specialist.
When we were talking about DBS, my son, Joseph, said, "How can you not do this, for the sake of your family? Because if you don't, you'll end up like Grandma, who, for the last few years of her life, just lay on a couch because she didn't get any kind of outside help. If you even have a chance to improve your life or give yourself five extra years, why wouldn't you do that, for our sake? Are we not worth that?"
That talk really affected me, and I realized I had to try. Even though it was difficult, I had to be brave for my family.
Surgery, Recovery, and Tweaking
You have to be awake for part of the procedure—I was awake enough that my subconscious could hear, because they had to know how far to insert the electrodes. DBS targets the troublemaking areas of the brain. There's a one millimeter difference between success and failure.
I felt them running the knife across my scalp, and drilling two holes in my head, but only as pressure, not pain.
Once they were inside, they asked me to move parts of my body to see whether the right neurons were activated.
They put me to sleep to put a battery-powered neurostimulator in my chest. A wire that runs behind my ear and down my neck connects the electrodes in my brain to the battery pack. The neurostimulator creates electric pulses 24 hours a day.
I was moving around almost immediately after surgery. Recovery from the stitches took a few weeks, but everything else took a lot longer.
I couldn't read. My motor skills were still impaired, and my brain and my hands weren't yet linked up. I needed the device to be programmed and tweaked. Until that happened, I needed help.
The depression and anxiety, though, went away almost immediately. From that perspective, it was like I never had Parkinson's. I was so happy.
When they calibrated the electrodes, they adjusted how much electrical current goes to any one of four contact points on the left and right sides of the brain. If they increased it too much, a leg would start shaking, a foot would start cramping, or my tongue would feel thicker. It took a while to get it calibrated correctly to control the symptoms.
First it was five sessions in five weeks, then once a month, then every three months. Now I visit every six months. As the disease progresses, they have the ability to keep making adjustments. (DBS controls the symptoms, but it doesn't cure the disease.)
Once they got the calibration right, my motor skills improved. I could walk without shuffling. My muscles weren't stiff and aching, and the dyskinesia disappeared. But if I turn off the device, my symptoms return almost immediately.
Some days I have more fatigue than others, and sometimes my brain doesn't click. And my voice got softer—that's a common side effect of this operation. But I'm doing so much better than before.
I have a quality of life I didn't have before. Before COVID-19 hit, Stan and I traveled, went to concerts, movies, galleries, and spent time with our growing family.
Anne in her home studio with her art, 2019.
I cut back the levodopa from seven-and-a-half pills a day to two-and-a-half. I often forget to take my medication until I realize I'm feeling tired or anxious.
Best of all, my motivation and creative ability have clicked in.
I am an artist—again.
I'm painting every day. It's what is keeping me sane. It's my saving grace.
I'm not perfect. But I am Anne. Again.
Isaac Asimov on the History of Infectious Disease—and How Humanity Learned to Fight Back
[EDITOR'S FOREWORD: Humanity has always faced existential threats from dangerous microbes, and though this is the first pandemic in our lifetimes, it won't be the last our species will ever face. This newly relevant work by beloved sci-fi writer Isaac Asimov, an excerpt from his 1979 book, A Choice of Catastrophes, establishes that reality in its historical context and makes clear how far we have come since ancient times. But by some measures, we are still in the earliest stages of figuring out how to effectively neutralize such threats. Advancing progress as fast as we can—by leveraging all the insights of modern science—offers our best hope for containing this pandemic and those that will inevitably follow.]
Infectious Disease
An even greater danger to humanity than the effect of small, fecund pests on human beings, their food, and their possessions, is their tendency to spread some forms of infectious disease.
Every living organism is subject to disease of various sorts, where disease is defined in its broadest sense as "dis-ease," that is, as any malfunction or alteration of the physiology or biochemistry that interferes with the smooth workings of the organism. In the end, the cumulative effect of malfunctions, misfunctions, nonfunctions, even though much of it is corrected or patched up, produces irreversible damage—we call it old age—and, even with the best care in the world, brings on inevitable death.
Civilization has meant the development and growth of cities and the crowding of people into close quarters.
There are some individual trees that may live five thousand years, some cold-blooded animals that may live two hundred years, some warm-blooded animals that may live one hundred years, but for each multicellular individual death comes as the end.
This is an essential part of the successful functioning of life. New individuals constantly come into being with new combinations of chromosomes and genes, and with mutated genes, too. These represent new attempts, so to speak, at fitting the organism to the environment. Without the continuing arrival of new organisms that are not mere copies of the old, evolution would come to a halt. Naturally, the new organisms cannot perform their role properly unless the old ones are removed from the scene after they have performed their function of producing the new. In short, the death of the individual is essential to the life of the species.
It is essential, however, that the individual not die before the new generation has been produced; at least, not in so many cases as to ensure the population dwindling to extinction.
The human species cannot have the relative immunity to harm from individual death possessed by the small and fecund species. Human beings are comparatively large, long-lived, and slow to reproduce, so that too rapid individual death holds within it the specter of catastrophe. The rapid death of unusually high numbers of human beings through disease can seriously dent the human population. Carried to an extreme, it is not too hard to imagine it wiping out the human species.
Most dangerous in this respect is that class of malfunction referred to as "infectious disease." There are many disorders that affect a particular human being for one reason or another and may kill him or her, too, but which will not, in themselves, offer a danger to the species, because they are strictly confined to the suffering individual. Where, however, a disease can, in some way, travel from one human being to another, and where its occurrence in a single individual may lead to the death of not that one alone but of millions of others as well, then there is the possibility of catastrophe.
And indeed, infectious disease has come closer to destroying the human species in historic times than have the depredations of any animals. Although infectious disease, even at its worst, has never yet actually put an end to human beings as a living species (obviously), it can seriously damage a civilization and change the course of history. It has, in fact, done so not once, but many times.
What's more, the situation has perhaps grown worse with the coming of civilization. Civilization has meant the development and growth of cities and the crowding of people into close quarters. Just as fire can spread much more rapidly from tree to tree in a dense forest than in isolated stands, so can infectious disease spread more quickly in crowded quarters than in sparse settlements.
To mention a few notorious cases in history:
In 431 B.C., Athens and its allies went to war with Sparta and its allies. It was a twenty-seven-year war that ruined Athens and, to a considerable extent, all of Greece. Since Sparta controlled the land, the entire Athenian population crowded into the walled city of Athens. There they were safe and could be provisioned by sea, which was controlled by the Athenian navy. Athens would very likely have won a war of attrition before long and Greece might have avoided ruin, but for disease.
In 430 B.C., an infectious plague struck the crowded Athenian population and killed 20 percent of them, including the charismatic leader, Pericles. Athens kept on fighting but it never recovered its population or its strength and in the end it lost.
Plagues very frequently started in eastern and southern Asia, where population was densest, and spread westward. In A.D. 166, when the Roman Empire was at its peak of strength and civilization under the hard-working philosopher-emperor Marcus Aurelius, the Roman armies, fighting on the eastern borders in Asia Minor, began to suffer from an epidemic disease (possibly smallpox). They brought it back with them to other provinces and to Rome itself. At its height, 2,000 people were dying in the city of Rome each day. The population began to decline and did not reach its preplague figure again until the twentieth century. There are a great many reasons advanced for the long, slow decline of Rome that followed the reign of Marcus Aurelius, but the weakening effect of the plague of 166 surely played a part.
Even after the western provinces of the empire were torn away by invasions of the German tribes, and Rome itself was lost, the eastern half of the Roman Empire continued to exist, with its capital at Constantinople. Under the capable emperor Justinian I, who came to the throne in 527, Africa, Italy, and parts of Spain were retaken and, for a while, it looked as though the empire might be reunited. In 541, however, the bubonic plague struck. It was a disease that attacked rats primarily, but one that fleas could spread to human beings by biting first a sick rat and then a healthy human being. Bubonic plague was fast-acting and often quickly fatal. It may even have been accompanied by a more deadly variant, pneumonic plague, which can leap directly from one person to another.
For two years the plague raged, and between one-third and one-half of the population of the city of Constantinople died, together with many people in the countryside outside the city. There was no hope of uniting the empire thereafter, and the eastern portion, which came to be known as the Byzantine Empire, continued to decline (with occasional rallies).
The very worst epidemic in the history of the human species came in the fourteenth century. Sometime in the 1330s, a new variety of bubonic plague, a particularly deadly one, appeared in central Asia. People began to die and the plague spread outward, inexorably, from its original focus.
Eventually, it reached the Black Sea. There on the Crimean peninsula, jutting into the north-central part of that sea, was a seaport called Kaffa where the Italian city of Genoa had established a trading post. In October, 1347, a Genoese ship just managed to make it back to Genoa from Kaffa. The few men on board who were not dead of the plague were dying. They were carried ashore and thus the plague entered Europe and began to spread rapidly.
Sometimes one caught a mild version of the disease, but often it struck violently. In the latter case, the patient was almost always dead within one to three days after the onset of the first symptoms. Because the severe cases were marked by hemorrhagic spots that turned dark, the disease was called the "Black Death."
The Black Death spread unchecked. It is estimated to have killed some 25 million people in Europe before it died down and many more than that in Africa and Asia. It may have killed a third of all the human population of the planet, perhaps 60 million people altogether or even more. Never before or since do we know of anything that killed so large a percentage of the population as did the Black Death.
It is no wonder that it inspired abject terror among the populace. Everyone walked in fear. A sudden attack of shivering or giddiness, a mere headache, might mean that death had marked one for its own and that no more than a couple of dozen hours were left in which to die. Whole towns were depopulated, with the first to die lying unburied while the survivors fled to spread the disease. Farms lay untended; domestic animals wandered uncared for. Whole nations—Aragon, for instance, in what is now eastern Spain—were afflicted so badly that they never truly recovered.
Distilled liquors had first been developed in Italy about 1100. Now, two centuries later, they grew popular. The theory was that strong drink acted as a preventive against contagion. It didn't, but it made the drinker less concerned, which, under the circumstances, was something. Drunkenness set in over Europe and it stayed even after the plague was gone; indeed, it has never left. The plague also upset the feudal economy by cutting down on the labor supply very drastically. This did as much to destroy feudalism as did the invention of gunpowder. (Perhaps the most distressing sidelight of the Black Death is the horrible insight into human nature that it offers. England and France were in the early decades of the Hundred Years War at the time. Although the Black Death afflicted both nations and nearly destroyed each, the war continued right on. There was no thought of peace in this greatest of all crises faced by the human species.)
There have been other great plagues since, though none to match the Black Death in unrivaled terror and destruction. In 1664 and 1665, the bubonic plague struck London and killed 75,000.
Cholera, which always simmered just below the surface in India (where it is "endemic"), would occasionally explode and spread outward into an "epidemic." Europe was visited by deadly cholera epidemics in 1831 and again in 1848 and 1853. Yellow fever, a tropical disease, would be spread by sailors to more northern seaports, and periodically American cities would be decimated by it. Even as late as 1905, there was a bad yellow fever epidemic in New Orleans.
The most serious epidemic since the Black Death was the "Spanish influenza," which struck the world in 1918 and in one year killed 30 million people worldwide, about 600,000 of them in the United States. In comparison, the four years of World War I, just preceding 1918, had killed 8 million. However, the influenza epidemic killed less than 2 percent of the world's population, so the Black Death remains unrivaled.
[…] Infectious disease is clearly more dangerous to human existence than any animal possibly could be, and we might be right to wonder whether it might not produce a final catastrophe before the glaciers ever have a chance to invade again and certainly before the sun begins to inch its way toward red gianthood.
What stands between such a catastrophe and us is the new knowledge we have gained in the last century and a half concerning the causes of infectious disease and methods for fighting it.
Microorganisms
People, throughout most of history, had no defense whatever against infectious disease. Indeed, the very fact of infection was not recognized in ancient and medieval times. When people began dying in droves, the usual theory was that an angry god was taking vengeance for some reason or other. Apollo's arrows were flying, so that one death was not responsible for another; Apollo was responsible for all, equally.
The Bible tells of a number of epidemics and in each case it is the anger of God kindled against sinners, as in 2 Samuel 24. In New Testament times, the theory of demonic possession as an explanation of disease was popular, and both Jesus and others cast out devils. The biblical authority for this has caused the theory to persist to this day, as witnessed by the popularity of such movies as The Exorcist.
As long as disease was blamed on divine or demonic influences, something as mundane as contagion was overlooked. Fortunately, the Bible also contains instructions for isolating those with leprosy (a name given not only to leprosy itself, but to other, less serious skin conditions). The biblical practice of isolation was for religious rather than hygienic reasons, for leprosy has a very low infectivity. On biblical authority, lepers were isolated in the Middle Ages, while those with really infectious disease were not. The practice of isolation, however, caused some physicians to think of it in connection with disease generally. In particular, the ultimate terror of the Black Death helped spread the notion of quarantine, a name which referred originally to isolation for forty (quarante in French) days.
The fact that isolation did slow the spread of a disease made it look as though contagion was a factor. The first to deal with this possibility in detail was an Italian physician, Girolamo Fracastoro (1478–1553). In 1546, he suggested that disease could be spread by direct contact of a well person with an ill one or by indirect contact of a well person with infected articles or even through transmission over a distance. He suggested that minute bodies, too small to be seen, passed from an ill person to a well one and that the minute bodies had the power of self-multiplication.
It was a remarkable bit of insight, but Fracastoro had no firm evidence to support his theory. If one is going to accept minute unseen bodies leaping from one body to another and do it on nothing more than faith, one might as well accept unseen demons.
Minute bodies did not, however, remain unseen. Already in Fracastoro's time, the use of lenses to aid vision was well established. By 1608, combinations of lenses were used to magnify distant objects and the telescope came into existence. It didn't take much of a modification to have lenses magnify tiny objects. The Italian physiologist Marcello Malpighi (1628–94) was the first to use a microscope for important work, reporting his observations in the 1650s.
The Dutch microscopist Anton van Leeuwenhoek (1632–1723) laboriously ground small but excellent lenses, which gave him a better view of the world of tiny objects than anyone else in his time had had. In 1677, he placed ditch water at the focus of one of his small lenses and found living organisms too small to see with the naked eye but each one as indisputably alive as a whale or an elephant—or as a human being. These were the one-celled animals we now call "protozoa."
In 1683, van Leeuwenhoek discovered structures still tinier than protozoa. They were at the limit of visibility with even his best lenses, but from his sketches of what he saw, it is clear that he had discovered bacteria, the smallest cellular creatures that exist.
To do any better than van Leeuwenhoek, one had to have distinctly better microscopes and these were slow to be developed. The next to describe bacteria was the Danish biologist Otto Friedrich Müller (1730–84), in a book on the subject published posthumously in 1786.
In hindsight, it seems that one might have guessed that bacteria represented Fracastoro's infectious agents, but there was no evidence of that and even Müller's observations were so borderline that there was no general agreement that bacteria even existed, or that they were alive if they did.
The English optician Joseph Jackson Lister (1786–1869) developed an achromatic microscope in 1830. Until then, the lenses used had refracted light into rainbows so that tiny objects were rimmed in color and could not be seen clearly. Lister combined lenses of different kinds of glass in such a way as to remove the colors.
With the colors gone, tiny objects stood out sharply and in the 1860s, the German botanist Ferdinand Julius Cohn (1828–98) saw and described bacteria with the first really convincing success. It was only with Cohn's work that the science of bacteriology was founded and that there came to be general agreement that bacteria existed.
Meanwhile, even without a clear indication of the existence of Fracastoro's agents, some physicians were discovering methods of reducing infection.
The Hungarian physician Ignaz Philipp Semmelweis (1818–65) insisted that childbed fever, which killed so many mothers in childbirth, was spread by the doctors themselves, since they went from autopsies straight to women in labor. He fought to get the doctors to wash their hands before attending the women, and when he managed to enforce this, in 1847, the incidence of childbed fever dropped precipitously. The insulted doctors, proud of their professional filth, revolted at this, however, and finally managed to do their work with dirty hands again. The incidence of childbed fever climbed as rapidly as it had fallen—but that didn't bother the doctors.
The crucial moment came with the work of the French chemist Louis Pasteur (1822–95). Although he was a chemist, his work had turned him more and more toward microscopes and microorganisms, and in 1865 he set to work studying a silkworm disease that was destroying France's silk industry. Using his microscope, he discovered a tiny parasite infesting the silkworms and the mulberry leaves that were fed to them. Pasteur's solution was drastic but rational. All infested worms and infested food must be destroyed. A new beginning must be made with healthy worms and the disease would be wiped out. His advice was followed and it worked. The silk industry was saved.
This turned Pasteur's interest to contagious diseases. It seemed to him that if the silkworm disease was the product of microscopic parasites, other diseases might be, too, and thus was born the "germ theory of disease." Fracastoro's invisible infectious agents were microorganisms, often the bacteria that Cohn was just bringing clearly into the light of day.
It now became possible to attack infectious disease rationally, making use of a technique that had been introduced to medicine over half a century before. In 1798, the English physician Edward Jenner (1749–1823) had shown that people inoculated with the mild disease, cowpox, or vaccinia in Latin, acquired immunity not only to cowpox itself but also to the related but very virulent and dreaded disease, smallpox. The technique of "vaccination" virtually ended most of the devastation of smallpox.
Unfortunately, no other diseases were found to occur in such convenient pairs, with the mild one conferring immunity from the serious one. Nevertheless, with the notion of the germ theory the technique could be extended in another way.
Pasteur located specific germs associated with specific diseases, then weakened those germs by heating them or in other ways, and used the weakened germs for inoculation. Only a very mild disease was produced but immunity was conferred against the dangerous one. The first disease treated in this way was the deadly anthrax that ravaged herds of domestic animals.
Similar work was pursued even more successfully by the German bacteriologist Robert Koch (1843–1910). Antitoxins designed to neutralize bacterial poisons were also developed.
Meanwhile, the English surgeon Joseph Lister (1827–1912), the son of the inventor of the achromatic microscope, had followed up Semmelweis's work. Once he learned of Pasteur's research, he had a convincing rationale and began to insist that, before operating, surgeons wash their hands in solutions of chemicals known to kill bacteria. From 1867 on, the practice of "antiseptic surgery" spread quickly.
The germ theory also sped the adoption of rational preventive measures—personal hygiene, such as washing and bathing; careful disposal of wastes; the guarding of the cleanliness of food and water. Leaders in this were the German scientists Max Joseph von Pettenkofer (1818–1901) and Rudolf Virchow (1821–1902). They themselves did not accept the germ theory of disease, but their recommendations would not have been followed as readily had others not.
In addition, it was discovered that diseases such as yellow fever and malaria were transmitted by mosquitoes, typhus fever by lice, Rocky Mountain spotted fever by ticks, bubonic plague by fleas and so on. Measures against these small germ-transferring organisms acted to reduce the incidence of the diseases. Men such as the Americans Walter Reed (1851–1902) and Howard Taylor Ricketts (1871–1910) and the Frenchman Charles J. Nicolle (1866–1936) were involved in such discoveries.
The German bacteriologist Paul Ehrlich (1854–1915) pioneered the use of specific chemicals that would kill particular bacteria without killing the human being in whom they existed. His most successful discovery came in 1910, when he found an arsenic compound that was active against the bacterium that causes syphilis.
This sort of work culminated in the discovery of the antibacterial effect of sulfanilamide and related compounds, beginning with the work of the German biochemist Gerhard Domagk (1895–1964) in 1935 and of antibiotics, beginning with the work of the French-American microbiologist René Jules Dubos (1901–[1982]) in 1939.
As late as 1955 came a victory over poliomyelitis, thanks to a vaccine prepared by the American microbiologist Jonas Edward Salk (1914–[1995]).
And yet victory is not total. Right now, the once ravaging disease of smallpox seems to be wiped out. Not one case exists, as far as we know, in the entire world. There are, however, infectious diseases, such as a few found in Africa, that are very contagious, virtually 100 percent fatal, and for which no cure exists. Careful hygienic measures have made it possible for such diseases to be studied without their spreading, and no doubt effective countermeasures will be worked out.
New Disease
It would seem, then, that as long as our civilization survives and our medical technology is not shattered there is no longer any danger that infectious disease will produce catastrophe or even anything like the disasters of the Black Death and the Spanish influenza. Yet, old familiar diseases have, within them, the potentiality of arising in new forms.
The human body (like all living organisms) has natural defenses against the invasion of foreign organisms. Antibodies are developed in the bloodstream that neutralize toxins or the microorganisms themselves. White cells in the bloodstream physically attack bacteria.
Evolutionary processes generally make the fight an even one. Those organisms more efficient at self-protection against microorganisms tend to survive and pass on their efficiency to their offspring. Nevertheless, microorganisms are far smaller even than insects and far more fecund. They evolve much more quickly, with individual microorganisms almost totally unimportant in the scheme of things.
Considering the uncounted numbers of microorganisms of any particular species that are continually multiplying by cell fission, large numbers of mutations must be produced just as continually. Every once in a while such a mutation may act to make a particular disease far more infectious and deadly. Furthermore, it may sufficiently alter the chemical nature of the microorganism so that the antibodies which the host organism is capable of manufacturing are no longer usable. The result is the sudden onslaught of an epidemic. The Black Death was undoubtedly brought about by a mutant strain of the microorganism causing it.
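The arithmetic behind "large numbers of mutations must be produced just as continually" can be sketched in a few lines. The figures below are purely illustrative assumptions, not numbers from the text: a population size and a mutation frequency chosen only to show the scale involved.

```python
# Illustrative figures only: neither number comes from the text.
population = 10**12        # bacteria dividing in one generation (assumed)
one_mutation_in = 10**9    # a given mutation occurs once per billion divisions (assumed)

# Expected number of new mutants of that one kind, per generation.
expected_mutants = population // one_mutation_in
print(expected_mutants)  # 1000
```

Even a one-in-a-billion event thus recurs by the thousand in every generation, which is why a fecund species of microorganism can explore far more genetic variations, far faster, than its large and slow-breeding hosts.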
Eventually, though, those human beings who are most susceptible die, and the relatively resistant survive, so that the virulence of the disease dies down. In that case, is the human victory over the pathogenic microorganism permanent? Might not new strains of germs arise? They might and they do. Every few years a new strain of flu rises to pester us. It is possible, however, to produce vaccines against such a new strain once it makes an appearance. Thus, when a single case of "swine flu" appeared in 1976, a full-scale mass vaccination was set in action. It turned out not to be needed, but it showed what could be done.
Copyright © 1979 by Isaac Asimov, A Choice of Catastrophes: The Disasters That Threaten Our World, originally published by Simon & Schuster. Reprinted with permission from the Asimov estate.
[This article was originally published on June 8th, 2020 as part of a standalone magazine called GOOD10: The Pandemic Issue. Produced as a partnership among LeapsMag, The Aspen Institute, and GOOD, the magazine is available for free online.]