How 30 Years of Heart Surgeries Taught My Dad How to Live
[Editor's Note: This piece is the winner of our 2019 essay contest, which prompted readers to reflect on the question: "How has an advance in science or medicine changed your life?"]
My father did not expect to live past the age of 50. Neither of his parents had done so. And he also knew how he would die: by heart attack, just as his father did.
My dad lived the first 40 years of his life with this knowledge buried in his bones. He started smoking at the age of 12, and was drinking before he was old enough to enlist in the Navy. He had a sarcastic, often cruel, sense of humor that could reduce my mother, my sister, and me to tears. He was not an easy man to live with, but that was okay by him - he didn't expect to live long.
In July of 1976, he had his first heart attack, days before his 40th birthday. I was 13, and my sister was 11. He needed quadruple bypass surgery. Our small town hospital was not equipped to do this type of surgery; he would have to be transported 40 miles away to a heart center. I understood this journey to mean that my father was seriously ill, and might die in the hospital, away from anyone he knew. And my father knew a lot of people - he was a popular high school English teacher, in a town with only three high schools. He knew generations of students and their parents. Our high school football team did a blood drive in his honor.
During a trip to Disney World in 1974, Dad was suffering from angina the entire time but refused to tell me (left) and my sister, Kris.
Quadruple bypass surgery in 1976 meant that my father's breastbone was cut open by a sternal saw. His ribcage was spread wide. After the bypass surgery, his bones would be pulled back together, and tied in place with wire. The wire would later be pulled out of his body when the bones knitted back together. It would take months before he was fully healed.
Dad was in the hospital for the rest of the summer and into the start of the new school year. Visiting him meant going farther than I could ride my bicycle; it meant planning a trip in the car and getting onto the interstate. The first time I was allowed to visit him in the ICU, he was lying in bed, and then pushed himself to sit up. The heart monitor he was attached to spiked up and down, and I fainted. I didn't know that heartbeats change when you move; television medical dramas never showed that - I honestly thought I had driven my father into another heart attack.
Only a few short years after that, my father returned to the big hospital to have his heart checked with a new advance in heart treatment: a CT scan. This would allow doctors to check for clogged arteries and treat them before a fatal heart attack. The procedure identified a dangerous blockage, and my father was admitted immediately. This time, however, there was no need to break bones to get to the problem; my father was home within a month.
During the late 1970's, my father changed none of his habits. He was still smoking, and he continued to drink. But now, he was also taking pills - pills to manage the pain. He would pop a nitroglycerin tablet under his tongue whenever he was experiencing angina (I have a vivid memory of him doing this during my driving lessons), but he never mentioned that he was in pain. Instead, he would snap at one of us, or joke that we were killing him.
Being the kind of guy he was, my father never wanted to talk about his health. Any admission of pain implied that he couldn't handle pain. He would try to "muscle through" his angina, as if his willpower would be stronger than his heart muscle. His efforts would inevitably fail, leaving him angry and ready to lash out at anyone or anything. He would blame one of us as the reason he "had" to take Valium or pop a nitro tablet. Dinners often ended in shouts and tears, with my father stalking off to the television room with a bottle of red wine.
In the 1980's, while I was in college, my father had another heart attack. But now, less than 10 years after his first, medicine had changed: our hometown hospital had the technology to run dye through my father's bloodstream, identify the blockages, and provide preventative care involving statins and blood thinners. In one procedure, doctors took blood vessels from my father's legs and sutured them in to replace damaged arteries around his heart. New advances in cholesterol medication and treatments for angina could extend my father's life by many years.
My father decided it was time to quit smoking. It was the first significant health step I had ever seen him take. Until then, he treated his heart issues as if they were inevitable, and there was nothing that he could do to change what was happening to him. Quitting smoking was the first sign that my father was beginning to move out of his fatalistic mindset - and the accompanying fatal behaviors that all pointed to an early death.
In 1986, my father turned 50. He had now lived longer than either of his parents. The habits he had learned from them could be changed. He had stopped smoking - what else could he do?
It was a painful decade for all of us. My parents divorced. My sister quit college. I moved to the other side of the country and stopped speaking to my father for almost 10 years. My father remarried, and divorced a second time. I stopped counting the number of times he was in and out of the hospital with heart-related issues.
In the early 1990's, my father reached out to me. I think he finally determined that, if he was going to have these extra decades of life, he wanted to make them count. He traveled across the country to spend a week with me, to meet my friends, and to rebuild his relationship with me. He did the same with my sister. He stopped drinking. He was more forthcoming about his health, and admitted that he was taking an antidepressant. His humor became less cruel and sadistic. He took an active interest in the world. He became part of my life again.
The 1990's was also the decade of angioplasty. My father explained it to me like this: during his next surgery, the doctors would place balloons in his arteries, and inflate them. The balloons would then be removed (or dissolve), leaving the artery open again for blood. He had several of these surgeries over the next decade.
When my father was in his 60's, he danced with me at my wedding. It was now 10 years past the time he had expected to live, and his life was transformed. He was living with a woman I had known since I was a child, and my wife and I would make regular visits to their home. My father retired from teaching, became an avid gardener, and always had a home project underway. He was a happy man.
Dancing with my father at my wedding in 1998.
Then, in the mid 2000's, my father faced another serious surgery. Years of arterial surgery, angioplasty, and damaged heart muscle were taking their toll. He opted to undergo a life-saving surgery at Cleveland Clinic. By this time, I was living in New York and my sister was living in Arizona. We both traveled to the Midwest to be with him. Dad was unconscious most of the time. We took turns holding his hand in the ICU, encouraging him to regain his will to live, and making outrageous threats if he didn't listen to us.
The nursing staff were wonderful. I remember telling them that my father had never expected to live this long. One of the nurses pointed out that most of the patients in their ward were in their 70's and 80's, and a few were in their 90's. She reminded me that just a decade earlier, most hospitals were unwilling to do the kind of surgery my father had received on patients his age. In the first decade of the 21st century, however, things were different: 90-year-olds could now undergo heart surgery and live another decade. My father was on the "young" side of their patients.
The Cleveland Clinic visit would be the last major heart surgery my father would have. Not that he didn't return to his local hospital a few times after that: he broke his neck -- not once, but twice! -- slipping on ice. And in the 2010's, he began to show signs of dementia, and needed more home care. His partner, who had her own health issues, was not able to provide the level of care my father needed. My sister invited him to move in with her, and in 2015, I traveled with him to Arizona to get him settled in.
After a few months, he accepted home hospice. We turned off his pacemaker when the hospice nurse explained to us that the job of a pacemaker is to literally jolt a patient's heart back into beating. The jolts were happening more and more frequently, causing my Dad additional, unwanted pain.
My father in 2015, a few months before his death.
My father died in February 2016. His body carried the scars and implants of 30 years of cardiac surgeries, from the ugly breastbone scar from the 1970's to scars on his arms and legs from borrowed blood vessels, to the tiny red circles of robotic incisions from the 21st century. The arteries and veins feeding his heart were a patchwork of transplanted leg veins and fragile arterial walls pressed thinner by balloons.
And my father died with no regrets or unfinished business. He died in my sister's home, with his long-time partner by his side. Medical advancements had given him the opportunity to live 30 years longer than he expected. But he was the one who decided how to live those extra years. He was the one who made the years matter.
The Internet has made it easier than ever to mislead people. The anti-vaxx movement, climate change denial, protests against stem cell research, and other movements like these are rooted in the spread of misinformation and a distrust of science.
Science illiteracy is pervasive in the communities responsible for these movements. For mainstream science communicators, the challenge lies not in sharing the facts, but in combating the spread of misinformation and facilitating an open dialogue between experts and nonexperts.
I grew up in a household that was deeply skeptical of science and medicine. My parents are evangelical Christians who believe the word of the Bible is law. To protect my four siblings and me from secular influence, they homeschooled some of us and put the others in private Christian schools. When my oldest brother left for a Christian college and the tuition began to add up, I was placed in a public charter school to offset the costs.
There, I became acutely aware of my ignorant upbringing. I had been taught intelligent design and young-earth creationism instead of evolution, geology, and biology. My mother skipped over world religions, and much of my history curriculum was more Bible-based than factual. She warned me that stem cell research, vaccines, genetic modification of crops, and other areas of research in biological science were examples of humans trying to be like God. At the time, biologist Richard Dawkins' The God Delusion was a bestseller, and science seemed like an excuse not to believe in God, so she and my father discouraged me from studying it.
The gaps in my knowledge left me feeling frustrated and embarrassed. The solution was to learn about the things that had been censored from my education, but several obstacles stood in the way.
I lacked a good foundation in basic mathematics after being taught by my mother, who never graduated college. My father, who holds a graduate degree in computer science, repeatedly told me that I inherited my mother's "bad math genes" and was therefore ill-equipped for science. While my brothers excelled at math under his supervision and were even encouraged toward careers in engineering and psychology, I was expected to do well in other subjects, such as literature. When I tried to change this by enrolling in honors math and science classes, they scolded me -- so reluctantly, I dropped math. By the time I graduated high school, I was convinced that math and science were beyond me.
When I look back at my high school transcripts, I can see that my sense of failure was unfounded: my grades were mostly A's and B's, and I excelled in honors biology. Even my elementary standardized test scores don't reflect a student disinclined toward STEM, because I consistently scored in the top percentile for sciences. Teachers often encouraged me to consider studying science in college. Why then, I wondered, did my parents reject that idea? Why did they work so hard to sway me from that path? It wasn't until I moved away from my parents' home and started working to put myself through community college that I discovered my passion for both biology and science writing.
As a young adult venturing into the field of science communication, I've become fascinated with understanding communities that foster antagonistic views toward science. When I first learned about fundamentalism, my parents' behavior finally made sense. It is the foundation of the Religious Right, a right-wing Christian group which heavily influences the Republican party in the United States. The Religious Right crusades against secular education, stem cell research, abortion, evolution, and other controversial issues in science and medicine on the basis that they contradict Christian beliefs. They are quietly overturning the separation of church and state in order to enforce their religion as policy -- at the expense of science and progress.
Growing up in this community, I learned that strong feelings about these issues arise from both a lack of science literacy and a distrust of experts. Those who are against genetic modification of crops don't understand that GMO research aims to produce more, and longer-lasting, food for a growing planet. The anti-vaxx movement is still relying on a deeply flawed study that was ultimately retracted. Those who are against stem cell research don't understand how it works or the important benefits it provides the field of medicine, such as discovering new treatment methods.
In fact, at one point the famous Christian radio show Focus on the Family spread anti-vaxx mentality when they discussed vaccines that, long ago, were derived from aborted fetal cells. Although Focus on the Family now endorses vaccines, at the time it was enough to convince my own mother, who listened to the show every morning, not to vaccinate us unless the law required it.
We can help clear up misunderstandings by sharing the facts, but the real challenge lies in willful ignorance. It was hard for me to accept, but I've come to understand that I'm not going to change anyone's mind. It's up to an individual to evaluate the facts, consider the arguments for and against, and make his or her own decision.
As my parents grew older and my siblings and I introduced them to basic concepts in science, they came around to trusting the experts a little more. They now see real doctors instead of homeopathic practitioners. They acknowledge our world's changing climate instead of denying it. And they even applaud two of their children for pursuing careers in science. Although they have held on to their fundamentalism and we still disagree on many issues, these basic changes give me hope that people in deeply skeptical communities are not entirely out of reach.
In everyday interactions with skeptics, science communicators need to shift their focus from convincing to discussing. This means creating an open dialogue with the intention of being understanding and helpful, not persuasive. This approach can be beneficial in both personal and online interactions. There are people within these movements who have doubts, and their doubts will grow as we continue to feed them through discussion.
People will only change their minds when it is the right time for them to do so. We need to be there ready to hold their hand and lead them toward truth when they reach out. Until then, all we can do is keep the channels of communication open, keep sharing the facts, and fight the spread of misinformation. Science is the pursuit of truth, and as scientists and science communicators, sometimes we need to let the truth speak for itself. We're just there to hold the megaphone.
Like any life-threatening medical condition that affects children, food allergies can traumatize more than just the patient. My wife and I learned this one summer afternoon when our daughter was three years old.
At an ice cream parlor, I gave Samantha a lick of my pistachio cone; within seconds, red blotches erupted on her skin, her lips began to swell, and she complained that her throat felt funny. We rushed her to the nearest emergency room, where a doctor injected her with epinephrine. Explaining that the reaction, known as anaphylaxis, could have been fatal if left unchecked, he advised us to have her tested for nut allergies—and to start carrying an injector of our own.
After an allergist confirmed Sam's vulnerability to tree nuts and peanuts, we figured that keeping her safe would be relatively simple. But food allergies often come in bunches. Over the next year, she wound up back in the ER after eating bread with sesame seeds at an Italian restaurant, and again after slurping buckwheat noodles at our neighborhood Japanese restaurant. She hated eggs, so we discovered that (less severe) allergy only when she vomited after eating a variety of products containing them.
In recent years, a growing number of families have had to grapple with such challenges. An estimated 32 million Americans have food allergies, or nearly 10 percent of the population—10 times the prevalence reported 35 years ago. The severity of symptoms seems to be increasing, too. According to a study released in January by Food Allergy Research & Education (FARE), a Virginia-based nonprofit, insurance claims for anaphylactic food reactions rose 377 percent in the U.S. from 2007 to 2016.
Because food allergies most commonly emerge in childhood, these trends are largely driven by the young. An insurance-industry study found that emergency room visits for anaphylaxis in children more than doubled from 2010 to 2016. Peanut allergies, once rare, tripled in kids between 1997 and 2008. "The first year, it was 1 in 250," says Scott Sicherer, chief of pediatric allergy and immunology at New York City's Mount Sinai Hospital, who led that study. "When we did the next round of research, in 2002, it was 1 in 125. I thought there must be a mistake. But by 2008, it was 1 in 70."
The forces behind these dire statistics—as well as similar numbers throughout the developed world—have yet to be positively identified. But the leading suspects are elements of our modern lifestyle that can throw the immune system out of whack, prompting potentially deadly overreactions to harmless proteins. Although parents can take a few steps that might lessen their children's risk, societal changes may be needed to brighten the larger epidemiological picture.
Meanwhile, scientists are racing to develop therapies that can induce patients' hyped-up immune defenses to chill. And lately, they've made some big strides toward that goal.
A Variety of Culprits
In the United States, about 90 percent of allergic reactions come from eight foods: milk, eggs, peanuts, tree nuts, soy, wheat, fish, and shellfish. The list varies from country to country, depending on dietary customs, but what the trigger foods all have in common is proteins that can survive breakdown in the stomach and enter the bloodstream more or less intact.
A food allergy results from a chain of biochemical misunderstandings. The first time the immune system encounters an allergen (as a protein that triggers an allergy is known), it mistakes the substance for a hostile invader—perhaps a parasite with a similar molecular profile. In response, it produces an antibody called immunoglobulin E (IgE), which is designed to bind to a specific protein and flag it for attack. These antibodies circulate through the bloodstream and attach to immune-system foot soldiers known as mast cells and basophils, which congregate in the nose, throat, lungs, skin, and gastrointestinal tract.
The next time the person is exposed to the allergen, the IgE antibodies signal the warrior cells to blast the intruder with histamines and other chemical weapons. Tissues in the affected areas swell and leak fluid; blood pressure may fall. Depending on the strength of the reaction, collateral damage to the patient can range from unpleasant—itching, runny nose, nausea—to catastrophic.
This kind of immunological glitchiness runs in families. Genome-wide association studies have identified a dozen genes linked to allergies of all types, and twin studies suggest that about 80 percent of the risk of food allergies is heritable. But why one family member shows symptoms while another doesn't remains unknown. Nor can genetics explain why food allergy rates have skyrocketed in such a brief period. For that, we must turn to the environment.
First, it's important to note that rates of all allergies are rising—including skin and respiratory afflictions—though none as rapidly or with as much risk of anaphylaxis as those involving food. The takeoff was already underway in the late 1980s, when British epidemiologist David P. Strachan found that children in larger households had fewer instances of hay fever. The reason, he suggested, was that their immune systems were strengthened by exposure to their siblings' germs. Since then, other researchers have discerned more evidence for Strachan's "hygiene hypothesis": higher rates of allergy (as well as autoimmune disorders) in cities versus rural areas, in industrialized countries versus developing ones, in lab animals raised under sterile conditions versus those exposed to germs.
Fending off a variety of pathogens, experts theorize, helps train the immune system to better distinguish friend from foe, and to respond to threats in a more nuanced manner. In an era of increasing urbanization, shrinking family sizes, and more sheltered lifestyles, such conditioning may be harder to come by. "When we were kids, we played in the dirt," observes Cathryn R. Nagler, a professor and food allergy researcher at the University of Chicago. "Today, children tend to be on their screens, inside sealed buildings."
But other factors may be driving the allergy epidemic as well. More time indoors, for example, means less exposure to sunlight, which can lead to a deficiency in vitamin D—a nutrient crucial to immune system regulation. The growing popularity of processed foods filled with refined fats and sugars may play a role, along with rising rates of obesity, by promoting tissue inflammation that could increase some people's risk of immunological mayhem. And the surge in allergies also correlates with several trends that may be altering the human microbiome, the community of microbes (including bacteria, viruses, and fungi, among others) that inhabits our guts, skin, and bodily orifices.
The microbiome connection may be particularly relevant to food allergies. In 2014, a team led by Nagler published a landmark study showing that Clostridia, a common class of gut bacteria, protects against these allergies. When the researchers fed peanut allergens to germ-free mice (born and raised in sterile conditions) and to mice treated with antibiotics as newborns (reducing their gut bacteria), the animals showed a strong immunological response. This sensitization could be reversed, however, by reintroducing Clostridia—but not another class of bacteria, Bacteroides—into the mice. Further experiments revealed that Clostridia caused immune cells to produce high levels of interleukin-22 (IL-22), a signaling molecule known to decrease the permeability of the intestinal lining.
"In simple terms," Nagler says, "what we found is that these bacteria prevent food allergens from gaining access to the blood in an intact form that elicits an allergic reaction."
A growing body of evidence suggests that our eating habits are throwing our gut microbiota off-balance, in part by depriving helpful species of the dietary fiber they feed on. Our increasing exposure to antibiotics and antimicrobial compounds may be harming our beneficial bugs as well. These depletions could affect kids from the moment they enter the world: Because babies are seeded with their mothers' microbiota as they pass through the birth canal, they may be inheriting a less diverse microbiome than did previous generations. And the rising rate of cesarean deliveries may be further depriving our children of the bugs they need.
So which culprit is most responsible for the food allergy upsurge? "The illnesses that we're measuring are complex," says Sicherer. "There are multiple genetic inputs, which interact with one another, and there are multiple environmental inputs, which interact with each other and with the genes. There's not one single thing that's causing this. It's a conglomeration."
What Parents Can Do
For anyone hoping to reduce their child's or their own odds of developing a food allergy (rates of adult onset are also increasing), the current state of science offers few guideposts. As with many other areas of health research, it's hard to know when the data is solid enough to warrant a particular course of action. A case in point: the American Academy of Pediatrics once recommended that children at risk of allergy to peanuts (as evidenced by family history, other food allergies, or eczema) wait to eat them until age three; now, the AAP advises those parents to start their babies at four months, citing epidemiological evidence that early exposure may prevent peanut allergies.
And it's all too easy for a layperson to draw mistaken conclusions from media coverage of such research—inferring, for instance, that taking commercially available probiotics might have a protective effect. Unfortunately, says Nagler, none of those products even contain the relevant kind of bacteria.
Although, as a research scientist, she refrains from giving medical advice, Nagler does suggest (based on a large body of academic literature) that two measures are worth a try: increasing consumption of fiber, and reducing use of antimicrobial agents, from antibacterial cleaners to antibiotics. Yet she acknowledges that it's not always possible to avoid the suspected risk factors for food allergies. Sometimes an antibiotic is a lifesaving necessity, for example—and it's tough to avoid exposure to such drugs altogether, due to their use in animal feed and their consequent presence in many foods and in the water supply. If these chemicals are contributing to the food allergy epidemic, protecting ourselves will require action from farmers, doctors, manufacturers, and policymakers.
My family's experience illustrates the limits of healthy lifestyle choices in mitigating allergy risk. Neither my daughter nor my son was born by C-section, and both were breastfed, receiving maximum microbial seeding from their mother. As a family, we eat exemplary diets, and no one could describe our home as excessively clean. Yet one child can't taste nuts, sesame, or buckwheat without becoming dangerously ill. "You can do everything right and still have allergies," says Ian A. Myles, a staff clinician at the National Institute of Allergy and Infectious Diseases. "You can do everything wrong and not have allergies. The two groups overlap."
The Latest Science Shows Promise
But while preventing all food allergies is clearly unrealistic, researchers are making remarkable progress in developing better treatments—therapies that aim to make patients less sensitive to allergens in the first place, rather than combating symptoms after they've started (as epinephrine and antihistamines do). One promising approach is oral immunotherapy (OIT), in which patients consume small but slowly increasing amounts of an allergen, gradually reducing their sensitivity. A study published last year in the New England Journal of Medicine showed that an experimental OIT called AR101, consisting of a standardized peanut powder mixed into food, enabled 67 percent of participants to tolerate a dose equivalent to two peanut kernels—a potential lifesaver if they were accidentally exposed to the real thing.
Because OIT itself can trigger troublesome reactions in some patients, however, it's not for everyone. Another experimental treatment, sublingual immunotherapy (SLIT), uses an allergen solution or dissolving tablet placed beneath the tongue; although its results are less robust than OIT's, it seems to generate milder side effects. Epicutaneous immunotherapy (EPIT) avoids the mouth entirely, using a technology similar to a nicotine patch to deliver allergens through the skin. Researchers are also exploring the use of medications known as biologics, aiming to speed up the action of immunotherapies by suppressing IgE or targeting other immune-system molecules.
One downside of the immunotherapy approach is that in most cases the allergen must be taken indefinitely to maintain desensitization. To provide a potentially permanent fix, scientists are working on vaccines that use DNA or peptides (protein fragments) from allergens to reset patients' immune systems.
Nagler is attacking the problem from a different angle—one that starts with the microbiome. In a recent study, a follow-up to her peanut-allergy investigation, she and her colleagues found that Clostridia bacteria protect mice against milk allergy as well; they also identified a particular species responsible, known as Anaerostipes caccae. The bugs, the team determined, produce a short-chain fatty acid called butyrate, which modulates many immune activities crucial to maintaining a well-sealed gut.
These findings suggest that drugs based on microbial metabolites could help protect vulnerable individuals against a wide range of allergies. Nagler has launched a company, ClostraBio, to develop biotherapeutics based on this notion; she expects its first product, using synthetic butyrate, to be ready for clinical trials within the next two years.
My daughter could well be a candidate for such a medication. Sam, now 15, is a vibrant, resilient kid who handles her allergies with confidence and humor. Thanks to vigilance and luck (on her part as well as her parents'), she hasn't had another food-related ER visit in more than a decade; she's never had to use her EpiPen. Still, she says, she would welcome the arrival of a pill that could reduce the danger. "I've learned how to watch out for myself," she says. "But it would be nice not to have to be so careful."