Sarah Mancoll was 22 years old when she noticed a bald spot on the back of her head. A dermatologist confirmed that it was alopecia areata, an autoimmune disorder that causes hair loss.
She successfully treated the condition with corticosteroid shots for nearly 10 years. Then Mancoll and her husband began thinking about starting a family. Would the shots be safe for her while pregnant? For the fetus? What about breastfeeding?
Mancoll consulted her primary care physician, her dermatologist, even a pediatrician. Without clinical data, no one could give her a definitive answer, so she stopped treatment to be "on the safe side." By the time her son was born, she'd lost at least half her hair. She returned to her Washington, D.C., public policy job two months later entirely bald—and without either eyebrows or eyelashes.
After having two more children in quick succession, Mancoll recently resumed the shots but didn't forget her experience. Today, she is an advocate for including more pregnant and lactating women in clinical studies so they can have more information about therapies than she did.
"I live a very privileged life, and I'll do just fine with or without hair, but it's not just about me," Mancoll said. "It's about a huge population of women who are being disenfranchised…They're invisible."
About 4 million women give birth each year in the United States, and many face medical conditions, from hypertension and diabetes to psychiatric disorders. A 2011 study showed that most women reported taking at least one medication while pregnant between 1976 and 2008. But for decades, pregnant and lactating women have been largely excluded from clinical drug studies that rigorously test medications for safety and effectiveness.
An estimated 98 percent of government-approved drug treatments between 2000 and 2010 had insufficient data to determine risk to the fetus, and close to 75 percent had no human pregnancy data at all. All told, of 213 new pharmaceuticals approved from 2003 to 2012, only five percent included any data from pregnant women.
But recent developments suggest that could be changing. Amid widespread concerns about increased maternal mortality rates, women's health advocates, physicians, and researchers are sensing and encouraging a cultural shift toward protecting women through responsible research instead of from research.
"The question is not whether to do research with pregnant women, but how," Anne Drapkin Lyerly, professor and associate director of the Center for Bioethics at the University of North Carolina at Chapel Hill, wrote last year in an op-ed. "These advances are essential. It is well past time—and it is morally imperative—for research to benefit pregnant women."
"In excluding pregnant women from drug trials to protect them from experimentation, we subject them to uncontrolled experimentation."
To that end, the American College of Obstetricians and Gynecologists' Committee on Ethics acknowledged that research trials need to be better designed so they don't "inappropriately constrain the reproductive choices of study participants or unnecessarily exclude pregnant women." A federal task force also called for significantly expanded research and the removal of regulatory barriers that make it difficult for pregnant and lactating women to participate in research.
Several months ago, a government change to a regulation known as the Common Rule took effect, removing pregnant women as a "vulnerable population" in need of special protections, a designation that had made it more difficult to enroll them in clinical drug studies. And just last week, the U.S. Food and Drug Administration (FDA) issued new draft guidances for industry on when and how to include pregnant and lactating women in clinical trials.
Inclusion is better than the absence of data on their treatment, said Catherine Spong, former chair of the federal task force.
"It's a paradox," said Spong, professor of obstetrics and gynecology and chief of maternal fetal medicine at University of Texas Southwestern Medical Center. "There is a desire to protect women and fetuses from harm, which is translated to a reluctance to include them in research. By excluding them, the evidence for their care is limited."
Jacqueline Wolf, a professor of the history of medicine at Ohio University, agreed.
"In excluding pregnant women from drug trials to protect them from experimentation, we subject them to uncontrolled experimentation," she said. "We give them the medication without doing any research, and that's dangerous."
Women, of course, don't stop getting sick or having chronic medical conditions just because they are pregnant or breastfeeding, and conditions during pregnancy can affect a baby's health later in life. Evidence-based data is important for other reasons, too.
Pregnancy can dramatically change a woman's physiology, affecting how drugs act on her body and how her body acts or reacts to drugs. For instance, pregnant bodies can more quickly clear out medications such as glyburide, which is used to control high blood-sugar levels in diabetes during pregnancy; uncontrolled blood sugar can be toxic to the fetus and harmful to the woman. That means a standard dose of the drug may not be enough to control blood sugar and prevent poor outcomes.
Pregnant patients also may be reluctant to take needed drugs for underlying conditions (and doctors may be hesitant to prescribe them), which in turn can cause more harm to the woman and fetus than treatment would. For example, women who have severe asthma attacks while pregnant are at a higher risk of having low-birthweight babies, and pregnant women with uncontrolled diabetes in early pregnancy have more than four times the risk of birth defects.
For Kate O'Brien, taking medication during her pregnancy was a matter of life and death. A freelance video producer who lives in New Jersey, O'Brien was diagnosed with tuberculosis in 2015 after she became pregnant with her second child, a boy. Even as she signed hospital consent forms, she had no idea if the treatment would harm him.
"It's a really awful experience," said O'Brien, who now is active with We are TB, an advocacy and support network. "All they had to tell me about the medication was just that women have been taking it for a really long time all over the world. That was the best they could do."
More and more doctors, researchers and women's health organizations and advocates are calling that unacceptable.
By indicating that filling current knowledge gaps is "a critical public health need," the FDA is signaling its support for advancing research with pregnant women, said Lyerly, also co-founder of the Second Wave Initiative, which promotes fair representation of the health interests of pregnant women in biomedical research and policies. "It's a very important shift."
Research with pregnant women can be done ethically, Lyerly said, whether by systematically collecting data from those already taking medications or enrolling pregnant women in studies of drugs or vaccines in development.
Current clinical trials involving pregnant women are assessing treatments for obstructive sleep apnea, postpartum hemorrhage, lupus, and diabetes. Notable trials in development target malaria and HIV prevention in pregnancy.
"It clearly is doable to do this research, and test trials are important to provide evidence for treatment," Spong said. "If we don't have that evidence, we aren't making the best educated decisions for women."
Like any life-threatening medical condition that affects children, food allergies can traumatize more than just the patient. My wife and I learned this one summer afternoon when our daughter was three years old.
At an ice cream parlor, I gave Samantha a lick of my pistachio cone; within seconds, red blotches erupted on her skin, her lips began to swell, and she complained that her throat felt funny. We rushed her to the nearest emergency room, where a doctor injected her with epinephrine. Explaining that the reaction, known as anaphylaxis, could have been fatal if left unchecked, he advised us to have her tested for nut allergies—and to start carrying an injector of our own.
After an allergist confirmed Sam's vulnerability to tree nuts and peanuts, we figured that keeping her safe would be relatively simple. But food allergies often come in bunches. Over the next year, she wound up back in the ER after eating bread with sesame seeds at an Italian restaurant, and again after slurping buckwheat noodles at our neighborhood Japanese. She hated eggs, so we discovered that (less severe) allergy only when she vomited after eating a variety of products containing them.
In recent years, a growing number of families have had to grapple with such challenges. An estimated 32 million Americans have food allergies, or nearly 10 percent of the population—10 times the prevalence reported 35 years ago. The severity of symptoms seems to be increasing, too. According to a study released in January by Food Allergy Research & Education (FARE), a Virginia-based nonprofit, insurance claims for anaphylactic food reactions rose 377 percent in the U.S. from 2007 to 2016.
Because food allergies most commonly emerge in childhood, these trends are largely driven by the young. An insurance-industry study found that emergency room visits for anaphylaxis in children more than doubled from 2010 to 2016. Peanut allergies, once rare, tripled in kids between 1997 and 2008. "The first year, it was 1 in 250," says Scott Sicherer, chief of pediatric allergy and immunology at New York City's Mount Sinai Hospital, who led that study. "When we did the next round of research, in 2002, it was 1 in 125. I thought there must be a mistake. But by 2008, it was 1 in 70."
The forces behind these dire statistics—as well as similar numbers throughout the developed world—have yet to be positively identified. But the leading suspects are elements of our modern lifestyle that can throw the immune system out of whack, prompting potentially deadly overreactions to harmless proteins. Although parents can take a few steps that might lessen their children's risk, societal changes may be needed to brighten the larger epidemiological picture.
Meanwhile, scientists are racing to develop therapies that can induce patients' hyped-up immune defenses to chill. And lately, they've made some big strides toward that goal.
A Variety of Culprits
In the United States, about 90 percent of allergic reactions come from eight foods: milk, eggs, peanuts, tree nuts, soy, wheat, fish, and shellfish. The list varies from country to country, depending on dietary customs, but what the trigger foods all have in common is proteins that can survive breakdown in the stomach and enter the bloodstream more or less intact.
"When we were kids, we played in the dirt. Today, children tend to be on their screens, inside sealed buildings."
A food allergy results from a chain of biochemical misunderstandings. The first time the immune system encounters an allergen (as a protein that triggers an allergy is known), it mistakes the substance for a hostile invader—perhaps a parasite with a similar molecular profile. In response, it produces an antibody called immunoglobulin E (IgE), which is designed to bind to a specific protein and flag it for attack. These antibodies circulate through the bloodstream and attach to immune-system foot soldiers known as mast cells and basophils, which congregate in the nose, throat, lungs, skin, and gastrointestinal tract.
The next time the person is exposed to the allergen, the IgE antibodies signal the warrior cells to blast the intruder with histamines and other chemical weapons. Tissues in the affected areas swell and leak fluid; blood pressure may fall. Depending on the strength of the reaction, collateral damage to the patient can range from unpleasant—itching, runny nose, nausea—to catastrophic.
This kind of immunological glitchiness runs in families. Genome-wide association studies have identified a dozen genes linked to allergies of all types, and twin studies suggest that about 80 percent of the risk of food allergies is heritable. But why one family member shows symptoms while another doesn't remains unknown. Nor can genetics explain why food allergy rates have skyrocketed in such a brief period. For that, we must turn to the environment.
First, it's important to note that rates of all allergies are rising—including skin and respiratory afflictions—though none as rapidly or with as much risk of anaphylaxis as those involving food. The takeoff was already underway in the late 1980s, when British epidemiologist David P. Strachan found that children in larger households had fewer instances of hay fever. The reason, he suggested, was that their immune systems were strengthened by exposure to their siblings' germs. Since then, other researchers have discerned more evidence for Strachan's "hygiene hypothesis": higher rates of allergy (as well as autoimmune disorders) in cities versus rural areas, in industrialized countries versus developing ones, in lab animals raised under sterile conditions versus those exposed to germs.
Fending off a variety of pathogens, experts theorize, helps train the immune system to better distinguish friend from foe, and to respond to threats in a more nuanced manner. In an era of increasing urbanization, shrinking family sizes, and more sheltered lifestyles, such conditioning may be harder to come by. "When we were kids, we played in the dirt," observes Cathryn R. Nagler, a professor and food allergy researcher at the University of Chicago. "Today, children tend to be on their screens, inside sealed buildings."
But other factors may be driving the allergy epidemic as well. More time indoors, for example, means less exposure to sunlight, which can lead to a deficiency in vitamin D—a nutrient crucial to immune system regulation. The growing popularity of processed foods filled with refined fats and sugars may play a role, along with rising rates of obesity, by promoting tissue inflammation that could increase some people's risk of immunological mayhem. And the surge in allergies also correlates with several trends that may be altering the human microbiome, the community of microbes (including bacteria, viruses, and fungi, among others) that inhabits our guts, skin, and bodily orifices.
The microbiome connection may be particularly relevant to food allergies. In 2014, a team led by Nagler published a landmark study showing that Clostridia, a common class of gut bacteria, protects against these allergies. When the researchers fed peanut allergens to germ-free mice (born and raised in sterile conditions) and to mice treated with antibiotics as newborns (reducing their gut bacteria), the animals showed a strong immunological response. This sensitization could be reversed, however, by reintroducing Clostridia—but not another class of bacteria, Bacteroides—into the mice. Further experiments revealed that Clostridia caused immune cells to produce high levels of interleukin-22 (IL-22), a signaling molecule known to decrease the permeability of the intestinal lining.
"In simple terms," Nagler says, "what we found is that these bacteria prevent food allergens from gaining access to the blood in an intact form that elicits an allergic reaction."
A growing body of evidence suggests that our eating habits are throwing our gut microbiota off-balance, in part by depriving helpful species of the dietary fiber they feed on. Our increasing exposure to antibiotics and antimicrobial compounds may be harming our beneficial bugs as well. These depletions could affect kids from the moment they enter the world: Because babies are seeded with their mothers' microbiota as they pass through the birth canal, they may be inheriting a less diverse microbiome than did previous generations. And the rising rate of caesarean deliveries may be further depriving our children of the bugs they need.
So which culprit is most responsible for the food allergy upsurge? "The illnesses that we're measuring are complex," says Sicherer. "There are multiple genetic inputs, which interact with one another, and there are multiple environmental inputs, which interact with each other and with the genes. There's not one single thing that's causing this. It's a conglomeration."
What Parents Can Do
For anyone hoping to reduce their child's or their own odds of developing a food allergy (rates of adult onset are also increasing), the current state of science offers few guideposts. As with many other areas of health research, it's hard to know when the data is solid enough to warrant a particular course of action. A case in point: the American Academy of Pediatrics once recommended that children at risk of allergy to peanuts (as evidenced by family history, other food allergies, or eczema) wait to eat them until age three; now, the AAP advises those parents to start their babies at four months, citing epidemiological evidence that early exposure may prevent peanut allergies.
And it's all too easy for a layperson to draw mistaken conclusions from media coverage of such research—inferring, for instance, that taking commercially available probiotics might have a protective effect. Unfortunately, says Nagler, none of those products even contain the relevant kind of bacteria.
Although, as a research scientist, she refrains from giving medical advice, Nagler does suggest (based on a large body of academic literature) that two measures are worth a try: increasing consumption of fiber, and reducing use of antimicrobial agents, from antibacterial cleaners to antibiotics. Yet she acknowledges that it's not always possible to avoid the suspected risk factors for food allergies. Sometimes an antibiotic is a lifesaving necessity, for example—and it's tough to avoid exposure to such drugs altogether, due to their use in animal feed and their consequent presence in many foods and in the water supply. If these chemicals are contributing to the food allergy epidemic, protecting ourselves will require action from farmers, doctors, manufacturers, and policymakers.
My family's experience illustrates the limits of healthy lifestyle choices in mitigating allergy risk. My daughter and son were both delivered vaginally and breastfed, receiving maximum microbial seeding from their mother. As a family, we eat exemplary diets, and no one could describe our home as excessively clean. Yet one child can't taste nuts, sesame, or buckwheat without becoming dangerously ill. "You can do everything right and still have allergies," says Ian A. Myles, a staff clinician at the National Institute of Allergy and Infectious Diseases. "You can do everything wrong and not have allergies. The two groups overlap."
The Latest Science Shows Promise
But while preventing all food allergies is clearly unrealistic, researchers are making remarkable progress in developing better treatments—therapies that, instead of combating symptoms after they've started (like epinephrine or antihistamines), aim to make patients less sensitive to allergens in the first place. One promising approach is oral immunotherapy (OIT), in which patients consume small but slowly increasing amounts of an allergen, gradually reducing their sensitivity. A study published last year in the New England Journal of Medicine showed that an experimental OIT called AR101, consisting of a standardized peanut powder mixed into food, enabled 67 percent of participants to tolerate a dose equivalent to two peanut kernels—a potential lifesaver if they were accidentally exposed to the real thing.
Because OIT itself can trigger troublesome reactions in some patients, however, it's not for everyone. Another experimental treatment, sublingual immunotherapy (SLIT), uses an allergen solution or dissolving tablet placed beneath the tongue; although its results are less robust than OIT's, it seems to generate milder side effects. Epicutaneous immunotherapy (EPIT) avoids the mouth entirely, using a technology similar to a nicotine patch to deliver allergens through the skin. Researchers are also exploring the use of medications known as biologics, aiming to speed up the action of immunotherapies by suppressing IgE or targeting other immune-system molecules.
One downside of the immunotherapy approach is that in most cases the allergen must be taken indefinitely to maintain desensitization. To provide a potentially permanent fix, scientists are working on vaccines that use DNA or peptides (protein fragments) from allergens to reset patients' immune systems.
Nagler is attacking the problem from a different angle—one that starts with the microbiome. In a recent study, a follow-up to her peanut-allergy investigation, she and her colleagues found that Clostridia bacteria protect mice against milk allergy as well; they also identified a particular species responsible, known as Anaerostipes caccae. The bugs, the team determined, produce a short-chain fatty acid called butyrate, which modulates many immune activities crucial to maintaining a well-sealed gut.
These findings suggest that drugs based on microbial metabolites could help protect vulnerable individuals against a wide range of allergies. Nagler has launched a company, ClostraBio, to develop biotherapeutics based on this notion; she expects its first product, using synthetic butyrate, to be ready for clinical trials within the next two years.
My daughter could well be a candidate for such a medication. Sam, now 15, is a vibrant, resilient kid who handles her allergies with confidence and humor. Thanks to vigilance and luck (on her part as well as her parents'), she hasn't had another food-related ER visit in more than a decade; she's never had to use her EpiPen. Still, she says, she would welcome the arrival of a pill that could reduce the danger. "I've learned how to watch out for myself," she says. "But it would be nice not to have to be so careful."
Kira Peikoff was the editor-in-chief of Leaps.org from 2017 to 2021. As a journalist, her work has appeared in The New York Times, Newsweek, Nautilus, Popular Mechanics, The New York Academy of Sciences, and other outlets. She is also the author of four suspense novels that explore controversial issues arising from scientific innovation: Living Proof, No Time to Die, Die Again Tomorrow, and Mother Knows Best. Peikoff holds a B.A. in Journalism from New York University and an M.S. in Bioethics from Columbia University. She lives in New Jersey with her husband and two young sons. Follow her on Twitter @KiraPeikoff.