Scientists Are Growing an Edible Cholera Vaccine in Rice
The world's attention has been focused on the coronavirus crisis, but Yemen, Bangladesh and many other countries in Asia and Africa are also in the grip of another pandemic: cholera. The current cholera pandemic first emerged in the 1970s and has devastated many communities in low-income countries. Each year, cholera is responsible for an estimated 1.3 million to 4 million cases and 21,000 to 143,000 deaths worldwide.
Immunologist Hiroshi Kiyono and his team at the University of Tokyo hope they can be part of the solution: They're making a cholera vaccine out of rice.
"It is much less expensive than a traditional vaccine, by a long shot."
Cholera is caused by eating food or drinking water that's contaminated by the feces of a person infected with the cholera bacteria, Vibrio cholerae. The bacteria produce the cholera toxin in the intestines, leading to vomiting, diarrhea and severe dehydration. Cholera can kill within hours of infection if it's not treated quickly.
Current cholera vaccines are mainly oral. The most common oral vaccines are given in two doses and are made out of animal or insect cells that are infected with killed or weakened cholera bacteria. One of them, Dukoral, also includes cells infected with CTB, a non-harmful part of the cholera toxin. Scientists grow cells containing the cholera bacteria and the CTB in bioreactors, large tanks in which conditions can be carefully controlled.
These cholera vaccines offer moderate protection but it wears off relatively quickly. Cold storage can also be an issue. The most common oral vaccines can be stored at room temperature but only for 14 days.
"Current vaccines confer around 60% efficacy over five years post-vaccination," says Lucy Breakwell, who leads the U.S. Centers for Disease Control and Prevention's cholera work within Global Immunization Division. Given the limited protection, refrigeration issue, and the fact that current oral vaccines require two disease, delivery of cholera vaccines in a campaign or emergency setting can be challenging. "There is a need to develop and test new vaccines to improve public health response to cholera outbreaks."
A New Kind of Vaccine
Kiyono and scientists at the University of Tokyo are creating a new, plant-based cholera vaccine dubbed MucoRice-CTB. The researchers genetically modify rice so that it contains CTB, a non-harmful part of the cholera toxin. The rice is crushed into a powder, mixed with saline solution and then drunk. The digestive tract is lined with mucosal membranes, which contain the mucosal immune system. As the rice passes through the intestines, the mucosal immune system is trained to recognize the cholera toxin.
The cholera toxin has two main parts: the A subunit, which is harmful, and the B subunit, also known as CTB, which is nontoxic but allows the cholera bacteria to attach to gut cells. By inducing CTB-specific antibodies, "we might be able to block the binding of the toxin to gut cells, leading to the prevention of the toxin causing diarrhea," Kiyono says.
Kiyono studies the immune responses that occur at mucosal membranes across the body. He chose to focus on cholera because he wanted to replicate the way traditional vaccines work, getting mucosal membranes in the digestive tract to produce an immune response. The difference is that his team is creating a food-based vaccine to induce this response, and focusing solely on inducing antibodies against the cholera toxin. Since the toxin is what allows the bacteria to stick to gut cells, the hope is that these antibodies can block that process. Current cholera vaccines target the cholera bacteria, or both the bacteria and the toxin.
David Pascual, an expert in infectious diseases and immunology at the University of Florida, thinks that the MucoRice vaccine has huge promise. "I truly believe that the development of a food-based vaccine can be effective. CTB has a natural affinity for sampling cells in the gut to adhere, be processed, and then stimulate our immune system," he says. "In addition to vaccinating the gut, MucoRice has the potential to touch other mucosal surfaces in the mouth, which can help generate an immune response locally in the mouth and distally in the gut."
Cost Effectiveness
Kiyono says the MucoRice vaccine is much cheaper to produce than a traditional vaccine. Current vaccines need expensive bioreactors to grow cell cultures under very controlled, sterile conditions. This makes them expensive to manufacture, as different types of cell cultures need to be grown in separate buildings to avoid any chance of contamination. MucoRice doesn't require such an expensive manufacturing process because the rice plants themselves act as bioreactors.
The MucoRice vaccine also doesn't require the high cost of cold storage. Unlike traditional vaccines, it can be stored at room temperature for up to three years. "Plant-based vaccine development platforms present an exciting tool to reduce vaccine manufacturing costs, expand vaccine shelf life, and remove refrigeration requirements, all of which are factors that can limit vaccine supply and accessibility," Breakwell says.
Kathleen Hefferon, a microbiologist at Cornell University, agrees. "It is much less expensive than a traditional vaccine, by a long shot," she says. "The fact that it is made in rice means the vaccine can be stored for long periods on the shelf, without losing its activity."
A plant-based vaccine may even be able to address vaccine hesitancy, which has become a growing problem in recent years. Hefferon suggests that "using well-known food plants may serve to reduce the anxiety of some vaccine hesitant people."
Challenges of Plant Vaccines
Despite their advantages, no plant-based vaccines have been commercialized for human use. There are a number of reasons for this, ranging from the potential for too much variation in plants to the lack of facilities large enough to grow crops that comply with good manufacturing practices. Several plant vaccines for diseases like HIV and COVID-19 are in development, but they're still in early stages.
In developing the MucoRice vaccine, scientists at the University of Tokyo have tried to overcome some of the problems with plant vaccines. They've created a closed facility where they can grow rice plants directly in nutrient-rich water rather than soil. This ensures they can grow crops all year round in a space that satisfies regulations. There's also less chance for variation since the environment is tightly controlled.
Clinical Trials and Beyond
After successfully growing rice plants containing the vaccine, the team carried out their first clinical trial, which was completed early this year. Thirty participants received a placebo and 30 received the vaccine; all were Japanese men between the ages of 20 and 40. Of those who got the vaccine, 60 percent produced antibodies against the cholera toxin, with no side effects. It was a promising result, but there are still some issues Kiyono's team needs to address.
The vaccine may not provide enough protection on its own. The antigen in any vaccine is the substance it contains to induce an immune response. For the MucoRice vaccine, the antigen is not the cholera bacteria itself but the cholera toxin the bacteria produces.
"The development of the antigen in rice is innovative," says David Sack, a professor at John Hopkins University and expert in cholera vaccine development. "But antibodies against only the toxin have not been very protective. The major protective antigen is thought to be the LPS." LPS, or lipopolysaccharide, is a component of the outer wall of the cholera bacteria that plays an important role in eliciting an immune response.
The Japanese team is considering getting the rice to also express the O antigen, a core part of the LPS. Further investigation and clinical trials will look into improving the vaccine's efficacy.
Beyond cholera, Kiyono hopes that the vaccine platform could one day be used to make cost-effective vaccines for other pathogens, such as norovirus or coronavirus.
"We believe the MucoRice system may become a new generation of vaccine production, storage, and delivery system."
The Internet has made it easier than ever to mislead people. The anti-vaxx movement, climate change denial, protests against stem cell research, and other movements like these are rooted in the spread of misinformation and a distrust of science.
"I had been taught intelligent design and young-earth creationism instead of evolution, geology, and biology."
Science illiteracy is pervasive in the communities responsible for these movements. For the mainstream, the challenge lies not in sharing the facts, but in combating the spread of misinformation and facilitating an open dialogue between experts and nonexperts.
I grew up in a household that was deeply skeptical of science and medicine. My parents are evangelical Christians who believe the word of the Bible is law. To protect my four siblings and me from secular influence, they homeschooled some of us and put the others in private Christian schools. When my oldest brother left for a Christian college and the tuition began to add up, I was placed in a public charter school to offset the costs.
There, I became acutely aware of my ignorant upbringing. I had been taught intelligent design and young-earth creationism instead of evolution, geology, and biology. My mother skipped over world religions, and much of my history curriculum was more Bible-based than factual. She warned me that stem cell research, vaccines, genetic modification of crops, and other areas of research in biological science were examples of humans trying to be like God. At the time, biologist Richard Dawkins' The God Delusion was a bestseller and science seemed like an excuse to not believe in God, so she and my father discouraged me from studying it.
The gaps in my knowledge left me feeling frustrated and embarrassed. The solution was to learn about the things that had been censored from my education, but several obstacles stood in the way.
"When I first learned about fundamentalism, my parents' behavior finally made sense."
I lacked a good foundation in basic mathematics after being taught by my mother, who never graduated college. My father, who holds a graduate degree in computer science, repeatedly told me that I inherited my mother's "bad math genes" and was therefore ill-equipped for science. While my brothers excelled at math under his supervision and were even encouraged toward careers in engineering and psychology, I was expected to do well in other subjects, such as literature. When I tried to change this by enrolling in honors math and science classes, they scolded me -- so reluctantly, I dropped math. By the time I graduated high school, I was convinced that math and science were beyond me.
When I look back at my high school transcripts, that sense of failure was unfounded: my grades were mostly A's and B's, and I excelled in honors biology. Even my elementary standardized test scores don't reflect a student disinclined toward STEM, because I consistently scored in the top percentile for sciences. Teachers often encouraged me to consider studying science in college. Why then, I wondered, did my parents reject that idea? Why did they work so hard to sway me from that path? It wasn't until I moved away from my parents' home and started working to put myself through community college that I discovered my passion for both biology and science writing.
As a young adult venturing into the field of science communication, I've become fascinated with understanding communities that foster antagonistic views toward science. When I first learned about fundamentalism, my parents' behavior finally made sense. It is the foundation of the Religious Right, a right-wing Christian group which heavily influences the Republican party in the United States. The Religious Right crusades against secular education, stem cell research, abortion, evolution, and other controversial issues in science and medicine on the basis that they contradict Christian beliefs. They are quietly overturning the separation of church and state in order to enforce their religion as policy -- at the expense of science and progress.
Growing up in this community, I learned that strong feelings about these issues arise from both a lack of science literacy and a distrust of experts. Those who are against genetic modification of crops don't understand that GMO research aims to produce more, and longer-lasting, food for a growing planet. The anti-vaxx movement is still relying on a deeply flawed study that was ultimately retracted. Those who are against stem cell research don't understand how it works or the important benefits it provides the field of medicine, such as discovering new treatment methods.
In fact, at one point the famous Christian radio show Focus on the Family spread anti-vaxx mentality when they discussed vaccines that, long ago, were derived from aborted fetal cells. Although Focus on the Family now endorses vaccines, at the time it was enough to convince my own mother, who listened to the show every morning, not to vaccinate us unless the law required it.
"In everyday interactions with skeptics, science communicators need to shift their focus from convincing to discussing."
We can help clear up misunderstandings by sharing the facts, but the real challenge lies in willful ignorance. It was hard for me to accept, but I've come to understand that I'm not going to change anyone's mind. It's up to an individual to evaluate the facts, consider the arguments for and against, and make his or her own decision.
As my parents grew older and my siblings and I introduced them to basic concepts in science, they came around to trusting the experts a little more. They now see real doctors instead of homeopathic practitioners. They acknowledge our world's changing climate instead of denying it. And they even applaud two of their children for pursuing careers in science. Although they have held on to their fundamentalism and we still disagree on many issues, these basic changes give me hope that people in deeply skeptical communities are not entirely out of reach.
In everyday interactions with skeptics, science communicators need to shift their focus from convincing to discussing. This means creating an open dialogue with the intention of being understanding and helpful, not persuasive. This approach can be beneficial in both personal and online interactions. There are people within these movements who have doubts, and their doubts will grow as we continue to feed them through discussion.
People will only change their minds when it is the right time for them to do so. We need to be there ready to hold their hand and lead them toward truth when they reach out. Until then, all we can do is keep the channels of communication open, keep sharing the facts, and fight the spread of misinformation. Science is the pursuit of truth, and as scientists and science communicators, sometimes we need to let the truth speak for itself. We're just there to hold the megaphone.
Like any life-threatening medical condition that affects children, food allergies can traumatize more than just the patient. My wife and I learned this one summer afternoon when our daughter was three years old.
Emergency room visits for anaphylaxis in children more than doubled from 2010 to 2016.
At an ice cream parlor, I gave Samantha a lick of my pistachio cone; within seconds, red blotches erupted on her skin, her lips began to swell, and she complained that her throat felt funny. We rushed her to the nearest emergency room, where a doctor injected her with epinephrine. Explaining that the reaction, known as anaphylaxis, could have been fatal if left unchecked, he advised us to have her tested for nut allergies—and to start carrying an injector of our own.
After an allergist confirmed Sam's vulnerability to tree nuts and peanuts, we figured that keeping her safe would be relatively simple. But food allergies often come in bunches. Over the next year, she wound up back in the ER after eating bread with sesame seeds at an Italian restaurant, and again after slurping buckwheat noodles at our neighborhood Japanese restaurant. She hated eggs, so we discovered that (less severe) allergy only when she vomited after eating a variety of products containing them.
In recent years, a growing number of families have had to grapple with such challenges. An estimated 32 million Americans have food allergies, or nearly 10 percent of the population—10 times the prevalence reported 35 years ago. The severity of symptoms seems to be increasing, too. According to a study released in January by Food Allergy Research & Education (FARE), a Virginia-based nonprofit, insurance claims for anaphylactic food reactions rose 377 percent in the U.S. from 2007 to 2016.
Because food allergies most commonly emerge in childhood, these trends are largely driven by the young. An insurance-industry study found that emergency room visits for anaphylaxis in children more than doubled from 2010 to 2016. Peanut allergies, once rare, tripled in kids between 1997 and 2008. "The first year, it was 1 in 250," says Scott Sicherer, chief of pediatric allergy and immunology at New York City's Mount Sinai Hospital, who led that study. "When we did the next round of research, in 2002, it was 1 in 125. I thought there must be a mistake. But by 2008, it was 1 in 70."
The forces behind these dire statistics—as well as similar numbers throughout the developed world—have yet to be positively identified. But the leading suspects are elements of our modern lifestyle that can throw the immune system out of whack, prompting potentially deadly overreactions to harmless proteins. Although parents can take a few steps that might lessen their children's risk, societal changes may be needed to brighten the larger epidemiological picture.
Meanwhile, scientists are racing to develop therapies that can induce patients' hyped-up immune defenses to chill. And lately, they've made some big strides toward that goal.
A Variety of Culprits
In the United States, about 90 percent of allergic reactions come from eight foods: milk, eggs, peanuts, tree nuts, soy, wheat, fish, and shellfish. The list varies from country to country, depending on dietary customs, but what the trigger foods all have in common is proteins that can survive breakdown in the stomach and enter the bloodstream more or less intact.
"When we were kids, we played in the dirt. Today, children tend to be on their screens, inside sealed buildings."
A food allergy results from a chain of biochemical misunderstandings. The first time the immune system encounters an allergen (as a protein that triggers an allergy is known), it mistakes the substance for a hostile invader—perhaps a parasite with a similar molecular profile. In response, it produces an antibody called immunoglobulin E (IgE), which is designed to bind to a specific protein and flag it for attack. These antibodies circulate through the bloodstream and attach to immune-system foot soldiers known as mast cells and basophils, which congregate in the nose, throat, lungs, skin, and gastrointestinal tract.
The next time the person is exposed to the allergen, the IgE antibodies signal the warrior cells to blast the intruder with histamines and other chemical weapons. Tissues in the affected areas swell and leak fluid; blood pressure may fall. Depending on the strength of the reaction, collateral damage to the patient can range from unpleasant—itching, runny nose, nausea—to catastrophic.
This kind of immunological glitchiness runs in families. Genome-wide association studies have identified a dozen genes linked to allergies of all types, and twin studies suggest that about 80 percent of the risk of food allergies is heritable. But why one family member shows symptoms while another doesn't remains unknown. Nor can genetics explain why food allergy rates have skyrocketed in such a brief period. For that, we must turn to the environment.
First, it's important to note that rates of all allergies are rising—including skin and respiratory afflictions—though none as rapidly or with as much risk of anaphylaxis as those involving food. The takeoff was already underway in the late 1980s, when British epidemiologist David P. Strachan found that children in larger households had fewer instances of hay fever. The reason, he suggested, was that their immune systems were strengthened by exposure to their siblings' germs. Since then, other researchers have discerned more evidence for Strachan's "hygiene hypothesis": higher rates of allergy (as well as autoimmune disorders) in cities versus rural areas, in industrialized countries versus developing ones, in lab animals raised under sterile conditions versus those exposed to germs.
Fending off a variety of pathogens, experts theorize, helps train the immune system to better distinguish friend from foe, and to respond to threats in a more nuanced manner. In an era of increasing urbanization, shrinking family sizes, and more sheltered lifestyles, such conditioning may be harder to come by. "When we were kids, we played in the dirt," observes Cathryn R. Nagler, a professor and food allergy researcher at the University of Chicago. "Today, children tend to be on their screens, inside sealed buildings."
But other factors may be driving the allergy epidemic as well. More time indoors, for example, means less exposure to sunlight, which can lead to a deficiency in vitamin D—a nutrient crucial to immune system regulation. The growing popularity of processed foods filled with refined fats and sugars may play a role, along with rising rates of obesity, by promoting tissue inflammation that could increase some people's risk of immunological mayhem. And the surge in allergies also correlates with several trends that may be altering the human microbiome, the community of microbes (including bacteria, viruses, and fungi, among others) that inhabits our guts, skin, and bodily orifices.
The microbiome connection may be particularly relevant to food allergies. In 2014, a team led by Nagler published a landmark study showing that Clostridia, a common class of gut bacteria, protects against these allergies. When the researchers fed peanut allergens to germ-free mice (born and raised in sterile conditions) and to mice treated with antibiotics as newborns (reducing their gut bacteria), the animals showed a strong immunological response. This sensitization could be reversed, however, by reintroducing Clostridia—but not another class of bacteria, Bacteroides—into the mice. Further experiments revealed that Clostridia caused immune cells to produce high levels of interleukin-22 (IL-22), a signaling molecule known to decrease the permeability of the intestinal lining.
"In simple terms," Nagler says, "what we found is that these bacteria prevent food allergens from gaining access to the blood in an intact form that elicits an allergic reaction."
A growing body of evidence suggests that our eating habits are throwing our gut microbiota off-balance, in part by depriving helpful species of the dietary fiber they feed on. Our increasing exposure to antibiotics and antimicrobial compounds may be harming our beneficial bugs as well. These depletions could affect kids from the moment they enter the world: Because babies are seeded with their mothers' microbiota as they pass through the birth canal, they may be inheriting a less diverse microbiome than did previous generations. And the rising rate of caesarian deliveries may be further depriving our children of the bugs they need.
One expert suggests two measures worth a try: increasing consumption of fiber, and reducing use of antimicrobial agents, from antibacterial cleaners to antibiotics.
So which culprit is most responsible for the food allergy upsurge? "The illnesses that we're measuring are complex," says Sicherer. "There are multiple genetic inputs, which interact with one another, and there are multiple environmental inputs, which interact with each other and with the genes. There's not one single thing that's causing this. It's a conglomeration."
What Parents Can Do
For anyone hoping to reduce their child's or their own odds of developing a food allergy (rates of adult onset are also increasing), the current state of science offers few guideposts. As with many other areas of health research, it's hard to know when the data is solid enough to warrant a particular course of action. A case in point: the American Academy of Pediatrics once recommended that children at risk of allergy to peanuts (as evidenced by family history, other food allergies, or eczema) wait to eat them until age three; now, the AAP advises those parents to start their babies at four months, citing epidemiological evidence that early exposure may prevent peanut allergies.
And it's all too easy for a layperson to draw mistaken conclusions from media coverage of such research—inferring, for instance, that taking commercially available probiotics might have a protective effect. Unfortunately, says Nagler, none of those products even contain the relevant kind of bacteria.
Although, as a research scientist, she refrains from giving medical advice, Nagler does suggest (based on a large body of academic literature) that two measures are worth a try: increasing consumption of fiber, and reducing use of antimicrobial agents, from antibacterial cleaners to antibiotics. Yet she acknowledges that it's not always possible to avoid the suspected risk factors for food allergies. Sometimes an antibiotic is a lifesaving necessity, for example—and it's tough to avoid exposure to such drugs altogether, due to their use in animal feed and their consequent presence in many foods and in the water supply. If these chemicals are contributing to the food allergy epidemic, protecting ourselves will require action from farmers, doctors, manufacturers, and policymakers.
My family's experience illustrates the limits of healthy lifestyle choices in mitigating allergy risk. Neither my daughter nor my son was born by C-section; both were breastfed as well, receiving maximum microbial seeding from their mother. As a family, we eat exemplary diets, and no one could describe our home as excessively clean. Yet one child can't taste nuts, sesame, or buckwheat without becoming dangerously ill. "You can do everything right and still have allergies," says Ian A. Myles, a staff clinician at the National Institute of Allergy and Infectious Diseases. "You can do everything wrong and not have allergies. The two groups overlap."
The Latest Science Shows Promise
But while preventing all food allergies is clearly unrealistic, researchers are making remarkable progress in developing better treatments—therapies that, instead of combating symptoms after they've started (like epinephrine or antihistamines), aim to make patients less sensitive to allergens in the first place. One promising approach is oral immunotherapy (OIT), in which patients consume small but slowly increasing amounts of an allergen, gradually reducing their sensitivity. A study published last year in the New England Journal of Medicine showed that an experimental OIT called AR101, consisting of a standardized peanut powder mixed into food, enabled 67 percent of participants to tolerate a dose equivalent to two peanut kernels—a potential lifesaver if they were accidentally exposed to the real thing.
Because OIT itself can trigger troublesome reactions in some patients, however, it's not for everyone. Another experimental treatment, sublingual immunotherapy (SLIT), uses an allergen solution or dissolving tablet placed beneath the tongue; although its results are less robust than OIT's, it seems to generate milder side effects. Epicutaneous immunotherapy (EPIT) avoids the mouth entirely, using a technology similar to a nicotine patch to deliver allergens through the skin. Researchers are also exploring the use of medications known as biologics, aiming to speed up the action of immunotherapies by suppressing IgE or targeting other immune-system molecules.
These findings suggest that drugs based on microbial metabolites could help protect vulnerable individuals against a wide range of allergies.
One downside of the immunotherapy approach is that in most cases the allergen must be taken indefinitely to maintain desensitization. To provide a potentially permanent fix, scientists are working on vaccines that use DNA or peptides (protein fragments) from allergens to reset patients' immune systems.
Nagler is attacking the problem from a different angle—one that starts with the microbiome. In a recent study, a follow-up to her peanut-allergy investigation, she and her colleagues found that Clostridia bacteria protect mice against milk allergy as well; they also identified a particular species responsible, known as Anaerostipes caccae. The bugs, the team determined, produce a short-chain fatty acid called butyrate, which modulates many immune activities crucial to maintaining a well-sealed gut.
These findings suggest that drugs based on microbial metabolites could help protect vulnerable individuals against a wide range of allergies. Nagler has launched a company, ClostraBio, to develop biotherapeutics based on this notion; she expects its first product, using synthetic butyrate, to be ready for clinical trials within the next two years.
My daughter could well be a candidate for such a medication. Sam, now 15, is a vibrant, resilient kid who handles her allergies with confidence and humor. Thanks to vigilance and luck (on her part as well as her parents'), she hasn't had another food-related ER visit in more than a decade; she's never had to use her EpiPen. Still, she says, she would welcome the arrival of a pill that could reduce the danger. "I've learned how to watch out for myself," she says. "But it would be nice not to have to be so careful."