Eight Big Medical and Science Trends to Watch in 2021
The world as we know it has forever changed. With a greater focus on science and technology than before, experts in the biotech and life sciences spaces are grappling with what comes next as SARS-CoV-2, the coronavirus that causes the COVID-19 illness, has spread and mutated across the world.
Even with vaccines being distributed, so much still remains unknown.
Jared Auclair, Technical Supervisor for Northeastern University's Life Science Testing Center in Burlington, Massachusetts, guides a COVID testing lab that cranks out thousands of coronavirus test results per day. His lab also monitors the quality of new cell and gene therapy products coming to market.
Here are trends Auclair and other experts are watching in 2021.
Better Diagnostic Testing for COVID
Expect improvements in COVID diagnostic testing and the ability to test at home.
There are currently three types of coronavirus tests. The molecular test—also known as the RT-PCR test—detects the virus's genetic material and is highly accurate, but it can take days to return results. There are also antibody tests, done through a blood draw, designed to show whether you've had COVID in the past. Finally, there's the quick antigen test, which isn't as accurate as the PCR test but can identify whether people are likely to infect others.
Last month, Lucira Health secured U.S. FDA Emergency Use Authorization for the first prescription molecular diagnostic test for COVID-19 that can be performed at home. On December 15th, the Ellume Covid-19 Home Test received authorization as the first over-the-counter COVID-19 diagnostic antigen test that can be done at home without a prescription. The test uses a nasal swab that connects to a smartphone app and returns results in 15-20 minutes. Similarly, the BinaxNOW COVID-19 Ag Card Home Test received authorization on Dec. 16 for its 15-minute antigen test that can be used within the first seven days of onset of COVID-19 symptoms.
Home testing has the potential to impact the pandemic pretty drastically, Auclair says, but there are other considerations: the type and timing of the test administered, how expensive the test is (and whether it is financially feasible for the general public) and whether a home test taker can accurately administer the test.
"The vaccine roll-out will not eliminate the need for testing until late 2021 or early 2022."
Ideally, everyone would frequently get tested, but that would mean the cost of a single home test—which is expected to be around $30 or more—would need to be much cheaper, more in the $5 range.
Auclair expects "innovations in the diagnostic space to explode" with the need for more accurate, inexpensive, quicker COVID tests. Auclair foresees innovations to be at first focused on COVID point-of-care testing, but he expects improvements within diagnostic testing for other types of viruses and diseases too.
"We still need more testing to get the pandemic under control, likely over the next 12 months," Auclair says. "The vaccine roll-out will not eliminate the need for testing until late 2021 or early 2022."
Rise of mRNA-based Vaccines and Therapies
A year ago, vaccines weren't being talked about like they are today.
"But clearly vaccines are the talk of the town," Auclair says. "The reason we got a vaccine so fast was there was so much money thrown at it."
A vaccine can take more than 10 years to fully develop, according to the World Economic Forum. Prior to the new COVID vaccines, which were remarkably developed and tested in under a year, the fastest vaccine ever made was for mumps, and it took four years.
"Normally you have to produce a protein. This is typically done in eggs. It takes forever," says Catherine Dulac, a neuroscientist and developmental biologist at Harvard University who won the 2021 Breakthrough Prize in Life Sciences. "But an mRNA vaccine just enabled [us] to skip all sorts of steps [compared with burdensome conventional manufacturing] and go directly to a product that can be injected into people."
Non-traditional medicines based on genetic research are in their infancy. With mRNA-based vaccines hitting the market for the first time, look for more vaccines to be developed against viruses that currently lack them, like dengue virus and Ebola, Auclair says.
"There's a whole bunch of things that could be explored now that haven't been thought about in the past," Auclair says. "It could really be a game changer."
Vaccine Innovation over the last 140 years.
Max Roser/Our World in Data (Creative Commons license)
Advancements in Cell and Gene Therapies
CRISPR, a type of gene editing, is going to be huge in 2021, especially after the Nobel Prize in Chemistry was awarded to Emmanuelle Charpentier and Jennifer Doudna in October for pioneering the technology.
Right now, CRISPR isn't completely precise and can cause deletions or rearrangements of DNA.
"It's definitely not there yet, but over the next year it's going to get a lot closer and you're going to have a lot of momentum in this space," Auclair says. "CRISPR is one of the technologies I'm most excited about and 2021 is the year for it."
Gene therapies are typically used for rare genetic diseases. They work by replacing faulty genes with corrected DNA code.
"Cell and gene therapies are really where the field is going," Auclair says. "There is so much opportunity.... For the first time in our life, in our existence as a species, we may actually be able to cure disease by using [techniques] like gene editing, where you cut in and out pieces of DNA that caused a disease and put in healthy DNA."
For example, spinal muscular atrophy is a rare genetic disorder that leads to muscle weakness, paralysis and death in children by age two. As of last year, afflicted children can receive a gene therapy drug called Zolgensma that replaces the missing or nonworking SMN1 gene with a new copy.
Another recent breakthrough uses gene editing for sickle cell disease. Victoria Gray, a mom from Mississippi whose treatment was followed exclusively by NPR, was the first person in the United States to be successfully treated for the genetic disorder with the help of CRISPR. She has continued to improve since her landmark treatment on July 2, 2019, and her once-debilitating pain has greatly eased.
"This is really a life-changer for me," she told NPR. "It's magnificent."
"You are going to see bigger leaps in gene therapies."
Look out also for improvements in cell therapies, but on a much lesser scale.
Cell therapies remove immune cells from a person or use cells from a donor. The cells are modified or cultured in a lab, multiplied by the millions and then injected back into the patient. These include stem cell therapies as well as CAR-T cell therapies, which are typically last-resort treatments used against cancers like leukemia, Auclair says.
"You are going to see bigger leaps in gene therapies," Auclair says. "It's being heavily researched and we understand more about how to do gene therapies. Cell therapies will lag behind a bit because they are so much more difficult to work with right now."
More Monoclonal Antibody Therapies
Look for more customized drugs to personalize medicine even more in the biotechnology space.
In 2019, the FDA anticipated receiving more than 200 Investigational New Drug (IND) applications in 2020. But with COVID, the number of INDs skyrocketed to 6,954 applications for the 2020 fiscal year, which ended September 30, 2020, according to the FDA's online tracker. Look for antibody therapies to play a bigger role.
Monoclonal antibodies are lab-grown proteins that mimic or enhance the immune system's response to fight off pathogens, like viruses, and they've been used to treat cancer. Now they are being used to treat patients with COVID-19.
President Donald Trump received a monoclonal antibody cocktail, called REGN-COV2, which later received FDA emergency use authorization.
A newer type of monoclonal antibody therapy is Antibody-Drug Conjugates, also called ADCs. It's something we're going to be hearing a lot about in 2021, Auclair says.
"An Antibody-Drug Conjugate is a monoclonal antibody with a chemical, what we consider a chemical warhead, on it," Auclair says. "The monoclonal antibody binds to a specific antigen or protein in your body, delivers a chemical to that location and kills the infected cell."
Moving Beyond Male-Centric Lab Testing
Biological testing has, until recently, focused on males. Dulac, a Howard Hughes Medical Institute Investigator and professor of molecular and cellular biology at Harvard University, challenged that idea to find the brain circuitry behind sex-specific behaviors.
"For the longest time, until now, all the model systems in biology are male," Dulac says. "The idea is if you do testing on males, you don't need to do testing on females."
Both clinical models and fundamental research rely on male animals. Because biological research is overwhelmingly done on male models, Dulac says, the outcomes and understanding in biology are geared toward male biology.
"All the drugs currently on the market and diagnoses of diseases are biased towards the understanding of male biology," Dulac says. "The diagnostics of diseases is way weaker in women than men."
That means the treatment isn't necessarily as good for women as men, she says, including what is known and understood about pain medication.
"So pain medication doesn't work well in women," Dulac says. "It works way better in men. It's true for almost all diseases that I know. Why? because you have a science that is dominated by males."
Although some in the scientific community contend that females are not interesting, or are too complicated to study because of their hormonal variations, Dulac says that's simply not true.
"There's absolutely no reason to decide 50% of life forms are interesting and the other 50% are not interesting. What about looking at both?" says Dulac, who was awarded the $3 million Breakthrough Prize in Life Sciences in September for connecting specific neural mechanisms to male and female parenting behaviors.
Disease Research on Single Cells
To better understand how diseases manifest in the body's cells and tissues, many researchers are looking at single-cell biology. Cells are the most fundamental building blocks of life, and much about them remains to be learned.
"A remarkable development this year is the massive use of analysis of gene expression and chromosomal regulation at the single-cell level," Dulac says.
Much is focused on the Human Cell Atlas (HCA), a global initiative to map all cells in healthy humans and to better identify which genes associated with diseases are active in a person's body. Most estimates put the number of cells around 30 trillion.
Dulac points to work being conducted by the BRAIN Initiative Cell Census Network (BICCN), a National Institutes of Health effort to build an atlas of cell types in mouse, human and non-human primate brains, and to the Chan Zuckerberg Initiative's funding of single-cell biology projects, including those focused on single-cell analysis of inflammation.
"Our body and our brain are made of a large number of cell types," Dulac says. "The ability to explore and identify differences in gene expression and regulation in massively multiplex ways by analyzing millions of cells is extraordinarily important."
Converting Plastics into Food
Yep, you heard it right, plastics may eventually be turned into food. The Defense Advanced Research Projects Agency, better known as DARPA, is funding a project—formally titled "Production of Macronutrients from Thermally Oxo-Degraded Wastes"—and asking researchers how to do this.
"When I first heard about this challenge, I thought it was absolutely absurd," says Dr. Robert Brown, director of the Bioeconomy Institute at Iowa State University and the project's principal investigator, who is working with other research partners at the University of Delaware, Sandia National Laboratories, and the American Institute of Chemical Engineers (AIChE)/RAPID Institute.
But then Brown realized plastics will slowly start oxidizing—taking in oxygen—and microorganisms can then consume them. The oxidation process at room temperature is extremely slow, however, which makes plastics essentially non-biodegradable, Brown says.
That changes when heat is applied at brick pizza oven-like temperatures of around 900 degrees Fahrenheit. The high temperatures cause compounds to oxidize rapidly. Plastics are synthetic polymers made from petroleum—large molecules formed by linking many smaller molecules together in a chain. Heated, these polymers melt and crack into smaller molecules, which vaporize in a process called devolatilization. Air is then used to oxidize the plastics and produce oxygenated compounds—fatty acids and alcohols—that microorganisms eat and grow into single-cell proteins, which can be used as an ingredient or substitute in protein-rich foods.
"The caveat is the microorganisms must be food-safe, something that we can consume," Brown says. "Like supplemental or nutritional yeast, like we use to brew beer and to make bread or is used in Australia to make Vegemite."
What do the microorganisms look like? For any home beer brewers, it's the "gunky looking stuff you'd find at the bottom after the fermentation process," Brown says. "That's cellular biomass. Like corn grown in the field, yeast or other microorganisms like bacteria can be harvested as macro-nutrients."
Brown says DARPA's ReSource program has challenged all the project researchers to find ways for microorganisms to consume any plastics found in the waste stream coming out of a military expeditionary force, including all the packaging of food and supplies. Then the researchers aim to remake the plastic waste into products soldiers can use, including food. The project is in the first of three phases.
"We are talking about polyethylene, polypropylene, like PET plastics used in water bottles and converting that into macronutrients that are food," says Brown.
Renewed Focus on Climate Change
The Union of Concerned Scientists says carbon dioxide levels are higher today than at any point in at least 800,000 years.
"Climate science is so important for all of humankind. It is critical because the quality of life of humans on the planet depends on it."
Look for technology to help locate large-scale emitters of carbon dioxide, including sensors on satellites and artificial intelligence to optimize energy usage, especially in data centers.
Other technologies focus on alleviating the root cause of climate change: emissions of heat-trapping gasses that mainly come from burning fossil fuels.
Direct air carbon capture, an emerging effort to capture carbon dioxide directly from ambient air, could play a role.
The technology is in the early stages of development and still highly uncertain, says Peter Frumhoff, director of science and policy at Union of Concerned Scientists. "There are a lot of questions about how to do that at sufficiently low costs...and how to scale it up so you can get carbon dioxide stored in the right way," he says, and it can be very energy intensive.
One of the oldest solutions is planting new forests, or restoring old ones, which absorb carbon dioxide and release oxygen through photosynthesis. Hence the Trillion Trees Initiative launched by the World Economic Forum. But trees are only part of the solution; planting them isn't enough on its own, Frumhoff says. That's especially true since 2020 marked the first year that human-made mass outweighed all living biomass on Earth.
More research is also going into artificial photosynthesis for solar fuels. The U.S. Department of Energy awarded $100 million in 2020 to two entities that are conducting research. Look also for improvements in battery storage capacity to help electric vehicles, as well as back-up power sources for solar and wind power, Frumhoff says.
Another method to combat climate change is solar geoengineering, also called solar radiation management, which reflects sunlight back to space. The idea stems from the 1991 eruption of Mount Pinatubo, which released a tremendous amount of sulfate aerosol particles into the stratosphere, reflecting sunlight away from Earth. The planet cooled by a half degree for nearly a year, Frumhoff says. However, he acknowledges, "there's a lot of things we don't know about the potential impacts and risks" involved in this controversial approach.
Whatever the approach, scientific solutions to climate change are attracting renewed attention. Under President Trump, the White House Office of Science and Technology Policy didn't have an acting director for almost two years. Expect that to change when President-elect Joe Biden takes office.
"Climate science is so important for all of humankind," Dulac says. "It is critical because the quality of life of humans on the planet depends on it."
Creamy milk with velvety texture. Dark with sprinkles of sea salt. Crunchy hazelnut-studded chunks. Chocolate is a treat that appeals to billions of people worldwide, no matter the age. And it’s not only the taste, but the feel of a chocolate morsel slowly melting in our mouths—the smoothness and slipperiness—that’s part of the overwhelming satisfaction. Why is it so enjoyable?
That’s what an interdisciplinary research team of chocolate lovers from the University of Leeds School of Food Science and Nutrition and School of Mechanical Engineering in the U.K. resolved to study in 2021. They wanted to know, “What is making chocolate that desirable?” says Siavash Soltanahmadi, one of the lead authors of a new study about chocolate’s hedonistic qualities.
Besides addressing the researchers’ general curiosity, their answers might help chocolate manufacturers make the delicacy even more enjoyable and potentially healthier. After all, chocolate is a billion-dollar industry. Revenue from chocolate sales, whether milk or dark, is forecast to grow 13 percent by 2027 in the U.K. In the U.S., chocolate and candy sales increased by 11 percent from 2020 to 2021, on track to reach $44.9 billion by 2026. Figuring out how chocolate affects the human palate could up the ante even more.
Building a 3D tongue
The team began by building a 3D tongue to analyze the physical process by which chocolate breaks down inside the mouth.
As part of the effort, reported earlier this year in the scientific journal ACS Applied Materials and Interfaces, the team studied a large variety of human tongues with the intention to build an “average” 3D model, says Soltanahmadi, a lubrication scientist. When it comes to edible substances, lubrication science looks at how food feels in the mouth and can help design foods that taste better and have more satisfying texture or health benefits.
There are variations in how people enjoy chocolate; some chew it while others “lick it” inside their mouths.
Tongue impressions from human participants studied using optical imaging helped the team build a tongue with key characteristics. “Our tongue is not a smooth muscle, it’s got some texture, it has got some roughness,” Soltanahmadi says. From those images, the team came up with a digital design of an average tongue and, using 3D printed molds, built a “mimic tongue.” They also added elastomers—such as silicone or polyurethane—to mimic the roughness, texture and mechanical properties of a real tongue. “Wettability” was another key component of the 3D tongue, Soltanahmadi says, referring to whether a surface attracts water (hydrophilic) or, like an oily surface, repels it (hydrophobic).
Notably, the resulting artificial 3D-tongues looked nothing like the human version, but they were good mimics. The scientists also created “testing kits” that produced data on various physical parameters. One such parameter was viscosity, the measure of how gooey a food or liquid is — honey is more viscous than water, for example. Another was tribology, which describes how slippery something is — high-fat yogurt is more slippery than low-fat yogurt; milk can be more slippery than water. The researchers then mixed chocolate with artificial saliva and spread it on the 3D tongue to measure the tribology and the viscosity. From there they were able to study what happens inside the mouth when we eat chocolate.
The team focused on the stages of lubrication and the location of the fat in the chocolate, a process that has rarely been researched.
The artificial 3D-tongues look nothing like human tongues, but they function well enough to do the job.
Courtesy Anwesha Sarkar and University of Leeds
The oral processing of chocolate
We process food in our mouths in several stages, Soltanahmadi says. And there is variation in these stages depending on the type of food. So, the oral processing of a piece of meat would be different from, say, the processing of jelly or popcorn.
There are variations with chocolate, in particular; some people chew it while others use their tongues to explore it (within their mouths), Soltanahmadi explains. “Usually, from a consumer perspective, what we find is that if you have a luxury kind of a chocolate, then people tend to start with licking the chocolate rather than chewing it.” The researchers used a luxury brand of dark chocolate and focused on the process of licking rather than chewing.
As solid cocoa particles and fat are released, the emulsion envelops the tongue and coats the palate, creating a smooth feeling of chocolate all over the mouth. That tactile sensation is part of the chocolate’s hedonistic appeal we crave.
Understanding the make-up of the chocolate was also an important step in the study. “Chocolate is a composite material. So, it has cocoa butter, which is oil, it has some particles in it, which is cocoa solid, and it has sugars," Soltanahmadi says. "Dark chocolate has less oil, for example, and less sugar in it, most of the time."
The researchers determined that the oral processing of chocolate begins as soon as it enters a person’s mouth; it starts melting upon exposure to one’s body temperature, even before the tongue starts moving, Soltanahmadi says. Then, lubrication begins. “[Saliva] mixes with the oily chocolate and it makes an emulsion.” An emulsion is a fluid with a watery (or aqueous) phase and an oily phase. As chocolate breaks down in the mouth, that solid piece turns into a smooth emulsion with a fatty film. “The oil from the chocolate becomes droplets in a continuous aqueous phase,” says Soltanahmadi. In other words, as solid cocoa particles and fat are released, the emulsion envelops the tongue and coats the palate, creating a smooth feeling of chocolate all over the mouth. That tactile sensation is part of the chocolate’s hedonistic appeal we crave, says Soltanahmadi.
Finding the sweet spot
After determining how chocolate is orally processed, the research team wanted to find the exact sweet spot of the breakdown of solid cocoa particles and fat as they are released into the mouth. They determined that the epicurean pleasure comes only from the chocolate's outer layer of fat; the secondary fatty layers inside the chocolate don’t add to the sensation. It was this final discovery that helped the team determine that it might be possible to produce healthier chocolate that would contain less oil, says Soltanahmadi. And therefore, less fat.
Rongjia Tao, a physicist at Temple University in Philadelphia, thinks the Leeds study and the concept behind it is “very interesting.” Tao himself did a study in 2016 and found he could reduce fat in milk chocolate by 20 percent. He believes that the Leeds researchers’ discovery about the first layer of fat being more important for taste than the other layers can inform future chocolate manufacturing. “As a scientist I consider this significant and an important starting point,” he says.
Chocolate is rich in polyphenols, naturally occurring compounds also found in fruits and vegetables, such as grapes, apples and berries. Research has found that plant polyphenols can protect against cancer, diabetes and osteoporosis as well as cardiovascular and neurodegenerative diseases.
Not everyone thinks it’s a good idea. Chef Michael Antonorsi, founder and owner of Chuao Chocolatier, one of the leading chocolate makers in the U.S., has his doubts. First, he says, “cacao fat is definitely a good fat.” Second, he’s not thrilled that science is trying to interfere with nature. “Every time we've tried to intervene and change nature, we get things out of balance,” says Antonorsi. “There’s a reason cacao is botanically known as food of the gods. The botanical name is Theobroma cacao: in ancient Greek, Theo is God and Brahma is food. So it's a food of the gods,” Antonorsi explains. He’s doubtful that a chocolate made only with a top layer of fat will produce the same epicurean satisfaction. “You're not going to achieve the same sensation because that surface fat is going to dissipate and there is no fat from behind coming to take over,” he says.
Without layers of fat, Antonorsi fears the deeply satisfying experiential part of savoring chocolate will be lost. The University of Leeds team, however, thinks that it may be possible to make chocolate healthier, when consumed in limited amounts, without sacrificing its taste. They believe the concept of less fatty but no less slick chocolate will resonate with at least some chocolate-makers and consumers, too.
Chocolate already contains some healthful compounds. Its cocoa particles have “loads of health benefits,” says Soltanahmadi. Dark chocolate usually has more cocoa than milk chocolate. Some experts recommend that dark chocolate should contain at least 70 percent cocoa in order for it to offer some health benefit. Research has shown that the cocoa in chocolate is rich in polyphenols, naturally occurring compounds also found in fruits and vegetables, such as grapes, apples and berries. Research has shown that consuming plant polyphenols can be protective against cancer, diabetes and osteoporosis as well as cardiovascular and neurodegenerative diseases.
“So keeping the healthy part of it and reducing the oily part of it, which is not healthy, but is giving you that indulgence of it … that was the final aim,” Soltanahmadi says. He adds that the team has been approached by individuals in the chocolate industry about their research. “Everyone wants to have a healthy chocolate, which at the same time tastes brilliant and gives you that self-indulging experience.”
In 1945, almost two decades after Alexander Fleming discovered penicillin, he warned that as antibiotic use grows, the drugs may lose their efficacy. He was prescient—the first case of penicillin resistance was reported two years later. Back then, not many people paid attention to Fleming’s warning. After all, the “golden era” of the antibiotics age had just begun. By the 1950s, three new antibiotics derived from soil bacteria — streptomycin, chloramphenicol, and tetracycline — could cure infectious diseases like tuberculosis, cholera, meningitis and typhoid fever, among others.
Today, these antibiotics and many of their successors developed through the 1980s are gradually losing their effectiveness. The extensive overuse and misuse of antibiotics led to the rise of drug resistance. The livestock sector buys around 80 percent of all antibiotics sold in the U.S. every year. Farmers feed cows and chickens low doses of antibiotics to prevent infections and fatten up the animals, which eventually causes resistant bacterial strains to evolve. If manure from cattle is used on fields, the soil and vegetables can get contaminated with antibiotic-resistant bacteria. Another major factor is doctors overprescribing antibiotics to humans, particularly in low-income countries. Between 2000 and 2018, the global rate of human antibiotic consumption shot up by 46 percent.
In recent years, researchers have been exploring a promising avenue: the use of synthetic biology to engineer new bacteria that may work better than antibiotics. The need continues to grow, as a Lancet study linked antibiotic resistance to over 1.27 million deaths worldwide in 2019, surpassing HIV/AIDS and malaria. The western sub-Saharan Africa region had the highest death rate (27.3 people per 100,000).
Researchers warn that if nothing changes, by 2050, antibiotic resistance could kill 10 million people annually.
To make it worse, our remedy pipelines are drying up. Out of the 18 biggest pharmaceutical companies, 15 abandoned antibiotic development by 2013. According to the AMR Action Fund, venture capital has remained indifferent towards biotech start-ups developing new antibiotics. In 2019, at least two antibiotic start-ups filed for bankruptcy. As of December 2020, there were 43 new antibiotics in clinical development. But because they are based on previously known molecules, scientists say they are inadequate for treating multidrug-resistant bacteria. Researchers warn that if nothing changes, by 2050, antibiotic resistance could kill 10 million people annually.
The rise of synthetic biology
To circumvent this dire future, scientists have been working on alternative solutions using synthetic biology tools, meaning genetically modifying good bacteria to fight the bad ones.
Since life evolved on Earth around 3.8 billion years ago, bacteria have engaged in biological warfare, constantly devising new methods to combat each other by synthesizing toxic proteins that kill the competition.
For example, Escherichia coli produces bacteriocins, toxins that kill other strains of E. coli attempting to colonize the same habitat. Microbes like E. coli (not all of which are pathogenic) are also naturally present in the human microbiome, which harbors up to 100 trillion symbiotic microbial cells. The majority are beneficial organisms residing in the gut in different compositions.
The chemicals that these “good bacteria” produce do not pose any health risks to us, but can be toxic to other bacteria, particularly to human pathogens. For the last three decades, scientists have been manipulating bacteria’s biological warfare tactics to our collective advantage.
In the late 1990s, researchers drew inspiration from electrical and computing engineering principles that involve constructing digital circuits to control devices. In certain ways, every cell in living organisms works like a tiny computer. The cell receives messages in the form of biochemical molecules that cling on to its surface. Those messages get processed within the cells through a series of complex molecular interactions.
Synthetic biologists can harness these living cells’ information processing skills and use them to construct genetic circuits that perform specific instructions—for example, secrete a toxin that kills pathogenic bacteria. “Any synthetic genetic circuit is merely a piece of information that hangs around in the bacteria’s cytoplasm,” explains José Rubén Morones-Ramírez, a professor at the Autonomous University of Nuevo León, Mexico. Then the ribosome, which synthesizes proteins in the cell, processes that new information, making the compounds scientists want bacteria to make. “The genetic circuit remains separated from the living cell’s DNA,” Morones-Ramírez explains. When the engineered bacteria replicates, the genetic circuit doesn’t become part of its genome.
Highly intelligent by bacterial standards, some multidrug resistant V. cholerae strains can also “collaborate” with other intestinal bacterial species to gain advantage and take hold of the gut.
In 2000, Boston-based researchers constructed an E. coli with a genetic toggle switch that could flip genes on and off. Later, they built safety checks into their bacteria. “To prevent unintentional or deleterious consequences, in 2009, we built a safety switch in the engineered bacteria’s genetic circuit that gets triggered after it gets exposed to a pathogen," says James Collins, a professor of biological engineering at MIT and faculty member at Harvard University’s Wyss Institute. “After getting rid of the pathogen, the engineered bacteria is designed to switch off and leave the patient's body.”
Overuse and misuse of antibiotics cause resistant strains to evolve. (Adobe Stock)
Seek and destroy
As the field of synthetic biology developed, scientists began using engineered bacteria to tackle superbugs. They first focused on Vibrio cholerae, which caused cholera pandemics across India, China, the Middle East, Europe, and the Americas in the 19th and 20th centuries. Like many other bacteria, V. cholerae communicate with each other via quorum sensing, a process in which microorganisms release different signaling molecules to convey messages to their brethren. Highly intelligent by bacterial standards, some multidrug-resistant V. cholerae strains can also “collaborate” with other intestinal bacterial species to gain advantage and take hold of the gut. When untreated, cholera has a mortality rate of 25 to 50 percent, and outbreaks frequently occur in developing countries, especially during floods and droughts.
Sometimes, however, V. cholerae makes mistakes. In 2008, researchers at Cornell University observed that when quorum sensing V. cholerae accidentally released high concentrations of a signaling molecule called CAI-1, it had a counterproductive effect—the pathogen couldn’t colonize the gut.
So the group, led by John March, professor of biological and environmental engineering, developed a novel strategy to combat V. cholerae. They genetically engineered E.coli to eavesdrop on V. cholerae communication networks and equipped it with the ability to release CAI-1 molecules. That interfered with V. cholerae’s ability to colonize the gut. Two years later, the Cornell team showed that V. cholerae-infected mice treated with the engineered E.coli had a 92 percent survival rate.
These findings inspired researchers to sic the good bacteria present in foods like yogurt and kimchi onto the drug-resistant ones.
Three years later, in 2011, Singapore-based scientists engineered E.coli to detect and destroy Pseudomonas aeruginosa, an often drug-resistant pathogen that causes pneumonia, urinary tract infections, and sepsis. Once the genetically engineered E.coli found its target through the pathogen’s quorum-sensing molecules, it released a peptide that could eradicate 99 percent of P. aeruginosa cells in a test-tube experiment. The team outlined their work in a Molecular Systems Biology study.
“At the time, we knew that we were entering new, uncharted territory,” says Matthew Chang, an associate professor and synthetic biologist at the National University of Singapore and lead author of the study. “To date, we are still in the process of trying to understand how long these microbes stay in our bodies and how they might continue to evolve.”
More teams followed the same path. In a 2013 study, MIT researchers also genetically engineered E.coli to detect P. aeruginosa via the pathogen’s quorum-sensing molecules. It then destroyed the pathogen by secreting a lab-made toxin.
Probiotics that fight
A year later, in 2014, a Nature study found that an abundance of Ruminococcus obeum, a probiotic bacteria naturally occurring in the human microbiome, interrupts and reduces V. cholerae’s colonization by detecting the pathogen’s quorum-sensing molecules. The natural accumulation of R. obeum in Bangladeshi adults helped them recover from cholera despite living in an area with frequent outbreaks.
The findings from 2008 to 2014 inspired Collins and his team to delve into how the good bacteria present in foods like yogurt and kimchi can attack drug-resistant bacteria. In 2018, they developed an engineered probiotic strategy, tweaking Lactococcus lactis, a bacteria commonly found in yogurt, to treat cholera.
Engineered bacteria can be trained to target pathogens when they are at their most vulnerable metabolic stage in the human gut. —José Rubén Morones-Ramírez
More scientists followed with more experiments. So far, researchers have engineered various probiotic organisms to fight pathogenic bacteria like Staphylococcus aureus (a leading cause of skin, tissue, bone, joint, and blood infections) and Clostridium perfringens (which causes watery diarrhea) in test-tube and animal experiments. In 2020, Russian scientists engineered a probiotic yeast called Pichia pastoris to produce an enzyme called lysostaphin that eradicated S. aureus in vitro. Another 2020 study, from China, used an engineered probiotic bacteria, Lactobacillus casei, as a vaccine to prevent C. perfringens infection in rabbits.
In a study last year, Morones-Ramírez’s group at the Autonomous University of Nuevo León engineered E. coli to detect quorum-sensing molecules from methicillin-resistant Staphylococcus aureus, or MRSA, a notorious superbug. The E. coli then releases a bacteriocin that kills MRSA. “An antibiotic is just a molecule that is not intelligent,” says Morones-Ramírez. “On the other hand, engineered bacteria can be trained to target pathogens when they are at their most vulnerable metabolic stage in the human gut.”
Collins and Timothy Lu, an associate professor of biological engineering at MIT, found that engineered E. coli can also help treat other conditions, such as phenylketonuria, a rare metabolic disorder that causes a build-up of the amino acid phenylalanine. Their start-up, Synlogic, aims to commercialize the technology and has completed a phase 2 clinical trial.
Circumventing the challenges
The bacteria-engineering technique is not without pitfalls. One major challenge is that beneficial gut bacteria produce their own quorum-sensing molecules that can be similar to those that pathogens secrete. If an engineered bacteria’s biosensor is not specific enough, it will be ineffective.
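One way to see the specificity problem: a biosensor's output is often modeled as a Hill function of signal concentration. In this illustrative Python sketch (the numbers are hypothetical, not drawn from any of the studies above), an off-target molecule that binds the receptor only weakly still produces some response, and a poorly tuned sensor would misfire:

```python
# Illustrative sketch of biosensor specificity using a Hill function.
# All concentrations and binding constants here are hypothetical.

def hill_response(conc, K, n=2.0):
    """Fraction of maximal sensor output at signal concentration `conc`.
    K is the concentration giving half-maximal response; a larger K means
    the molecule binds the sensor more weakly."""
    return conc**n / (K**n + conc**n)

# A reasonably specific sensor: strong response to the pathogen's molecule,
# weak response to a chemically similar molecule from commensal gut bacteria.
pathogen_signal = hill_response(conc=10.0, K=5.0)    # 0.8 of maximal output
off_target      = hill_response(conc=10.0, K=50.0)   # ~0.04 of maximal output
```

A sensor is "specific enough" when the off-target response stays well below whatever output threshold triggers the kill payload; if the two binding constants are too close, commensal signals cross that threshold and the circuit fires at the wrong time.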
Another concern is whether engineered bacteria might mutate after entering the gut. “As with any technology, there are risks where bad actors could have the capability to engineer a microbe to act quite nastily,” says Collins. But Collins and Morones-Ramírez both insist that the chances of the engineered bacteria mutating on their own are virtually non-existent. “It is extremely unlikely for the engineered bacteria to mutate,” Morones-Ramírez says. “Coaxing a living cell to do anything on command is immensely challenging. Usually, the greater risk is that the engineered bacteria entirely lose its functionality.”
However, the biggest challenge is bringing the curative bacteria to consumers. Pharmaceutical companies aren’t interested in antibiotics or their alternatives because they are less profitable than new medicines for non-infectious diseases. Unlike chronic conditions like diabetes or cancer, which require long-term medications, infectious diseases are usually treated much more quickly. Running clinical trials is expensive, and antibiotic alternatives aren’t lucrative enough.
“Unfortunately, new medications for antibiotic resistant infections have been pushed to the bottom of the field,” says Lu of MIT. “It's not because the technology does not work. This is more of a market issue. Because clinical trials cost hundreds of millions of dollars, the only solution is that governments will need to fund them.” Lu stresses that societies must lobby to change how the modern healthcare industry works. “The whole world needs better treatments for antibiotic resistance.”