COVID Variants Are Like “a Thief Changing Clothes” – and Our Camera System Barely Exists
Whether it's "natural selection," as Darwin called it, or "mutating," as the X-Men called it, living organisms change over time, developing thumbs or more efficient protein spikes, depending on the organism and the demands of its environment. The coronavirus that causes COVID-19, SARS-CoV-2, is no exception, and now, after the virus has infected millions of people around the globe for more than a year, scientists are beginning to see those changes.
The notorious variants that have popped up include B.1.1.7, sometimes called the UK variant, as well as P.1 and B.1.351, which seem to have emerged in Brazil and South Africa, respectively. As vaccinations pick up pace, officials are warning that now is not the time to become complacent or relax restrictions, because the variants aren't well understood.
Some appear to be more transmissible, and deadlier, while others can evade the immune system's defenses better than earlier versions of the virus, potentially undermining the effectiveness of vaccines. Genomic surveillance, the process of sequencing the genetic code of the virus widely to observe changes and patterns, is a critical way that scientists can keep track of its evolution and work to understand how the variants might affect humans.
"It's like a thief changing clothes"
It's important to note that viruses mutate all the time. If there were funding and personnel to sequence the genome of every sample of the virus, scientists would see thousands of mutations. Not every variant deserves our attention. The vast majority of mutations are not important at all, but recognizing those that are is a crucial tool in getting and staying ahead of the virus. The work of sequencing, analyzing, observing patterns, and using public health tools as necessary is complicated and confusing to those without years of specialized training.
Jeremy Kamil, associate professor of microbiology and immunology at LSU Health Shreveport, in Louisiana, says that the variants developing are like a thief changing clothes. The thief goes in your house, steals your stuff, then leaves and puts on a different shirt and a wig, in the hopes you won't recognize them. Genomic surveillance catches the "thief" even in those different clothes.
Understanding variants, both the uninteresting ones and the potentially concerning ones, gives public health officials and researchers at different levels a useful set of tools. Locally, knowing which variants are circulating in the community helps leaders know whether mask mandates and similar measures should be implemented or discontinued, or whether businesses and schools can open relatively safely.
There's more to it than observing new variants
Analysis is complex, particularly when it comes to understanding which variants are of concern. "So the question is always if a mutation becomes common, is that a random occurrence?" says Phoebe Lostroh, associate professor of molecular biology at Colorado College. "Or is the variant the result of some kind of selection because the mutation changes some property about the virus that makes it reproduce more quickly than variants of the virus that don't have that mutation? For a virus, [mutations can affect outcomes like] how much it replicates inside a person's body, how much somebody breathes it out, whether the particles that somebody might breathe in get smaller and can lead to greater transmission."
Along with all of those factors, accurate and useful genomic surveillance requires an understanding of where variants are occurring, how they are related, and an examination of why they might be prevalent.
For example, if a potentially worrisome variant appears in a community and begins to spread very quickly, it's not time to raise a public health alarm until several important questions have been answered, such as whether the variant is spreading due to specific events, or if it's happening because the mutation has allowed the virus to infect people more efficiently. Kamil offered a hypothetical scenario to explain: Imagine that a member of a community became infected and the virus mutated. That person went to church and three more people were infected, but one of them went to a karaoke bar and while singing infected 100 other people. Examining the conditions under which the virus has spread is, therefore, an essential part of untangling whether a mutation itself made the virus more transmissible or if an infected person's behaviors contributed to a local outbreak.
One of the tricky things about variants is recognizing the point at which they move from interesting, to concerning at a local level, to dangerous in a larger context. Genomic sequencing can help with that, but only when it's coordinated. When the same mutation occurs frequently, but is localized to one region, it's a concern, but when the same mutation happens in different places at the same time, it's much more likely that the "virus is learning that's a good mutation," explains Kamil.
The process is called convergent evolution, and it was a fascinating topic long before COVID. Just as your heritage can be traced through DNA, so can a virus's, and when separate lineages develop similar traits, it's almost as if scientists can watch evolution happening in real time. A mutation to SARS-CoV-2 that arises independently in more than one place is likely one that makes it easier in some way for the virus to survive, and that is when it may become alarming. The widespread, documented variants P.1 and B.1.351 are examples of convergence because they share some of the same concerning mutations despite having developed thousands of miles apart.
However, even variants that are emerging in different places at the same time don't present the kind of threat SARS-CoV-2 did in 2019. "This is nature," says Kamil. "It just means that this virus will not easily be driven to extinction or complete elimination by vaccines." Although a person who has already had COVID-19 can be reinfected with a variant, "it is almost always much milder disease" than the original infection, Kamil adds. Rather than causing full-fledged disease, variants have the potential to "penetrate herd immunity, spreading relatively quietly among people who have developed natural immunity or been vaccinated, until the virus finds someone who has no immunity yet, and that person would be at risk of hospitalization-grade severe disease or death."
Surveillance and predictions
According to Lostroh, genomic surveillance can help scientists predict what's going to happen. "With the British strain, for instance, that's more transmissible, you can measure how fast it's doubling in the population and you can sort of tell whether we should take more measures against this mutation. Should we shut things down a little longer because that mutation is present in the population? That could be really useful if you did enough sampling in the population that you knew where it was," says Lostroh. If, for example, the more transmissible strain were present in 50 percent of cases in one area but barely present in another county or state, that knowledge would allow for rolling lockdowns instead of sweeping measures.
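Lostroh's point about measuring how fast a variant is doubling can be made concrete with simple math. The sketch below is a hypothetical illustration, not part of any real surveillance pipeline: it assumes you know a variant's share of sequenced cases at two time points, treats growth as exponential in between, and derives a doubling time from that.

```python
import math

def doubling_time_days(share_t0: float, share_t1: float, days_between: float) -> float:
    """Estimate a variant's doubling time from its share of sequenced cases
    at two time points, assuming simple exponential growth in between."""
    growth_rate = math.log(share_t1 / share_t0) / days_between  # per day
    return math.log(2) / growth_rate

# Hypothetical numbers: a variant rises from 5% to 20% of sequenced cases over 28 days.
print(round(doubling_time_days(0.05, 0.20, 28), 1), "days to double")  # ~14.0
```

Real-world estimates are messier: a variant's share of sequences is only a proxy for its case counts, and, as Lostroh notes, the sampling has to be broad and representative enough for the numbers to mean anything.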
Variants are also extremely important when it comes to the development, manufacture, and distribution of vaccines. "You're also looking at medical countermeasures, such as whether your vaccine is still effective, or if your antiviral needs to be updated," says Lane Warmbrod, a senior analyst and research associate at Johns Hopkins Center for Health Security.
Properly funded and extensive genomic surveillance could eventually help control endemic diseases, too, like the seasonal flu, or other common respiratory infections. Kamil says he envisions a future in which genomic surveillance allows for prediction of sickness just as the weather is predicted today. "It's a 51 for infection today at the San Francisco Airport. There's been detection of some respiratory viruses," he says, offering an example. He says that if you're a vulnerable person, if you're immune-suppressed for some reason, you may want to wear a mask based on the sickness report.
The U.S. has the ability, but lacks standards
The benefits of widespread genomic surveillance are clear, and the United States certainly has the necessary technology, equipment, and personnel to carry it out. But it's not happening at the speed and scale needed for the country to reap those benefits.
"The numbers are improving," said Kamil. "We're probably still at less than half a percent of all the samples that have been taken have been sequenced since the beginning of the pandemic."
Although there's no consensus on how many sequences are needed for a robust surveillance program, modeling performed by the company Illumina suggests about 5 percent of positive tests should be sequenced. The reasons the U.S. has lagged in implementing a sequencing program are complex and varied, but solvable.
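To put the 5 percent figure in perspective, it translates directly into lab throughput. The snippet below is a back-of-the-envelope sketch with made-up case counts, not an official target calculator:

```python
def weekly_sequencing_target(positive_tests_per_week: int, target_fraction: float = 0.05) -> int:
    """Genomes to sequence per week to cover a target fraction of positive tests."""
    return round(positive_tests_per_week * target_fraction)

# Hypothetical example: a state logging 10,000 positive tests in a week would need
# to sequence about 500 of them to hit the 5 percent target, versus roughly 50
# at the half-percent level the U.S. has hovered near.
print(weekly_sequencing_target(10_000))         # 500
print(weekly_sequencing_target(10_000, 0.005))  # 50
```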
Perhaps the most important element that is currently missing is leadership. In order to conduct an effective genomic surveillance program, there need to be standards. The Johns Hopkins Center for Health Security recently published a paper with recommendations as to what kinds of elements need to be standardized in order to make the best use of sequencing technology and analysis.
"Along with which bioinformatic pipelines you're going to use to do the analyses, which sequencing strategy protocol are you going to use, what's your sampling strategy going to be, how is the data is going to be reported, what data gets reported," says Warmbrod. Currently, there's no guidance from the CDC on any of those things. So, while scientists can collect and report information, they may be collecting and reporting different information that isn't comparable, making it less useful for public health measures and vaccine updates.
Globally, one of the most important tools in making the information from genomic surveillance useful is GISAID, a platform designed for scientists to share -- and, importantly, to be credited for -- their data regarding genetic sequences of influenza. Originally, it was launched as a database of bird flu sequences, but has evolved to become an essential tool used by the WHO to make flu vaccine virus recommendations each year. Scientists who share their credentials have free access to the database, and anyone who uses information from the database must credit the scientist who uploaded that information.
Safety, logistics, and funding matter
Scientists at university labs and other small organizations have been uploading sequences to GISAID almost from the beginning of the pandemic, but their funding is generally limited, and there are no standards regarding information collection or reporting. Private, for-profit labs haven't had motivation to set up sequencing programs, although many of them have the logistical capabilities and funding to do so. Public health departments are understaffed, underfunded, and overwhelmed.
University labs may also be limited by safety concerns. The SARS-CoV-2 virus is dangerous, and there's a question of how samples should be transported to labs for sequencing.
Larger, for-profit organizations often have the tools and distribution capabilities to safely collect and sequence samples, but there hasn't been a profit motive. Genomic sequencing is less expensive now than ever before, but even at $100 per sample, the cost adds up -- not to mention the cost of employing a scientist with the proper credentials to analyze the sequence.
The path forward
The recently passed COVID-19 relief bill does have some funding to address genomic sequencing. Specifically, the American Rescue Plan Act includes $1.75 billion in funding for the Centers for Disease Control and Prevention's Advanced Molecular Detection (AMD) program. In an interview last month, CDC Director Rochelle Walensky said that the additional funding will be "a dial. And we're going to need to dial it up." AMD has already announced a collaboration called the Sequencing for Public Health Emergency Response, Epidemiology, and Surveillance (SPHERES) Initiative that will bring together scientists from public health, academic, clinical, and non-profit laboratories across the country with the goal of accelerating sequencing.
Such a collaboration is a step toward following the recommendations in the paper Warmbrod coauthored. Building capacity now, creating a network of labs, and standardizing procedures will mean improved health in the future. "I want to be optimistic," she says. "The good news is there are a lot of passionate, smart, capable people who are continuing to work with government and work with different stakeholders." She cautions, however, that without a national strategy we won't succeed.
"If we maximize the potential and create that framework now, we can also use it for endemic diseases," she says. "It's a very helpful system for more than COVID if we're smart in how we plan it."
Scientists implant brain cells to counter Parkinson's disease
Martin Taylor was only 32 when he was diagnosed with Parkinson's, a disease that causes tremors, stiff muscles and slow physical movement - symptoms that steadily get worse as time goes on.
“It's horrible having Parkinson's,” says Taylor, a data analyst, now 41. “It limits my ability to be the dad and husband that I want to be in many cruel and debilitating ways.”
Today, more than 10 million people worldwide live with Parkinson's. Most are diagnosed when they're considerably older than Taylor, after age 60. Although recent research has called into question certain aspects of the disease’s origins, Parkinson’s eventually kills the nerve cells in the brain that produce dopamine, a signaling chemical that carries messages around the body to control movement. Many patients have lost 60 to 80 percent of these cells by the time they are diagnosed.
For years, there's been little improvement in the standard treatment. Patients are typically given the drug levodopa, a chemical that's absorbed by the brain’s nerve cells, or neurons, and converted into dopamine. This drug addresses the symptoms but has no impact on the course of the disease, as patients continue to lose dopamine-producing neurons. Eventually, the treatment stops working effectively.
BlueRock Therapeutics, a cell therapy company based in Massachusetts, is taking a different approach by focusing on the use of stem cells, which can divide and give rise to new specialized cells. The company makes the dopamine-producing cells that patients have lost and inserts these cells into patients' brains. “We have a disease with a high unmet need,” says Ahmed Enayetallah, the senior vice president and head of development at BlueRock. “We know [which] cells…are lost to the disease, and we can make them. So it really came together to use stem cells in Parkinson's.”
In phase 1 trial results announced late last month, patients reported that their symptoms had improved after a year of treatment. Brain scans also showed an increased number of neurons generating dopamine in patients’ brains.
Increases in dopamine signals
The recent phase 1 trial focused on deploying BlueRock’s cell therapy, called bemdaneprocel, to treat 12 patients suffering from Parkinson’s. The team developed the new nerve cells and implanted them into specific locations on each side of the patient's brain through two small holes in the skull made by a neurosurgeon. “We implant cells into the places in the brain where we think they have the potential to reform the neural networks that are lost to Parkinson's disease,” Enayetallah says. The goal is to restore motor function to patients over the long-term.
Five patients were given a relatively low dose of cells while seven got higher doses. Specialized brain scans showed evidence that the transplanted cells had survived, increasing the overall number of dopamine-producing cells. The team compared the baseline number of these cells before surgery to the levels one year later. “The scans tell us there is evidence of increased dopamine signals in the part of the brain affected by Parkinson's,” Enayetallah says. “Normally you’d expect the signal to go down in untreated Parkinson’s patients.”
"I think it has a real chance to reverse motor symptoms, essentially replacing a missing part," says Tilo Kunath, a professor of regenerative neurobiology at the University of Edinburgh.
The team also asked patients to use a specific type of home diary to log the times when symptoms were well controlled and when they prevented normal activity. After a year of treatment, patients taking the higher dose reported symptoms were under control for an average of 2.16 hours per day above their baselines. At the lower dose, the improvement was smaller: 0.72 hours per day. The higher-dose patients reported a corresponding decrease in the amount of time when symptoms were uncontrolled, by an average of 1.91 hours, compared to 0.75 hours for the lower-dose group. The treatment proved safe, and patients tolerated the year of immunosuppression needed to keep their bodies from rejecting the foreign cells.
Claire Bale, the associate director of research at Parkinson's U.K., sees the promise of BlueRock's approach, while noting the need for more research on a possible placebo effect. The trial participants knew they were getting the active treatment, and placebo effects are known to be a potential factor in Parkinson’s research. Even so, “The results indicate that this therapy produces improvements in symptoms for Parkinson's, which is very encouraging,” Bale says.
Tilo Kunath, a professor of regenerative neurobiology at the University of Edinburgh, also finds the results intriguing. “I think it's excellent,” he says. “I think it has a real chance to reverse motor symptoms, essentially replacing a missing part.” However, it could take time for this therapy to become widely available, Kunath says, and patients in the late stages of the disease may not benefit as much. “Data from cell transplantation with fetal tissue in the 1980s and 90s show that cells did not survive well and release dopamine in these [late-stage] patients.”
Searching for the right approach
There's a long history of using cell therapy as a treatment for Parkinson's. About four decades ago, scientists at the University of Lund in Sweden developed a method in which they transferred parts of fetal brain tissue to patients with Parkinson's so that their nerve cells would produce dopamine. Many benefited, and some were able to stop their medication. However, the use of fetal tissue was highly controversial at that time, and the tissues were difficult to obtain. Later trials in the U.S. showed that people benefited only if a significant amount of the tissue was used, and several patients experienced side effects. Eventually, the work lost momentum.
“Like many in the community, I'm aware of the long history of cell therapy,” says Taylor, the patient living with Parkinson's. “They've long had that cure over the horizon.”
In 2000, Lorenz Studer led a team at the Memorial Sloan Kettering Cancer Center in New York to find the chemical signals needed to get stem cells to differentiate into cells that release dopamine. Back then, the team managed to make cells that produced some dopamine, but they led to only limited improvements in animals. About a decade later, in 2011, Studer and his team found the specific signals needed to guide embryonic cells to become the right kind of dopamine-producing cells. Their experiments in mice, rats and monkeys showed that their implanted cells had a significant impact, restoring lost movement.
Studer then co-founded BlueRock Therapeutics in 2016. Forming the most effective stem cells has been one of the biggest challenges, says Enayetallah, the BlueRock VP. “It's taken a lot of effort and investment to manufacture and make the cells at the right scale under the right conditions.” The team is now using cells that were first isolated in 1998 at the University of Wisconsin, a major advantage because they’re available in a virtually unlimited supply.
Other efforts underway
In the past several years, University of Lund researchers have begun to collaborate with the University of Cambridge on a project to use embryonic stem cells, similar to BlueRock’s approach. They began clinical trials this year.
A company in Japan called Sumitomo is using a different strategy; instead of stem cells from embryos, they’re reprogramming adults' blood or skin cells into induced pluripotent stem cells - meaning they can turn into any cell type - and then directing them into dopamine-producing neurons. Although Sumitomo started clinical trials earlier than BlueRock, they haven’t yet revealed any results.
“It's a rapidly evolving field,” says Emma Lane, a pharmacologist at the University of Cardiff who researches clinical interventions for Parkinson’s. “But BlueRock’s trial is the first full phase 1 trial to report such positive findings with stem cell based therapies.” The company’s upcoming phase 2 research will be critical to show how effectively the therapy can improve disease symptoms, she added.
The cure over the horizon
BlueRock will continue to look at data from patients in the phase 1 trial to monitor the treatment’s effects over a two-year period. Meanwhile, the team is planning the phase 2 trial with more participants, including a placebo group.
For patients with Parkinson’s like Martin Taylor, the therapy offers some hope, though Taylor recognizes that more research is needed.
His expectations are somewhat guarded, he says, but, “it's certainly positive to see…movement in the field again.”
"If we can demonstrate what we’re seeing today in a more robust study, that would be great,” Enayetallah says. “At the end of the day, we want to address that unmet need in a field that's been waiting for a long time.”
Editor's note: The company featured in this piece, BlueRock Therapeutics, is a portfolio company of Leaps by Bayer, which is a sponsor of Leaps.org. BlueRock was acquired by Bayer Pharmaceuticals in 2019. Leaps by Bayer and other sponsors have never exerted influence over Leaps.org content or contributors.
Scientists experiment with burning iron as a fuel source
Story by Freethink
Try burning an iron metal ingot and you’ll have to wait a long time — but grind it into a powder and it will readily burst into flames. That’s how sparklers work: metal dust burning in a beautiful display of light and heat. But could we burn iron for more than fun? Could this simple material become a cheap, clean, carbon-free fuel?
In new experiments — conducted on rockets, in microgravity — Canadian and Dutch researchers are looking at ways of boosting the efficiency of burning iron, with a view to turning this abundant material — the fourth most common element in the Earth’s crust, about 5% of its mass — into an alternative energy source.
Iron as a fuel
Iron is abundantly available and cheap. More importantly, the byproduct of burning iron is rust (iron oxide), a solid material that is easy to collect and recycle. Neither burning iron nor converting its oxide back into iron releases any carbon in the process.
Iron has a high energy density: it requires almost the same volume as gasoline to produce the same amount of energy. However, iron has poor specific energy: it’s a lot heavier than gasoline for the same amount of energy. (Think of picking up a jug of gasoline, and then imagine trying to pick up a similarly sized chunk of iron.) That weight is prohibitive for many applications. Burning iron to run a car isn’t very practical if the iron fuel weighs as much as the car itself.
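The difference between energy density (energy per liter) and specific energy (energy per kilogram) is easiest to see with rough numbers. The sketch below uses approximate, illustrative figures (on the order of 7 MJ/kg for iron burned to iron oxide and 46 MJ/kg for gasoline, with densities near 7.9 kg/L and 0.75 kg/L), so treat the output as a ballpark comparison rather than engineering data:

```python
# Approximate, illustrative values only -- not precise engineering figures.
fuels = {
    #           (specific energy MJ/kg, density kg/L)
    "iron":     (7.0, 7.9),    # burned to iron oxide; solid-metal density, not loose powder
    "gasoline": (46.0, 0.75),
}

for name, (mj_per_kg, kg_per_l) in fuels.items():
    mj_per_l = mj_per_kg * kg_per_l  # volumetric energy density
    print(f"{name:9s} {mj_per_kg:5.1f} MJ/kg   {mj_per_l:5.1f} MJ/L")
```

Per liter, iron lands in the same ballpark as gasoline, but per kilogram it carries several times less energy, which is why it looks more attractive as stationary heat and storage than as a fuel you haul around in a vehicle.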
In its powdered form, however, iron offers more promise as a high-density energy carrier or storage system. Iron-burning furnaces could provide direct heat for industry, home heating, or to generate electricity.
Plus, iron oxide is potentially renewable by reacting with electricity or hydrogen to become iron again (as long as you’ve got a source of clean electricity or green hydrogen). When there’s excess electricity available from renewables like solar and wind, for example, rust could be converted back into iron powder, and then burned on demand to release that energy again.
However, these methods of recycling rust are currently very energy-intensive and inefficient, so improvements to the efficiency of burning iron itself may be crucial to making such a circular system viable.
The science of discrete burning
Powdered particles have a high surface area to volume ratio, which means it is easier to ignite them. This is true for metals as well.
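That surface-area argument can be made precise with a little geometry. The relation below is a standard back-of-the-envelope calculation assuming idealized spherical particles, not a result from the researchers' work:

```latex
% Surface-area-to-volume ratio of a sphere of radius r:
\[
  \frac{A}{V} \;=\; \frac{4\pi r^{2}}{\tfrac{4}{3}\pi r^{3}} \;=\; \frac{3}{r}
\]
% Grinding one lump of radius R into grains of radius r yields (R/r)^3 grains
% and multiplies the total exposed surface by R/r. A 1 cm lump ground into
% 10-micrometer powder exposes roughly 1,000 times more surface to oxygen,
% which is why the powder ignites readily while the ingot barely burns.
```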
Under the right circumstances, powdered iron can burn in a manner known as discrete burning. In its most ideal form, the flame completely consumes one particle before the heat radiating from it combusts other particles in its vicinity. By studying this process, researchers can better understand and model how iron combusts, allowing them to design better iron-burning furnaces.
Discrete burning is difficult to achieve on Earth. Perfect discrete burning requires a specific particle density and oxygen concentration. When the particles are too close and compacted, the fire jumps to neighboring particles before fully consuming a particle, resulting in a more chaotic and less controlled burn.
Presently, the rate at which powdered iron particles burn and the way they release heat under different conditions are poorly understood. This hinders the development of technologies to efficiently utilize iron as a large-scale fuel.
Burning metal in microgravity
In April, the European Space Agency (ESA) launched a suborbital “sounding” rocket, carrying three experimental setups. As the rocket traced its parabolic trajectory through the atmosphere, the experiments got a few minutes in free fall, simulating microgravity.
One of the experiments on this mission studied how iron powder burns in the absence of gravity.
In microgravity, particles float in a more uniformly distributed cloud. This allows researchers to model the flow of iron particles and how a flame propagates through a cloud of iron particles in different oxygen concentrations.
Insights into how flames propagate through iron powder under different conditions could help design much more efficient iron-burning furnaces.
Clean and carbon-free energy on Earth
Various businesses are looking at ways to incorporate iron fuels into their processes. In particular, it could serve as a cleaner way to supply industrial heat by burning iron to heat water.
For example, Dutch brewery Swinkels Family Brewers, in collaboration with the Eindhoven University of Technology, switched to iron fuel as the heat source to power its brewing process, accounting for 15 million glasses of beer annually. Dutch startup RIFT is running proof-of-concept iron fuel power plants in Helmond and Arnhem.
As researchers continue to improve the efficiency of burning iron, its applicability will extend to other use cases as well. But is the infrastructure in place for this transition?
Often, the transition to new energy sources is slowed by the need to create new infrastructure to utilize them. Fortunately, this isn’t the case with switching from fossil fuels to iron. Since the ideal temperature to burn iron is similar to that for hydrocarbons, existing fossil fuel power plants could potentially be retrofitted to run on iron fuel.
This article originally appeared on Freethink, home of the brightest minds and biggest ideas of all time.