New therapy may improve stem cell transplants for blood cancers
In 2018, Robyn was diagnosed with myelofibrosis, a blood cancer causing chronic inflammation and scarring. As a research scientist by training, she knew she had limited options. A stem cell transplant is a terminally ill patient's best chance for survival against blood cancers, including leukaemia. It works by destroying a patient's cancer cells and replacing them with healthy cells from a donor.
However, there is a huge risk of graft-versus-host disease (GVHD), which affects around 30-40% of recipients. Patients receive billions of cells in a stem cell transplant, but only a fraction are beneficial. The rest can attack healthy tissue, leading to GVHD, which affects the skin, gut and lungs and can be truly debilitating.
Currently, steroids are used to try to prevent GVHD, but they have many side effects and are effective in only 50% of cases. “I spoke with my doctors and reached out to patients managing GVHD,” says Robyn, who prefers not to use her last name for privacy reasons. “My concerns really escalated for what I might face post-transplant.”
Then she heard about a new highly precise cell therapy developed by a company called Orca Bio, which gives patients more beneficial cells and fewer cells that cause GVHD. She decided to take part in their phase 2 trial.
How It Works
In stem cell transplants, patients receive immune cells and stem cells. The donor immune cells, or T cells, attack and kill malignant cells; this is the graft vs leukaemia (GVL) effect. The stem cells generate new healthy cells.
Unfortunately, T cells can also cause GVHD, but a rare subset of T cells, called T regulatory cells, can actually prevent GVHD.
Orca’s cell sorting technology distinguishes T regulatory cells from stem cells and conventional T cells on a large scale. It’s this cell sorting technology which has enabled them to create their new cell therapy, called Orca T. It contains a precise combination of stem cells and immune cells with more T regulatory cells and fewer conventional T cells than in a typical stem cell transplant.
Over the past 40 years, scientists have been trying to create stem cell grafts that contain the beneficial cells whilst removing the cells that cause GVHD. What makes it even harder is that most transplant centers aren’t able to manipulate grafts to create a precise combination of cells.
Innovative Cell Sorting
Ivan Dimov, Jeroen Bekaert and Nate Fernhoff came up with the idea behind Orca as postdocs at Stanford, working with stem cell pioneer Irving Weissman. They recognised the need for a more effective cell sorting technology. In a small study at Stanford, Professor Robert Negrin had discovered a combination of T cells, T regulatory cells and stem cells which prevented GVHD but retained the beneficial graft vs leukaemia effect (GVL). However, manufacturing was problematic: conventional cell sorting is far too slow for this level of precision, and Negrin was only able to make seven highly precise products, for seven patients, in a year. Annual worldwide cases of blood cancer number over 1.2 million.
“We started Orca with this idea: how do we use manufacturing solutions to impact cell therapies,” co-founder Fernhoff reveals. In conventional cell sorting, cells move past a stationary laser which analyses each cell. But cells can only be moved so quickly. At a certain point they start to experience stress and break down. This makes it very difficult to sort the 100 billion cells from a donor in a stem cell transplant.
“Ivan Dimov’s idea was to spread out the cells, keep them stationary and then use laser scanning to sort the cells,” Fernhoff explains. “The beauty here is that lasers don't care how quickly you move them.” They developed this technology and called it Orca Sort. It enabled Orca to make up to six products per week in the first year of manufacturing.
Every product Orca makes is for a single patient, with the donor uniquely matched to the recipient, so the cell sorting procedure has to be carried out afresh each time. Everything also has to be done extremely quickly: fresh living cells go from the donor's vein to the patient's within 72 hours.
“We’ve treated almost 200 patients in all the Orca trials, and you can't do that if you don't fix the manufacturing process,” Fernhoff says. “We're working on what we think is an incredibly promising drug, but it's all been enabled by figuring out how to make a high precision cell therapy at scale.”
Clinical Trials
Orca revealed the results of their phase 1b and phase 2 trials at the end of last year. In their phase 2 trial, only 3% of the 29 patients treated with Orca T cell therapy developed chronic GVHD in the first year after treatment. By comparison, 43% of the 95 patients given a conventional stem cell transplant in a contemporary Stanford trial developed chronic GVHD. Across the 109 patients in the phase 1b and phase 2 trials, 74% of those given Orca T neither relapsed nor developed any form of GVHD, compared to 34% in the control trial.
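A rough sense of the uncertainty behind percentages from groups this small can be computed directly. The sketch below is illustrative only: the patient counts (roughly 1 of 29 and 41 of 95) are inferred from the reported percentages, and a textbook Wilson score interval stands in for whatever statistical analysis the trial reports actually used.

```python
from math import sqrt

def wilson_interval(successes, n, z=1.96):
    """95% Wilson score confidence interval for a proportion."""
    p = successes / n
    denom = 1 + z**2 / n
    center = (p + z**2 / (2 * n)) / denom
    half = z * sqrt(p * (1 - p) / n + z**2 / (4 * n**2)) / denom
    return center - half, center + half

# ~3% of 29 Orca T patients with chronic GVHD -> roughly 1 of 29
lo, hi = wilson_interval(1, 29)
print(f"Orca T chronic GVHD rate: 95% CI {lo:.1%} to {hi:.1%}")

# ~43% of 95 conventional-transplant patients -> roughly 41 of 95
lo2, hi2 = wilson_interval(41, 95)
print(f"Conventional transplant rate: 95% CI {lo2:.1%} to {hi2:.1%}")
```

Even with these small samples the two intervals do not overlap, which is why the results look promising; but, as the experts below note, only a randomised comparison can settle relative efficacy.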
Stan Riddell, an immunology professor at the Fred Hutchinson Cancer Center, believes Orca T is highly promising. “Orca has advanced cell selection processes with innovative methodology and can engineer grafts with greater precision to add cell subsets that may further contribute to beneficial outcomes,” he says. “Their results in phase 1 and phase 2 studies are very exciting and offer the potential of providing a new standard of care for stem cell transplant.”
However, though it is an “intriguing step,” there’s a need for further testing, according to Jeff Szer, a professor of haematology at the Peter MacCallum Cancer Centre at the Royal Melbourne Hospital.
“The numbers tested were tiny and comparing the outcomes to anything from a phase 1/2 setting is risky,” says Szer. “Until a randomised study is done, we can make no assumption about the relative efficacy of this approach. But the holy grail of separating GVHD and GVL is still there and this is a step towards realising that dream.”
The Future
The team is soon starting phase 3 trials for Orca T. Its previous success has led them to develop Orca Q, a cell therapy for patients who can't find an exact donor match. Transplants for patients who are only a half-match or mismatched are not widely used because there is a greater risk of GVHD. Orca Q has the potential to control GVHD even in these mismatched transplants and to improve access for many more patients.
Fernhoff hopes they’ll be able to help people not just with blood cancers but also with other blood and immune disorders. If a patient has a debilitating disease which isn't life threatening, the risk of GVHD outweighs the potential benefits of a stem cell transplant. The Orca products could take away that risk.
Meanwhile, Robyn has no regrets about participating in the Phase 2 trial. “It was a serious decision to make but I'm forever grateful that I did,” she says. “I have resumed a quality of life aligned with how I felt pre-transplant. I have not had a single issue with GVHD.”
“I want to be able to get one of these products to every patient who could benefit from it,” Fernhoff says. “It's really exciting to think about how Orca's products could be applied to all sorts of autoimmune disorders.”
COVID Variants Are Like “a Thief Changing Clothes” – and Our Camera System Barely Exists
Whether it's "natural selection" as Darwin called it, or "mutating" as the X-Men called it, living organisms change over time, developing thumbs or more efficient protein spikes, depending on the organism and the demands of its environment. The coronavirus that causes COVID-19, SARS-CoV-2, is no exception, and now, after the virus has infected millions of people around the globe for more than a year, scientists are beginning to see those changes.
The notorious variants that have popped up include B.1.1.7, sometimes called the UK variant, as well as P.1 and B.1.351, which seem to have emerged in Brazil and South Africa respectively. As vaccinations pick up pace, officials are warning that now is not the time to become complacent or relax restrictions, because the variants aren't well understood.
Some appear to be more transmissible, and deadlier, while others can evade the immune system's defenses better than earlier versions of the virus, potentially undermining the effectiveness of vaccines to some degree. Genomic surveillance, the process of sequencing the genetic code of the virus widely to observe changes and patterns, is a critical way that scientists can keep track of its evolution and work to understand how the variants might affect humans.
"It's like a thief changing clothes"
It's important to note that viruses mutate all the time. If there were funding and personnel to sequence the genome of every sample of the virus, scientists would see thousands of mutations. Not every variant deserves our attention. The vast majority of mutations are not important at all, but recognizing those that are is a crucial tool in getting and staying ahead of the virus. The work of sequencing, analyzing, observing patterns, and using public health tools as necessary is complicated and confusing to those without years of specialized training.
Jeremy Kamil, associate professor of microbiology and immunology at LSU Health Shreveport, in Louisiana, says that the variants developing are like a thief changing clothes. The thief goes in your house, steals your stuff, then leaves and puts on a different shirt and a wig, in the hopes you won't recognize them. Genomic surveillance catches the "thief" even in those different clothes.
Understanding variants, both the uninteresting ones and the potentially concerning ones, gives public health officials and researchers at different levels a useful set of tools. Locally, knowing which variants are circulating in the community helps leaders know whether mask mandates and similar measures should be implemented or discontinued, or whether businesses and schools can open relatively safely.
There's more to it than observing new variants
Analysis is complex, particularly when it comes to understanding which variants are of concern. "So the question is always if a mutation becomes common, is that a random occurrence?" says Phoebe Lostroh, associate professor of molecular biology at Colorado College. "Or is the variant the result of some kind of selection because the mutation changes some property about the virus that makes it reproduce more quickly than variants of the virus that don't have that mutation? For a virus, [mutations can affect outcomes like] how much it replicates inside a person's body, how much somebody breathes it out, whether the particles that somebody might breathe in get smaller and can lead to greater transmission."
Along with all of those factors, accurate and useful genomic surveillance requires an understanding of where variants are occurring, how they are related, and an examination of why they might be prevalent.
For example, if a potentially worrisome variant appears in a community and begins to spread very quickly, it's not time to raise a public health alarm until several important questions have been answered, such as whether the variant is spreading due to specific events, or if it's happening because the mutation has allowed the virus to infect people more efficiently. Kamil offered a hypothetical scenario to explain: Imagine that a member of a community became infected and the virus mutated. That person went to church and three more people were infected, but one of them went to a karaoke bar and while singing infected 100 other people. Examining the conditions under which the virus has spread is, therefore, an essential part of untangling whether a mutation itself made the virus more transmissible or if an infected person's behaviors contributed to a local outbreak.
One of the tricky things about variants is recognizing the point at which they move from interesting, to concerning at a local level, to dangerous in a larger context. Genomic sequencing can help with that, but only when it's coordinated. When the same mutation occurs frequently, but is localized to one region, it's a concern, but when the same mutation happens in different places at the same time, it's much more likely that the "virus is learning that's a good mutation," explains Kamil.
The process is called convergent evolution, and it was a fascinating topic long before COVID. Just as your heritage can be traced through DNA, so can that of viruses, and when separate lineages develop similar traits it's almost like scientists can see evolution happening in real time. A mutation to SARS-CoV-2 that happens in more than one place at once is a mutation that makes it easier in some way for the virus to survive and that is when it may become alarming. The widespread, documented variants P.1 and B.1.351 are examples of convergence because they share some of the same virulent mutations despite having developed thousands of miles apart.
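The convergence check described above is simple to express in code. A minimal sketch, using small illustrative subsets of each lineage's spike mutations (real lineage definitions contain many more):

```python
# Illustrative subsets of spike mutations for three well-known lineages.
lineages = {
    "B.1.1.7": {"N501Y", "P681H", "del69-70"},
    "P.1":     {"N501Y", "E484K", "K417T"},
    "B.1.351": {"N501Y", "E484K", "K417N"},
}

def convergent_mutations(lineages, min_lineages=2):
    """Return mutations that appear independently in at least `min_lineages`."""
    counts = {}
    for muts in lineages.values():
        for m in muts:
            counts[m] = counts.get(m, 0) + 1
    return {m for m, c in counts.items() if c >= min_lineages}

print(sorted(convergent_mutations(lineages)))
# → ['E484K', 'N501Y']: N501Y arose in all three lineages,
#   E484K independently in both P.1 and B.1.351
```

In practice this kind of analysis is run over phylogenetic trees of thousands of sequences rather than flat sets, but the underlying question is the same: which mutations keep showing up in unrelated branches?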
However, even variants that are emerging in different places at the same time don't present the kind of threat SARS-CoV-2 did in 2019. "This is nature," says Kamil. "It just means that this virus will not easily be driven to extinction or complete elimination by vaccines." Although a person who has already had COVID-19 can be reinfected with a variant, "it is almost always much milder disease" than the original infection, Kamil adds. Rather than causing full-fledged disease, variants have the potential to "penetrate herd immunity, spreading relatively quietly among people who have developed natural immunity or been vaccinated, until the virus finds someone who has no immunity yet, and that person would be at risk of hospitalization-grade severe disease or death."
Surveillance and predictions
According to Lostroh, genomic surveillance can help scientists predict what's going to happen. "With the British strain, for instance, that's more transmissible, you can measure how fast it's doubling in the population and you can sort of tell whether we should take more measures against this mutation. Should we shut things down a little longer because that mutation is present in the population? That could be really useful if you did enough sampling in the population that you knew where it was," says Lostroh. If, for example, the more transmissible strain was present in 50 percent of cases, but in another county or state it was barely present, it would allow for rolling lockdowns instead of sweeping measures.
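The doubling measurement Lostroh describes is basic exponential-growth arithmetic. A quick sketch, with made-up case counts purely for illustration:

```python
from math import log

def doubling_time(count_then, count_now, days_elapsed):
    """Estimate doubling time in days, assuming exponential growth."""
    growth_rate = log(count_now / count_then) / days_elapsed
    return log(2) / growth_rate

# Hypothetical: a variant goes from 40 to 160 sequenced cases in 14 days,
# i.e. two doublings, so the doubling time works out to 7 days.
t = doubling_time(40, 160, 14)
print(f"Doubling time: {t:.1f} days")  # → Doubling time: 7.0 days
```

The catch, as the article notes, is sampling: the estimate is only as good as the fraction of cases that actually get sequenced in each region.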
Variants are also extremely important when it comes to the development, manufacture, and distribution of vaccines. "You're also looking at medical countermeasures, such as whether your vaccine is still effective, or if your antiviral needs to be updated," says Lane Warmbrod, a senior analyst and research associate at Johns Hopkins Center for Health Security.
Properly funded and extensive genomic surveillance could eventually help control endemic diseases, too, like the seasonal flu, or other common respiratory infections. Kamil says he envisions a future in which genomic surveillance allows for prediction of sickness just as the weather is predicted today. "It's a 51 for infection today at the San Francisco Airport. There's been detection of some respiratory viruses," he says, offering an example. He says that if you're a vulnerable person, if you're immune-suppressed for some reason, you may want to wear a mask based on the sickness report.
The U.S. has the ability, but lacks standards
The benefits of widespread genomic surveillance are clear, and the United States certainly has the necessary technology, equipment, and personnel to carry it out. But it's not happening at the speed and scale needed for the country to reap those benefits.
"The numbers are improving," said Kamil. "We're probably still at less than half a percent of all the samples that have been taken have been sequenced since the beginning of the pandemic."
Although there's no consensus on how many sequences are needed for a robust surveillance program, modeling performed by the company Illumina suggests about 5 percent of positive tests should be sequenced. The reasons the U.S. has lagged in implementing a sequencing program are complex and varied, but solvable.
Perhaps the most important element that is currently missing is leadership. In order to conduct an effective genomic surveillance program, there need to be standards. The Johns Hopkins Center for Health Security recently published a paper with recommendations as to what kinds of elements need to be standardized in order to make the best use of sequencing technology and analysis.
"Along with which bioinformatic pipelines you're going to use to do the analyses, which sequencing strategy protocol are you going to use, what's your sampling strategy going to be, how the data is going to be reported, what data gets reported," says Warmbrod. Currently, there's no guidance from the CDC on any of those things. So, while scientists can collect and report information, they may be collecting and reporting different information that isn't comparable, making it less useful for public health measures and vaccine updates.
Globally, one of the most important tools in making the information from genomic surveillance useful is GISAID, a platform designed for scientists to share -- and, importantly, to be credited for -- their data regarding genetic sequences of influenza. Originally, it was launched as a database of bird flu sequences, but has evolved to become an essential tool used by the WHO to make flu vaccine virus recommendations each year. Scientists who share their credentials have free access to the database, and anyone who uses information from the database must credit the scientist who uploaded that information.
Safety, logistics, and funding matter
Scientists at university labs and other small organizations have been uploading sequences to GISAID almost from the beginning of the pandemic, but their funding is generally limited, and there are no standards regarding information collection or reporting. Private, for-profit labs haven't had motivation to set up sequencing programs, although many of them have the logistical capabilities and funding to do so. Public health departments are understaffed, underfunded, and overwhelmed.
University labs may also be limited by safety concerns. The SARS-CoV-2 virus is dangerous, and there's a question of how samples should be transported to labs for sequencing.
Larger, for-profit organizations often have the tools and distribution capabilities to safely collect and sequence samples, but there hasn't been a profit motive. Genomic sequencing is less expensive now than ever before, but even at $100 per sample, the cost adds up -- not to mention the cost of employing a scientist with the proper credentials to analyze the sequence.
The path forward
The recently passed COVID-19 relief bill does have some funding to address genomic sequencing. Specifically, the American Rescue Plan Act includes $1.75 billion in funding for the Centers for Disease Control and Prevention's Advanced Molecular Detection (AMD) program. In an interview last month, CDC Director Rochelle Walensky said that the additional funding will be "a dial. And we're going to need to dial it up." AMD has already announced a collaboration called the Sequencing for Public Health Emergency Response, Epidemiology, and Surveillance (SPHERES) Initiative that will bring together scientists from public health, academic, clinical, and non-profit laboratories across the country with the goal of accelerating sequencing.
Such a collaboration is a step toward following the recommendations in the paper Warmbrod coauthored. Building capacity now, creating a network of labs, and standardizing procedures will mean improved health in the future. "I want to be optimistic," she says. "The good news is there are a lot of passionate, smart, capable people who are continuing to work with government and work with different stakeholders." She cautions, however, that without a national strategy we won't succeed.
"If we maximize the potential and create that framework now, we can also use it for endemic diseases," she says. "It's a very helpful system for more than COVID if we're smart in how we plan it."
Since the beginning of life on Earth, plants have been naturally converting sunlight into energy. This photosynthesis process that's effortless for them has been anything but for scientists, who have been trying to achieve artificial photosynthesis for the past half-century with the goal of creating a carbon-neutral fuel. Such a fuel could be a game-changer: rather than putting CO2 back into the atmosphere like traditional fuels do, it would take CO2 out of the atmosphere and convert it into usable energy.
If given the option between a carbon-neutral fuel at the gas station and a fuel that produces carbon dioxide in spades -- and if costs and effectiveness were equal -- who wouldn't choose the one best for the planet? That's the endgame scientists are after. A consumer switch to clean fuel could have a huge impact on our global CO2 emissions.
Up until this point, the methods used to make liquid fuel from atmospheric CO2 have been expensive, not efficient enough to really get off the ground, and often resulted in unwanted byproducts. But now, a new technology may be the key to unlocking the full potential of artificial photosynthesis. At the very least, it's a step forward and could help make a dent in atmospheric CO2 reduction.
"It's an important breakthrough in artificial photosynthesis," says Qian Wang, a researcher in the Department of Chemistry at Cambridge University and lead author on a recent study published in Nature about an innovation she calls "photosheets."
These photosheets convert CO2, sunlight, and water into a carbon-neutral liquid fuel called formic acid without the aid of electricity. They're made of semiconductor powders that absorb sunlight. When in the presence of water and CO2, the electrons in the powders become excited and join with the CO2 and protons from the water molecules, reducing the CO2 in the process. The chemical reaction results in the production of formic acid, which can be used directly or converted to hydrogen, another clean energy fuel.
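The chemistry described above can be summarised with the standard photocatalytic half-reactions for converting CO2 to formic acid. This is a textbook-level sketch, not the specific mechanism reported in the Nature paper:

```latex
% Photoexcited electrons reduce CO2 to formic acid:
\mathrm{CO_2 + 2H^+ + 2e^- \longrightarrow HCOOH}
% while water oxidation supplies the protons and electrons:
\mathrm{2H_2O \longrightarrow O_2 + 4H^+ + 4e^-}
% Overall, driven by sunlight:
\mathrm{2CO_2 + 2H_2O \xrightarrow{\;h\nu\;} 2HCOOH + O_2}
```

The balanced overall equation makes the appeal plain: the only inputs are CO2, water, and light, and the fuel output carries the captured carbon.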
In the past, it's been difficult to reduce CO2 without creating a lot of unwanted byproducts. According to Wang, this new conversion process achieves the reduction and fuel creation with almost no byproducts.
The Cambridge team's new technology is a first and certainly momentous, but they're far from the only team to have produced fuel from CO2 using some form of artificial photosynthesis. More and more scientists are aiming to perfect the method in hopes of producing a truly sustainable, photosynthetic fuel capable of lowering carbon emissions.
Thanks to advancements in nanoscience, which has led to better control of materials, more successes are emerging. A team at the University of Illinois at Urbana-Champaign, for example, used gold nanoparticles as the photocatalysts in their process.
"My group demonstrated that you could actually use gold nanoparticles both as a light absorber and a catalyst in the process of converting carbon dioxide to hydrocarbons such as methane, ethane and propane fuels," says professor Prashant Jain, co-author of the study. Not only are gold nanoparticles great at absorbing light, they don't degrade as quickly as other metals, which makes them more sustainable.
That said, Jain's team, like every other research team working on artificial photosynthesis including the Cambridge team, is grappling with efficiency issues. Jain says that all parts of the process need to be optimized so the reaction can happen as quickly as possible.
"You can't just improve one [aspect], because that can lead to a decrease in performance in some other aspects," Jain explains.
The Cambridge team is currently experimenting with a range of catalysts to improve their device's stability and efficiency. Virgil Andrei, who is working on an artificial leaf design that was developed at Cambridge in 2019, was recently able to improve the performance and selectivity of the device. Now the leaf's solar-to-CO2 energy conversion efficiency is 0.2%, twice its previous efficiency.
The latest version also directly produces liquid fuel, which is easier to transport and use commercially.
In determining the efficiency of a fuel production method, one must consider how sustainable it is at every stage. That involves accounting for every step that requires extra energy. According to Jain, in order to use CO2 for fuel production, you first have to condense the CO2, which takes energy. And on the fuel production side, once the chemical reaction has run, the products and byproducts need to be separated, which also takes energy.
To be truly sustainable, each part of the conversion system also needs to be durable. If parts need to be replaced often, or regularly maintained, that counts against it. Then you have to account for the system's reuse cycle. If you extract CO2 from the environment and convert it into fuel that's then put into a fuel cell, it's going to release CO2 at the other end. In order to create a fully green, carbon-neutral fuel source, that same amount of CO2 needs to be trapped and reintroduced back into the fuel conversion system.
"The cycle continues, and at each point, you will see a loss in efficiency, and depending on how much you [may also] see a loss in yield," says Jain. "And depending on what those efficiencies are at each one of those points will determine whether or not this process can be sustainable."
The science is at least a decade away from offering a competitive sustainable fuel option at scale. Streamlining a process to mimic what plants have perfected over billions of years is no small feat, but an ever-growing community of researchers using rapidly advancing technology is driving progress forward.