What’s the Right Way to Regulate Gene-Edited Crops?
In the next few decades, humanity faces its biggest food crisis since the invention of the plow. The planet's population, currently 7.6 billion, is expected to reach 10 billion by 2050; to avoid mass famine, according to the World Resources Institute, we'll need to produce 70 percent more calories than we do today.
Meanwhile, climate change will bring intensifying assaults by heat, drought, storms, pests, and weeds, depressing farm yields around the globe. Epidemics of plant disease—already laying waste to wheat, citrus, bananas, coffee, and cacao in many regions—will spread ever further through the vectors of modern trade and transportation.
So here's a thought experiment: Imagine that a cheap, easy-to-use, and rapidly deployable technology could make crops more fertile and strengthen their resistance to these looming threats. Imagine that it could also render them more nutritious and tastier, with longer shelf lives and less vulnerability to damage in shipping—adding enhancements to human health and enjoyment, as well as reduced food waste, to the possible benefits.
Finally, imagine that crops bred with the aid of this tool might carry dangers. Some could contain unsuspected allergens or toxins. Others might disrupt ecosystems, affecting the behavior or very survival of other species, or infecting wild relatives with their altered DNA.
Now ask yourself: If such a technology existed, should policymakers encourage its adoption, or ban it due to the risks? And if you chose the former alternative, how should crops developed by this method be regulated?
In fact, this technology does exist, though its use remains mostly experimental. It's called gene editing, and in the past five years it has emerged as a potentially revolutionary force in many areas—among them, treating cancer and genetic disorders; growing transplantable human organs in pigs; controlling malaria-spreading mosquitoes; and, yes, transforming agriculture. Several versions are currently available, the newest and nimblest of which goes by the acronym CRISPR.
Gene editing is far simpler and more efficient than older methods used to produce genetically modified organisms (GMOs). Unlike those methods, moreover, it can be used in ways that leave no foreign genes in the target organism—an advantage that proponents argue should comfort anyone leery of consuming so-called "Frankenfoods." But debate persists over what precautions must be taken before these crops come to market.
Recently, two of the world's most powerful regulatory bodies offered very different answers to that question. The United States Department of Agriculture (USDA) declared in March 2018 that it "does not currently regulate, or have any plans to regulate" plants that are developed through most existing methods of gene editing. The Court of Justice of the European Union (ECJ), by contrast, ruled in July that such crops should be governed by the same stringent regulations as conventional GMOs.
Each announcement drew protests, for opposite reasons. Anti-GMO activists assailed the USDA's statement, arguing that all gene-edited crops should be tested and approved before marketing. "You don't know what those mutations or rearrangements might do in a plant," warned Michael Hansen, a senior scientist with the advocacy group Consumers Union. Biotech boosters griped that the ECJ's decision would stifle innovation and investment. "By any sensible standard, this judgment is illogical and absurd," wrote the British newspaper The Observer.
Yet some experts suggest that the broadly permissive American approach and the broadly restrictive EU policy are equally flawed. "What's behind these regulatory decisions is not science," says Jennifer Kuzma, co-director of the Genetic Engineering and Society Center at North Carolina State University, a former advisor to the World Economic Forum, who has researched and written extensively on governance issues in biotechnology. "It's politics, economics, and culture."
The U.S. Welcomes Gene-Edited Food
Humans have been modifying the genomes of plants and animals for 10,000 years, using selective breeding—a hit-or-miss method that can take decades or more to deliver rewards. In the mid-20th century, we learned to speed up the process by exposing organisms to radiation or mutagenic chemicals. But it wasn't until the 1980s that scientists began modifying plants by altering specific stretches of their DNA.
Today, about 90 percent of the corn, cotton and soybeans planted in the U.S. are GMOs; such crops cover nearly 4 million square miles (10 million square kilometers) of land in 29 countries. Most of these plants are transgenic, meaning they contain genes from an unrelated species—often as biologically alien as a virus or a fish. Their modifications are designed primarily to boost profit margins for mechanized agribusiness: allowing crops to withstand herbicides so that weeds can be controlled by mass spraying, for example, or to produce their own pesticides to lessen the need for chemical inputs.
In the early days, the majority of GM crops were created by extracting the gene for a desired trait from a donor organism, multiplying it, and attaching it to other snippets of DNA—usually from a microbe called an agrobacterium—that could help it infiltrate the cells of the target plant. Biotechnologists injected these particles into the target, hoping at least one would land in a place where it would perform its intended function; if not, they kept trying. The process was quicker than conventional breeding, but still complex, scattershot, and costly.
Because agrobacteria can cause plant tumors, Kuzma explains, policymakers in the U.S. decided to regulate GMO crops under an existing law, the Plant Pest Act of 1957, which addressed dangers like imported trees infested with invasive bugs. Every GMO containing the DNA of agrobacterium or another plant pest had to be tested to see whether it behaved like a pest, and undergo a lengthy approval process. By 2010, however, new methods had been developed for creating GMOs without agrobacteria; such plants could typically be marketed without pre-approval.
Soon after that, the first gene-edited crops began appearing. If old-school genetic engineering was a shotgun, techniques like TALEN and CRISPR were a scalpel—or the search-and-replace function on a computer program. With CRISPR/Cas9, for example, an enzyme that bacteria use to recognize and chop up hostile viruses is reprogrammed to find and snip out a desired bit of a plant or other organism's DNA. The enzyme can also be used to insert a substitute gene. If a DNA sequence is simply removed, or the new gene comes from a similar species, the changes in the target plant's genotype and phenotype (its general characteristics) may be no different from those that could be produced through selective breeding. If a foreign gene is added, the plant becomes a transgenic GMO.
This development, along with the emergence of non-agrobacterium GMOs, eventually prompted the USDA to propose a tiered regulatory system for all genetically engineered crops, beginning with an initial screening for potentially hazardous metabolites or ecological impacts. (The screening was intended, in part, to guard against the "off-target effects"—stray mutations—that occasionally appear in gene-edited organisms.) If no red flags appeared, the crop would be approved; otherwise, it would be subject to further review, and possible regulation.
The plan was unveiled in January 2017, during the last week of the Obama presidency. Then, under the Trump administration, it was shelved. Although the USDA continues to promise a new set of regulations, the only hint of what they might contain has been Secretary of Agriculture Sonny Perdue's statement last March that gene-edited plants would remain unregulated if they "could otherwise have been developed through traditional breeding techniques, as long as they are not plant pests or developed using plant pests."
Because transgenic plants could not be "developed through traditional breeding techniques," this statement could be taken to mean that gene editing in which foreign DNA is introduced might actually be regulated. But because the USDA regulates conventional transgenic GMOs only if they trigger the plant-pest stipulation, experts assume gene-edited crops will face similarly limited oversight.
Meanwhile, companies are already teeing up gene-edited products for the U.S. market. An herbicide-resistant oilseed rape, developed using a proprietary technique, has been available since 2016. A cooking oil made from TALEN-tweaked soybeans, designed to have a healthier fatty-acid profile, is slated for release within the next few months. A CRISPR-edited "waxy" corn, designed with a starch profile ideal for processed foods, should be ready by 2021.
In all likelihood, none of these products will have to be tested for safety.
In the E.U., Stricter Rules Apply
Now let's look at the European Union. Since the late 1990s, explains Gregory Jaffe, director of the Project on Biotechnology at the Center for Science in the Public Interest, the EU has had a "process-based trigger" for genetically engineered products: "If you use recombinant DNA, you are going to be regulated." All foods and animal feeds must be approved and labeled if they consist of or contain more than 0.9 percent GM ingredients. (In the U.S., "disclosure" of GM ingredients is mandatory, if someone asks, but labeling is not required.) The only GM crop that can be commercially grown in EU member nations is a type of insect-resistant corn, though some countries allow imports.
European scientists helped develop gene editing, and they—along with the continent's biotech entrepreneurs—have been busy developing applications for crops. But European farmers seem more divided over the technology than their American counterparts. The main French agricultural trades union, for example, supports research into non-transgenic gene editing and its exemption from GMO regulation. But it was the country's small-farmers' union, the Confédération Paysanne, along with several allied groups, that in 2015 submitted a complaint to the ECJ, asking that all plants produced via mutagenesis—including gene editing—be regulated as GMOs.
At this point, it should be mentioned that in the past 30 years, large population studies have found no sign that consuming GM foods is harmful to human health. GMO critics can, however, point to evidence that herbicide-resistant crops have encouraged overuse of herbicides, giving rise to poison-proof "superweeds," polluting the environment with suspected carcinogens, and inadvertently killing beneficial plants. Those allegations were key to the French plaintiffs' argument that gene-edited crops might similarly do unexpected harm. (Disclosure: Leapsmag's parent company, Bayer, recently acquired Monsanto, a maker of herbicides and herbicide-resistant seeds. Also, Leaps by Bayer, an innovation initiative of Bayer and Leapsmag's direct funder, has funded a biotech startup called JoynBio that aims to reduce the amount of nitrogen fertilizer required to grow crops.)
In the end, the EU court found in the Confédération's favor on gene editing—though the court maintained the regulatory exemption for mutagenesis induced by chemicals or radiation, citing the "long safety record" of those methods.
The ruling was "scientifically nonsensical," fumes Rodolphe Barrangou, a French food scientist who pioneered CRISPR while working for DuPont in Wisconsin and is now a professor at NC State. "It's because of things like this that I'll never go back to Europe."
Nonetheless, the decision was consistent with longstanding EU policy on crops made with recombinant DNA. Given the difficulty and expense of getting such products through the continent's regulatory system, many other European researchers may wind up following Barrangou to America.
Getting to the Root of the Cultural Divide
What explains the divergence between the American and European approaches to GMOs—and, by extension, gene-edited crops? In part, Jennifer Kuzma speculates, it's that Europeans have a different attitude toward eating. "They're generally more tied to where their food comes from, where it's produced," she notes. They may also share a mistrust of government assurances on food safety, born of the region's Mad Cow scandals of the 1980s and '90s. In Catholic countries, consumers may have misgivings about tinkering with the machinery of life.
But the principal factor, Kuzma argues, is that European and American agriculture are structured differently. "GM's benefits have mostly been designed for large-scale industrial farming and commodity crops," she says. That kind of farming is dominant in the U.S., but not in Europe, leading to a different balance of political power. In the EU, there was less pressure on decisionmakers to approve GMOs or exempt gene-edited crops from regulation—and more pressure to adopt a GM-resistant stance.
Such dynamics may be operating in other regions as well. In China, for example, the government has long encouraged research in GMOs; a state-owned company recently acquired Syngenta, a Swiss-based multinational corporation that is a leading developer of GM and gene-edited crops. GM animal feed and cooking oil can be freely imported. Yet commercial cultivation of most GM plants remains forbidden, out of deference to popular suspicions of genetically altered food. "As a new item, society has debates and doubts on GMO techniques, which is normal," President Xi Jinping remarked in 2014. "We must be bold in studying it, [but] be cautious promoting it."
The proper balance between boldness and caution is still being worked out all over the world. Europe's process-based approach may prevent researchers from developing crops that, with a single DNA snip, could rescue millions from starvation. EU regulations will also make it harder for small entrepreneurs to challenge Big Ag with a technology that, as Barrangou puts it, "can be used affordably, quickly, scalably, by anyone, without even a graduate degree in genetics." America's product-based approach, conversely, may let crops with hidden genetic dangers escape detection. And by refusing to investigate such risks, regulators may wind up exacerbating consumers' doubts about GM and gene-edited products, rather than allaying them.
Perhaps the solution lies in combining both approaches, and adding some flexibility and nuance to the mix. "I don't believe in regulation by the product or the process," says CSPI's Jaffe. "I think you need both." Deleting a DNA base pair to silence a gene, for example, might be less risky than inserting a foreign gene into a plant—unless the deletion enables the production of an allergen, and the transgene comes from spinach.
Kuzma calls for the creation of "cooperative governance networks" to oversee crop genome editing, similar to bodies that already help develop and enforce industry standards in fisheries, electronics, industrial cleaning products, and (not incidentally) organic agriculture. Such a network could include farmers, scientists, advocacy groups, private companies, and governmental agencies. "Safety isn't an all-or-nothing concept," Kuzma says. "Science can tell you what some of the issues are in terms of risk and benefit, but it can't tell you what to regulate. That's a values-based decision."
By drawing together a wide range of stakeholders to make such decisions, she adds, "we're more likely to anticipate future consequences, and to develop a robust approach—one that not only seems more legitimate to people, but is actually just plain old better."
COVID-19 prompted numerous companies to reconsider their approach to the future of work. Many leaders were reluctant to maintain hybrid and remote work options after vaccines became widely available. Yet the emergence of dangerous COVID variants such as Omicron has shown the folly of this mindset.
To mitigate the risks of new variants and other public health threats, and to satisfy the large majority of employees who, in multiple surveys, express a strong preference for a flexible hybrid or fully remote schedule, leaders are increasingly accepting that hybrid and remote options represent the future of work. No wonder that a February 2022 survey by the Federal Reserve Bank of Richmond showed that more and more firms are offering hybrid and fully remote work options; these firms expect to have more remote workers, and more geographically distributed workers, next year.
Although hybrid and remote work mitigates public health risks, it poses another set of concerns for employee wellbeing: the threat of proximity bias. This term refers to leaders' tendency to favor the employees they see in person, and to the resulting inequality among office-centric, hybrid, and fully remote employees that can damage work culture.
The difference in time spent in the office leads to concerns ranging from decreased career mobility for those who spend less facetime with their supervisor to resentment against the staff who have the most flexibility in where to work. In fact, a January 2022 Slack survey of over 10,000 knowledge workers and their leaders shows that proximity bias is the top concern about hybrid and remote work, cited by 41% of executives.
To address this problem requires using best practices based on cognitive science for creating a culture of “Excellence From Anywhere.” This solution is based on guidance that I developed for leaders at 17 pioneering organizations for a company culture fit for the future of work.
Protect from proximity bias via the "Excellence From Anywhere" strategy
So why haven’t firms addressed the obvious problem of proximity bias? Any reasonable external observer could predict the issues arising from differences of time spent in the office.
Unfortunately, leaders often fail to see the clear threat in front of their noses. You might have heard of black swans: low-probability, high-impact threats. Well, the opposite kind of threat is called a gray rhino: an obvious danger that we fail to see because of our mental blindspots. The scientific name for these blindspots is cognitive biases, and they cause leaders to resist best practices in transitioning to a hybrid-first model.
Leaders can address this by focusing on a shared culture of “Excellence From Anywhere.” This term refers to a flexible organizational culture that takes into account the nature of an employee's work and promotes evaluating employees based on task completion, allowing remote work whenever possible.
Addressing Resentments Due to Proximity Bias
The “Excellence From Anywhere” strategy addresses concerns about the treatment of remote workers by focusing on deliverables, regardless of where they are produced. Doing so also involves adopting best practices for hybrid and remote collaboration and innovation.
By valuing deliverables, collaboration, and innovation through a shared work culture of “Excellence From Anywhere,” you instill in your employees an orientation toward results. The core idea is to get all of your workforce to pull together to achieve business outcomes: the location doesn’t matter.
This work culture addresses concerns about fairness by reframing the conversation to focus on accomplishing shared goals, rather than the method of doing so. After all, no one wants their colleagues to have to commute out of spite.
This technique appeals to the tribal aspect of our brains. We are evolutionarily adapted to living in small tribal groups of 50-150 people. Spending different amounts of time in the office splits apart the work tribe into different tribes. However, cultivating a shared focus on business outcomes helps mitigate such divisions and create a greater sense of unity, alleviating frustrations and resentments. Doing so helps improve employee emotional wellbeing and facilitates good collaboration.
Solving the facetime concerns of proximity bias
But what about facetime with the boss? Addressing this problem requires shifting from traditional, high-stakes, large-scale quarterly or even annual performance evaluations to brief, low-stakes weekly or biweekly check-ins, conducted one-on-one in person or by videoconference.
Supervisees agree with their supervisor on three to five weekly or biweekly performance goals. Then, 72 hours before their check-in meeting, they send their boss a brief report (under a page) covering how they did on these goals, what challenges they faced and how they overcame them, a quantitative self-evaluation, and proposed goals for the next period. Twenty-four hours before the meeting, the supervisor replies with a paragraph of initial impressions.
At the one-on-one, the supervisor reinforces positive aspects of performance, coaches the supervisee on how to solve challenges better, agrees on or revises the goals for the next period, and affirms or revises the performance evaluation. That evaluation feeds into a continuous performance and promotion review system, which can replace or complement a more thorough annual evaluation.
This type of brief and frequent performance evaluation meeting ensures that the employee’s work is integrated with efforts by the supervisor’s other employees, thereby ensuring more unity in achieving business outcomes. It also mitigates concerns about facetime, since all get at least some personalized attention from their team leader. But more importantly, it addresses the underlying concerns about career mobility by giving all staff a clear indication of where they stand at all times. After all, it’s hard to tell how much any employee should worry about not being able to chat by the watercooler with their boss: knowing exactly where they stand is the key concern for employees, and they can take proactive action if they see their standing suffer.
Such best practices help integrate employees into a work culture fit for the future of work while fostering good relationships with managers. Research shows supervisor-supervisee relationships are the most critical ones for employee wellbeing, engagement, and retention.
Conclusion
You don’t have to be the CEO to implement these techniques. Lower-level leaders of small rank-and-file teams can implement these shifts within their own teams, adapting their culture and performance evaluations. And if you are a staff member rather than a leader, send this article to your supervisor and other employees at your company: start a conversation about the benefits of addressing proximity bias using such research-based best practices.
When the COVID-19 pandemic began spreading around the world in late 2019, Peter Hotez and Maria Elena Bottazzi set out to create a low-cost vaccine that would help inoculate populations in low- and middle-income countries. The scientists, drawing on their prior experience of developing inexpensive vaccines for the world’s poor, anticipated that the global rollout of COVID-19 jabs would be marked by serious inequities. They wanted to create a patent-free vaccine to bridge this gap, but the U.S. government did not seem impressed, forcing the researchers to turn to private philanthropies for funds.
Hotez and Bottazzi, both scientists at the Texas Children’s Hospital Center for Vaccine Development at Baylor College of Medicine, raised about $9 million in private funds. Meanwhile, the U.S. government’s contribution stood at $400,000.
“That was a very tough time early on in the pandemic, you know, trying to do the work and raise the money for it at the same time,” says Hotez, who was nominated in February for a Nobel Peace Prize with Bottazzi for their COVID-19 vaccine. He adds that at the beginning of the pandemic, governments emphasized speed, innovation and rapidly immunizing populations in North America and Europe with little consideration for poorer countries. “We knew this [vaccine] was going to be the answer to global vaccine inequality, but I just wish the policymakers had felt the same,” says Hotez.
Over the past two years, the world has witnessed 488 million COVID-19 infections and over 6.1 million deaths. More than 11 billion vaccine doses have been administered worldwide; however, the global rollout of COVID-19 vaccines has been marked by alarming socio-economic inequities. For instance, 72 percent of the population in high-income countries has received at least one dose of the vaccine, whereas the figure stands at 15 percent in low-income countries.
This inequity is worsening vulnerabilities across the world, says Lawrence Young, a virologist and co-lead of the Warwick Health Global Research Priority at the UK-based University of Warwick. “As long as the virus continues to spread and replicate, particularly in populations who are under-vaccinated, it will throw up new variants and these will remain a continual threat even to those countries with high rates of vaccination,” says Young. “Therefore, it is in all our interests to ensure that vaccines are distributed equitably across the world.”
The vaccine developed by Hotez and Bottazzi recently received emergency use authorisation in India, which plans to manufacture 100 million doses every month. Dubbed ‘Corbevax’ by its Indian maker, Biological E Limited, the vaccine is now being administered in India to children aged 12-14. The patent-free arrangement means that other low- and middle-income countries could also produce and distribute the vaccine locally.
“When your house is on fire, you don't call the patent attorney, you call the fire department,” says Hotez, commenting on the intellectual property rights waiver. “We wanted to be the fire department.”
The Inequity
Vaccine equity simply means that all people, irrespective of their location, should have equal access to vaccines. However, data suggests that the global COVID-19 vaccine rollout has favoured those in richer countries. For instance, high-income countries like the UAE, Portugal, Chile, Singapore, Australia, Malta, Hong Kong and Canada have partially vaccinated over 85 percent of their populations. This percentage in poorer countries, meanwhile, is abysmally low – 2.1 percent in Yemen, 4.6 in South Sudan, 5 in Cameroon, 9.9 in Burkina Faso, 10 in Nigeria, 12 in Somalia, 12 in Congo, 13 in Afghanistan and 21 in Ethiopia.
COVID-19 vaccination coverage is particularly low in African countries. According to Shabir Madhi, a vaccinologist at the University of the Witwatersrand, Johannesburg, and co-director of the African Local Initiative for Vaccinology Expertise, vaccine access and inequity remain a challenge across the continent. Madhi adds that the lack of vaccine access has shaped the pandemic’s trajectory in Africa, though a majority of its people have now developed immunity through natural infection. “This has come at a high cost of loss of lives,” he says.
COVID-19 vaccines mean a significant financial burden for poorer countries, which spend an average of $41 per capita annually on health, while the average cost of every COVID-19 vaccine dose ranges between $2 and $40 in addition to a distribution cost of $3.70 per person for two doses. In December last year, the World Health Organisation (WHO) set a goal of immunizing 70 percent of the population of all countries by mid-2022. This, however, means that low-income countries would have to increase their health expenditure by an average of 56.6 percent to cover the cost, as opposed to 0.8 percent in high-income countries.
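Those per-person figures line up with the cited budget increase. As a rough, back-of-the-envelope illustration (assuming a mid-range dose price of $10, a figure chosen here for the example rather than taken from the reporting), the arithmetic looks like this:

```python
# Rough check of per-person vaccination cost against the cited health budgets.
# The $10 dose price is an assumed mid-range value within the quoted $2-$40
# band; the distribution cost and per-capita health spending are as cited.
dose_price = 10.00            # USD per dose (assumption for illustration)
doses_per_person = 2
distribution_cost = 3.70      # USD per person for two doses (cited)
annual_health_spend = 41.00   # USD per capita in poorer countries (cited)

cost_per_person = dose_price * doses_per_person + distribution_cost
increase = cost_per_person / annual_health_spend
print(f"${cost_per_person:.2f} per person, about {increase:.0%} of annual health spending")
```

With these assumed numbers, fully vaccinating one person costs $23.70, roughly 58 percent of the $41 annual per-capita health budget, in the same ballpark as the 56.6 percent average increase cited above.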
Reflecting on the factors that have driven global inequity in COVID-19 vaccine distribution, Andrea Taylor, assistant director of programs at the Duke Global Health Innovation Center, says that wealthy nations took the risk of investing heavily in the development and scaling up of COVID-19 vaccines – at a time when there was little evidence to show that vaccines would work. This reserved a place for these nations at the front of the queue when doses started rolling off production lines. Lower-income countries, meanwhile, could not afford such investments.
“Now, however, global supply is not the issue,” says Taylor. “We are making plenty of doses to meet global need. The main problem is infrastructure to get the vaccine where it is most needed in a predictable and timely way and to ensure that countries have all the support they need to store, transport, and use the vaccine once it is received.”
In addition to global inequities in vaccination coverage, there are inequities within nations. Taufique Joarder, vice-chairperson of Bangladesh’s Public Health Foundation, points to the situation in his country, where vaccination coverage in rural and economically disadvantaged communities has suffered owing to weak vaccine-promotion initiatives and the difficulty many people face in registering online for jabs.
Joarder also cites the example of the COVID-19 immunization drive for children aged 12 years and above. “[Children] are given the Pfizer vaccine, which requires an ultralow temperature for storage. This is almost impossible to administer in many parts of the country, especially the rural areas. So, a large proportion of the children are being left out of vaccination,” says Joarder, adding that Corbevax, which is cheaper and requires only standard refrigeration, “can be an excellent alternative to Pfizer for vaccinating rural children.”
Corbevax vs. mRNA Vaccines
As opposed to most other COVID-19 vaccines, which use the new messenger RNA (mRNA) technology, Corbevax is an “old school” vaccine, says Hotez. The vaccine is made through microbial fermentation in yeast, similar to the process used to produce the recombinant hepatitis B vaccine, which has been administered to children in several countries for decades. Hence, says Hotez, the technology to produce Corbevax at large scales is already in place in countries such as Vietnam, Bangladesh, India, Indonesia, Brazil, and Argentina, among many others.
“So if you want to rapidly develop and produce and empower low- and middle-income countries, this is the technology to do it,” he says.
The COVID-19 vaccines created by Pfizer-BioNTech and Moderna marked the first time that mRNA vaccine technology was approved for use. However, scientists like Young feel that there is “a need to be pragmatic and not seduced by new technologies when older, tried and tested approaches can also be effective.” Taylor, meanwhile, says that although mRNA vaccines have dominated the COVID-19 vaccine market in the U.S., “there is no clear grounding for this preference in the data we have so far.” She adds that there is also growing evidence that the immunity from these shots may not hold up as well over time as that of vaccines using different platforms.
“The mRNA vaccines are well suited to wealthy countries with sufficient ultra-cold storage and transportation infrastructure, but these vaccines are divas and do not travel well in the rest of the world,” says Taylor. “Global access to high-quality vaccines will require serious investment in other types of COVID-19 vaccines, such as the protein subunit platform used by Novavax and Corbevax. These require only standard refrigeration, can be manufactured using existing facilities all over the world, and are easy to transport.”
Joarder adds that Corbevax is cheaper because its developers waived their intellectual property rights. It could also serve as a booster in Bangladesh, where only five percent of the population has so far received booster doses. “If this vaccine is proved effective for heterologous boosting, [meaning] it works well and is well tolerated as a booster with other vaccines that are available in Bangladesh, this can be useful,” says Joarder.
According to Hotez, Corbevax can play several important roles: as a standalone adult or pediatric vaccine, and as a booster for other vaccines. Studies are underway to determine its effectiveness in these roles, he says.
Need for More Data
Biological E conducted two clinical trials involving 3,000 subjects in India and found Corbevax to be “safe and immunogenic,” with 90 percent effectiveness in preventing symptomatic infections from the original strain of COVID-19 and over 80 percent effectiveness against the Delta variant. The vaccine is currently in use in India and, according to Hotez, is in the pipeline at different stages in Indonesia, Bangladesh, and Botswana.
However, Corbevax is yet to receive emergency use approval from the WHO. Experts such as Joarder see the need for more trials and data before it is made available to the general population. He says that while the WHO’s emergency approval is essential for global scale-up of the vaccine, we need data to determine age-stratified efficacy of the vaccine and whether it can be used for heterologous boosting with other vaccines. “According to the most recent data, the 100 percent circulating variant in Bangladesh is Omicron. We need to know how effective is Corbevax against the Omicron variant,” says Joarder.
Shabir Madhi, a vaccinologist at the University of the Witwatersrand, Johannesburg, and co-director of the African Local Initiative for Vaccinology Expertise, says that a majority of people in Africa have now developed immunity through natural infection. “This has come at a high cost of loss of lives.”
Others, meanwhile, believe that making vaccines available to poorer countries is not enough to resolve the inequity. Young, the Warwick virologist, says that the global vaccination rollout has also suffered from a degree of vaccine hesitancy, echoing similar observations by President Biden and Pfizer’s CEO. He attributes the problem to poor communication about the benefits of vaccination. “The Corbevax vaccine [helps with the issues of] patent protection, vaccine storage and distribution, but governments need to ensure that their people are clearly informed.” Notably, however, some research has found higher vaccine willingness in lower-income countries than in the U.S.
Young also emphasized the importance of establishing local vaccination stations to improve access. For some countries, meanwhile, it may be too late. Speaking about the African continent, Madhi says that Corbevax has arrived following the peak of the crisis and won’t reverse the suffering and death that has transpired because of vaccine hoarding by high-income countries.
“The same goes for all the sudden donations from countries such as France, pretty much of little to no value when the pandemic is at its tail end,” says Madhi. “This, unfortunately, is a repeat of the swine flu pandemic in 2009, when vaccines only became available to Africa after the pandemic had very much subsided.”