The U.S. must fund more biotech innovation – or other countries will catch up faster than you think
The U.S. holds approximately 58 percent of the global biotech market, followed by China with 11 percent. This market share, however, is the product of years of past research and development (R&D) – a snapshot of what has already happened, not of what is coming. In the future, that share will decline unless the federal government invests to improve the quality and quantity of U.S. biotech research.
The effectiveness of current R&D can be evaluated in a variety of ways, such as money invested and the number of patents filed. According to the UNESCO Institute for Statistics, the U.S. spends approximately 2.7 percent of its GDP on R&D (about $476.5 billion), whereas China spends 2 percent (about $346.3 billion). However, investment levels do not necessarily translate into innovation.
Patents are a better indicator of innovation. The biotech industry relies on patents to protect its investments, making patenting a key step in translating scientific discoveries into products that can ultimately benefit patients. In 2020, China filed 1,497,159 patent applications, a growth rate of 6.9 percent; the U.S. filed 597,172, a decline of 3.9 percent. Measured by patents filed, China holds approximately 45 percent of the world share, compared with 18 percent for the U.S.
So how did we get here? The structure of academic science allows scientists to specialize, dedicating years to discovery research and to developing new inventions that biotech companies can then license. This makes academic science critical to innovation in the U.S. and abroad.
Academic scientists rely on government and foundation grants to pay for R&D, including salaries for faculty, investigators and trainees, as well as money for infrastructure, support personnel and research supplies. To cover these costs, academic scientists depend particularly on government support such as Research Project Grants, also known as R01 grants, the oldest grant mechanism at the National Institutes of Health. Unfortunately, this funding mechanism is extremely competitive: only about 20 percent of applications succeed. To maximize their chances of being funded, investigators tend to limit the innovation in their applications, since grant reviewers discourage projects that seem overambitious.
This approach affects the future success of the U.S. R&D enterprise. Less innovative work tends to produce results that are more obvious than groundbreaking, and an obvious discovery cannot be patented, which means fewer inventions go on to benefit patients. Government funding options do exist for academic scientists pursuing more groundbreaking, translational projects, but they are less coveted by scientists trying to obtain tenure and the long-term funding needed to cover salaries and other laboratory expenses. And since only a small percentage of projects gets funded, the number of scientists interested in pursuing academic science, or research in general, keeps declining over time.
Efforts to raise the number of individuals who pursue a scientific education are paying off. However, the number of job openings allowing those trainees to carry out independent scientific research once they graduate has proved harder to increase. The limitations are not just in the number of faculty openings, which are partly tied to grant funding, but also in the low salaries available to scientists after they obtain their doctoral degrees, which range from $53,000 to $65,000 depending on years of experience.
Thus, considering the difficulty of obtaining funding, the limited opportunities for scientists to become independent investigators capable of leading their own projects, and the salaries available to doctoral-level scientists, it is not surprising that the U.S. is progressively losing its innovation workforce, and filing fewer patents as a result.
Perhaps instead of encouraging scientists to propose less innovative projects to improve their odds of winning grants, the U.S. government should give serious consideration to funding investigators based on their potential for success, or on the success they have already achieved in advancing science. Such funding should be tiered by career stage or years of experience, considering that the median age at which investigators obtain their first R01 is 42. In other words, after finishing their training, scientists typically spend 10 years establishing themselves as independent academic investigators with the funds to train the next generation of scientists, the generation that will help the U.S. maintain or even expand its biotech market share for years to come. Patenting should also be given more weight in academic promotion decisions, or government investment in research funding should be increased to support more than just 20 percent of projects.
Remaining at the forefront of biotech innovation will not just generate more jobs; it will also allow the U.S. to attract the brightest scientists from all over the world. This talented workforce will go on to train future U.S. scientists and will improve our standard of living by producing the next generation of therapies intended to improve human health.
No single solution will fix this problem, but one thing is certain: unless there are more creative changes in funding approaches for academic scientists, we may eventually find ourselves asking, “Remember when the U.S. was at the forefront of biotech innovation?”
The coronavirus pandemic exposed significant weaknesses in the country's food supply chain. Grocery store meat counters were bare. Transportation interruptions disrupted supply. Finding beef, poultry, and pork at the store has been, in some places, as challenging as finding toilet paper.
It wasn't a lack of supply -- millions of animals were in the pipeline.
"There's certainly enough food out there, but it can't get anywhere because of the way our system is set up," said Amy Rowat, an associate professor of integrative biology and physiology at UCLA. "Having a more self-contained, self-sufficient way to produce meat could make the supply chain more robust."
Cultured meat could be one way of making the meat supply chain more resilient despite disruptions due to pandemics such as COVID-19. But is the country ready to embrace lab-grown food?
According to a Good Food Institute study, Gen Z is almost twice as likely to embrace meat alternatives for reasons related to social and environmental awareness, a pattern that predates the pandemic. That's because this group wants food choices that reflect its values around food justice, equity, and animal welfare.
So far, interest in protein alternatives has largely centered on plant-based foods. However, factors directly related to COVID-19 may accelerate consumer interest in the scaling up of cell-grown products, according to Liz Specht, the associate director of science and technology at the Good Food Institute, a nonprofit organization that supports scientists, investors, and entrepreneurs working to develop food alternatives to conventional animal products.
While lab-grown food isn't ready yet to definitively crisis-proof the food supply chain, experts say it offers promise.
Matching Supply and Demand
Companies developing cell-grown meat say it can take as little as two months to turn a cell sample into an edible product, according to Anthony Chow, CFA at Agronomics Limited, an investment company focused on meat alternatives. Tissue is taken from an animal and placed in a culture containing the nutrients and proteins the cells need to grow and multiply. Chow cites a Good Food Institute report claiming that a 2.5-millimeter sample can grow into three and a half tons of meat in 40 days, allowing for exponential growth when needed.
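The arithmetic behind that claim is worth a quick sanity check. Here is a minimal back-of-the-envelope sketch in Python; the starting mass is an assumption of ours (the report specifies only a 2.5-millimeter sample, which we treat as a small sphere of cells with roughly the density of water), so the result is illustrative rather than authoritative.

```python
import math

# Rough plausibility check of the claim above: 2.5 mm of tissue growing
# into 3.5 tons of meat in 40 days. The starting mass is OUR assumption
# (a 2.5 mm sphere of cells at ~1 g/cm^3); the report only gives the size.
radius_cm = 0.25 / 2                          # 2.5 mm diameter, in cm
start_g = (4 / 3) * math.pi * radius_cm ** 3  # ~0.008 g at 1 g/cm^3
end_g = 3.5e6                                 # 3.5 metric tons, in grams
days = 40

doublings = math.log2(end_g / start_g)        # doublings needed: ~28.7
hours_per_doubling = 24 * days / doublings    # ~33 hours
print(f"{doublings:.1f} doublings -> one doubling every "
      f"{hours_per_doubling:.0f} hours")
```

The implied doubling time, roughly 33 hours, is on the order of a day, broadly comparable to how quickly cultured mammalian cells can divide, which is what makes the claim at least arithmetically plausible.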
In traditional agriculture models, it takes at least three months to raise chickens, six to nine months for pigs, and 18 months for cattle. To keep enough maturing animals in the pipeline, farms must plan the number of animals to raise months -- even years -- in advance. Lab-grown meat advocates say that because cultured meat production is flexible, supply can theoretically be scaled up or down in significantly less time.
"Supply and demand has drastically changed in some way around the world and cultivated meat processing would be able to adapt much quicker than conventional farming," Chow said.
Scaling Up
Lab-grown meat may provide an eventual solution, but not in the immediate future, said Paul Mozdziak, a professor of physiology at North Carolina State University who researches animal cell culture techniques, transgenic animal production, and muscle biology.
"The challenge is in culture media," he said. "It's going to take some innovation to get the cells to grow at quantities that are going to be similar to what you can get from an animal. These are questions that everybody in the space is working on."
Chow says some of the most advanced cultured meat companies, such as BlueNalu, anticipate introducing products to the market midway through next year, though he thinks COVID-19 has slowed the process. Once introduced, the products will carry premium prices and will most likely be available at restaurants before they hit grocery store shelves.
"I think in five years' time it will be in a different place," he said. "I don't think that this will have relevance for this pandemic, but certainly beyond that."
"Plant-based meats may be perceived as 'alternatives' to meat, whereas lab-grown meat is producing the same meat, just in a much more efficient manner, without the environmental implications."
Of course, all the technological solutions in the world won't solve the problem unless people are open-minded about embracing them. At least for now, a lab-grown burger or bluefin tuna might still be too strange for many people, especially in the U.S.
For instance, a 2019 article published in "Frontiers in Sustainable Food Systems" reported results from a survey of 3,030 consumers: 29 percent of U.S. consumers, 59 percent of Chinese consumers, and 56 percent of Indian consumers said they were 'very' or 'extremely' likely to try cultivated meat.
"Lab-grown meat is genuine meat, at the cellular level, and therefore will match conventional meat with regard to its nutritional content and overall sensory experience. It could be argued that plant-based meat will never be able to achieve this," says Laura Turner, who works with Chow at Agronomics Limited. "Plant-based meats may be perceived as 'alternatives' to meat, whereas lab-grown meat is producing the same meat, just in a much more efficient manner, without the environmental implications."
A Solution Beyond This Pandemic
The coronavirus has done more than raise awareness of the fragility of food supply chains. It has also been a wake-up call for consumers and policy makers that it is time to radically rethink meat production, Specht says. Those factors have elevated the profile of lab-grown meat.
"I think the economy is getting a little bit more steam and if I was an investor, I would be getting excited about it," adds Mozdziak.
Beyond crises, Mozdziak explains that as affluence continues to increase globally, meat consumption increases exponentially. Yet farm animals can only grow so quickly and traditional farming won't be able to keep up.
"Even Tyson is saying that by 2050, there's not going to be enough capacity in the animal meat space to meet demand," he notes. "If we don't look at some innovative technologies, how are we going to overcome that?"
By mid-March, Alpha Lee was growing restless. A pioneer of AI-driven drug discovery, Lee leads a team of researchers at the University of Cambridge, but his lab had been closed amidst the government-initiated lockdowns spreading inexorably across Europe.
Having spoken to his collaborators across the globe – many of whom were seeing their own experiments and research projects postponed indefinitely due to the pandemic – he noticed a similar sense of frustration and helplessness in the face of COVID-19.
While there was talk of finding a novel treatment for the virus, Lee was well aware the process was likely to be long and laborious. Traditional methods of drug discovery risked suffering the same fate as the efforts to find a cure for SARS in the early 2000s, which took years and were ultimately abandoned long before a drug ever reached the market.
To avoid such an outcome, Lee was convinced that global collaboration was required. Together with a group of scientists in the UK, US and Israel, he launched the 'COVID Moonshot' – a project that encouraged chemists worldwide to share their ideas for potential drug designs. If the Moonshot proves successful, the team hopes it could serve as a future benchmark for finding new medicines for chronic diseases.
Solving a Complex Jigsaw
In February, ShanghaiTech University published the first detailed snapshots of the SARS-CoV-2 coronavirus's proteins, using a technique called X-ray crystallography. In particular, they revealed a high-resolution profile of the virus's main protease – the part of its machinery that enables it to replicate inside a host, and the main drug target. The images were tantalizing.
"We could see all the tiny pieces sitting in the structure like pieces of a jigsaw," said Lee. "All we needed was for someone to come up with the best idea of joining these pieces together with a drug. Then you'd be left with a strong molecule which sits in the protease, and stops it from working, killing the virus in the process."
Normally, ideas for how best to design such a drug would be kept as carefully guarded secrets within individual labs and companies because of their potential value. As a result, the steady process of trial and error needed to reach an optimal design can take years to come to fruition.
However, given the scale of the global emergency, Lee felt that the scientific community would be open to collective brainstorming on a mass scale. "Big Pharma usually wouldn't necessarily do this, but time is of the essence here," he said. "It was a case of, 'Let's just rethink every drug discovery stage to see -- ok, how can we go as fast as we can?'"
On March 13, he launched the COVID Moonshot, calling for chemists around the globe to come up with the most creative ideas they could, from their laptops at home. No design was too weird or wacky to be considered, and, crucially, nothing would be patented. The entire project would be run on a not-for-profit basis, meaning that any drug that makes it to market will have been created simply for the good of humanity.
It caught fire: Within just two weeks, more than 2,300 potential drug designs had been submitted. By the middle of July, over 10,000 had been received from scientists around the globe.
The Road Toward Clinical Trials
With so many designs to choose from, the team has been whittling them down to a shortlist of the most promising. Computational drug discovery experts at Diamond Light Source and at the Weizmann Institute of Science in Rehovot, Israel, have helped the Moonshot team develop algorithms that predict how quickly and easily each design could be synthesized, and how well each proposed drug might bind to the virus in real life.
The latter is an approach known as computational covalent docking and has previously been used in cancer research. "This was becoming more popular even before COVID-19, with several covalent drugs approved by the FDA in recent years," said Nir London, professor of organic chemistry at the Weizmann Institute, and one of the Moonshot team members. "However, all of these were for oncology. A covalent drug against SARS-CoV-2 will certainly highlight covalent drug-discovery as a viable option."
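To give a loose, concrete sense of what computational triage of crowdsourced designs can look like, here is a minimal sketch using the open-source RDKit toolkit. It ranks a few arbitrary example molecules by QED, a standard drug-likeness score; this is a deliberately crude stand-in for the Moonshot team's actual synthesis-difficulty and covalent-docking models, not a reconstruction of them.

```python
# Toy illustration of computational triage – NOT the Moonshot pipeline.
# It scores a few arbitrary molecules with RDKit's QED drug-likeness
# metric and ranks them, mimicking the pattern of automatically
# shortlisting thousands of crowdsourced designs.
from rdkit import Chem
from rdkit.Chem import QED

# Arbitrary example molecules (SMILES strings), standing in for submissions.
candidates = {
    "aspirin":   "CC(=O)Oc1ccccc1C(=O)O",
    "caffeine":  "Cn1cnc2c1c(=O)n(C)c(=O)n2C",
    "ibuprofen": "CC(C)Cc1ccc(cc1)C(C)C(=O)O",
}

scored = []
for name, smiles in candidates.items():
    mol = Chem.MolFromSmiles(smiles)
    if mol is None:      # skip unparsable submissions
        continue
    scored.append((QED.qed(mol), name))

# Most drug-like candidates first.
for score, name in sorted(scored, reverse=True):
    print(f"{score:.2f}  {name}")
```

In the real project, binding was predicted by docking each design against the protease structure itself; the sketch only shows the score-and-shortlist pattern that makes triaging thousands of submissions tractable.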
Using these methods, the team has selected 850 compounds to date, which have already been synthesized and tested in preclinical experiments. Fifty of these compounds – those that appear especially promising at killing the virus in a test tube – are now being optimized further.
Lee is hoping that at least one of these potential drugs will be shown to be effective in curing animals of COVID-19 within the next six months, a step that would allow the Moonshot team to reach out to potential pharmaceutical partners to test their compounds in humans.
Future Implications
If the project does succeed, some believe it could open the door to scientific crowdsourcing as a future means of generating novel medicine ideas for other diseases. Frank von Delft, professor of protein science and structural biology at the University of Oxford's Nuffield Department of Medicine, described it as a new form of 'citizen science.'
"There's a vast resource of expertise and imagination that is simply dying to be tapped into," he said.
Others are slightly more skeptical, pointing out that the uniqueness of the current crisis has meant that many scientists were willing to contribute ideas without expecting any future compensation in return. This meant that it was easy to circumvent the traditional hurdles that prevent large-scale global collaborations from happening – namely how to decide who will profit from the final product and who will hold the intellectual property (IP) rights.
"I think it is too early to judge if this is a viable model for future drug discovery," says London. "I am not sure that without the existential threat we would have seen so many contributions, and so many people and institutions willing to waive compensation and future royalties. Many scientists found themselves at home, frustrated that they don't have a way to contribute to the fight against COVID-19, and this project gave them an opportunity. Plus many can get behind the fact that this project has no associated IP and no one will get rich off of this effort. This breaks down a lot of the typical barriers and red-tape for wider collaboration."
"If a drug would sprout from one of these crowdsourced ideas, it would serve as a very powerful argument to consider this mode of drug discovery further in the future."
However, the Moonshot team believes that if it succeeds, it will at the very least send a strong signal to policy makers and the scientific community that greater efforts should be made to make such large-scale collaborations feasible.
"All across the scientific world, we've seen unprecedented adoption of open-science, collaboration and collegiality during this crisis, perhaps recognizing that only a coordinated global effort could address this global challenge," says London. "If a drug would sprout from one of these crowdsourced ideas, it would serve as a very powerful argument to consider this mode of drug discovery further in the future."
[An earlier version of this article was published on June 8th, 2020 as part of a standalone magazine called GOOD10: The Pandemic Issue. Produced as a partnership among LeapsMag, The Aspen Institute, and GOOD, the magazine is available for free online.]