The U.S. must fund more biotech innovation – or other countries will catch up faster than you think
The U.S. holds approximately 58 percent of market share in the biotech sector, followed by China with 11 percent. However, this market share is the product of years of previous research and development (R&D) – a snapshot of past work rather than a guarantee of future performance. That share will decline unless the federal government invests to improve the quality and quantity of U.S. biotech research.
The effectiveness of current R&D can be evaluated in several ways, such as the money invested and the number of patents filed. According to the UNESCO Institute for Statistics, the U.S. spends approximately 2.7 percent of GDP on R&D (about $476.5 billion), whereas China spends 2 percent (about $346.3 billion). However, investment levels do not necessarily translate into the kinds of products that drive innovation.
Patents are a better indicator of innovation. The biotech industry relies on patents to protect its investments, making patenting a key step in translating scientific discoveries into products that can ultimately benefit patients. In 2020, China filed 1,497,159 patent applications, a 6.9 percent increase over the prior year; the U.S. filed 597,172, a 3.9 percent decline. China now holds approximately 45 percent of the world share of patent filings, compared with 18 percent for the U.S.
So how did we get here? Academic science allows researchers to specialize, dedicating years to discovery research and to developing inventions that biotech companies can then license. This makes academic science critical to innovation in the U.S. and abroad.
Academic scientists rely on government and foundation grants to pay for R&D, including salaries for faculty, investigators and trainees, as well as money for infrastructure, support personnel and research supplies. The government support most coveted for covering these costs is the Research Project Grant, also known as the R01, the oldest grant mechanism at the National Institutes of Health. Unfortunately, this funding mechanism is extremely competitive: applications have a success rate of only about 20 percent. To maximize their chances of getting funded, investigators tend to limit the innovation in their applications, since grant reviewers are put off by projects that seem overambitious.
This approach affects the future success of the U.S. R&D enterprise. Less innovative work tends to produce results that are more obvious than groundbreaking, and an obvious discovery cannot be patented, which means fewer inventions go on to benefit patients. Government funding options do exist for academic scientists pursuing more groundbreaking, translational projects, but they are less coveted by researchers trying to secure tenure and the long-term funding that covers salaries and other laboratory expenses. And because only a small percentage of projects gets funded, fewer and fewer scientists choose to pursue academic science, or research at all.
Efforts to raise the number of people who pursue a scientific education are paying off. However, the number of positions in which those trainees can carry out independent research once they graduate has proved harder to increase. The limitations lie not just in the number of faculty openings, which depend in part on grant funding, but also in the low salaries available to scientists after they obtain their doctoral degrees, which range from $53,000 to $65,000 depending on years of experience.
Thus, considering the difficulty of obtaining funding, the limited opportunities for scientists to become independent investigators leading their own projects, and the salaries available to those with doctoral degrees, it is not surprising that the U.S. is progressively losing its innovation workforce – and, as a result, filing fewer patents.
Perhaps instead of encouraging scientists to propose less innovative projects in order to improve their odds of getting grants, the U.S. government should give serious consideration to funding investigators based on their potential for success, or on the success they have already achieved in advancing science. Such funding should be tiered by career stage or years of experience, given that the median age at which investigators receive their first R01 is 42. In other words, after finishing their training, scientists spend roughly a decade establishing themselves as independent academic investigators with enough funding to train the next generation of scientists – the ones who will help the U.S. maintain, or even expand, its share of the biotech industry for years to come. Patenting should also be given more weight in academic promotion decisions, or government investment in research funding should be increased to support more than just 20 percent of projects.
Remaining at the forefront of biotech innovation will not only generate more jobs; it will also allow us to attract the brightest scientists from all over the world. This talented workforce will go on to train future U.S. scientists and to improve our standard of living by producing the next generation of therapies intended to improve human health.
No single solution will fix this problem, but one thing is certain: unless funding approaches for academic scientists become more creative, we may eventually find ourselves asking, “Remember when the U.S. was at the forefront of biotech innovation?”
With a deadly pandemic sweeping the planet, many are questioning the comfort and security we have taken for granted in the modern world.
More than a century after germ theory was established, we are still at the mercy of a microbe we can neither treat, nor control, nor immunize against. Even more discouraging, technology has in some ways exacerbated the problem: cars and air travel let a new disease spread quickly around the globe.
Some say we have grown complacent, that we falsely assume the triumphs of the past ensure a happy and prosperous future, that we are oblivious to the possibility of unpredictable "black swan" events that could cause our destruction. Some have begun to lose confidence in progress itself, and despair of the future.
But the new coronavirus should not defeat our spirit—if anything, it should spur us to redouble our efforts, both in the science and technology of medicine, and more broadly in the advance of industry. Because the best way to protect ourselves against future disasters is more progress, faster.
Science and technology have overall made us much better able to deal with disease. In the developed world, we have already tamed most categories of infectious disease. Most bacterial infections, such as tuberculosis or bacterial pneumonia, are cured with antibiotics. Waterborne diseases such as cholera are eliminated through sanitation; insect-borne ones such as malaria through pest control. Those that are not contagious until symptoms appear, such as SARS, can be handled through case isolation and contact tracing. For the rest, such as smallpox, polio, and measles, we develop vaccines, given enough time. COVID-19 could start a pandemic only because it fits a narrow category: a new, viral disease that is highly contagious via pre-symptomatic droplet/aerosol transmission, and that has a high mortality rate compared to seasonal influenza.
A century ago, when an influenza pandemic struck, we barely knew what viruses were; no one had ever seen one. Today we know what COVID-19 is down to its exact genome; in fact, we have sequenced thousands of COVID-19 genomes, and can track its history and its spread through their mutations. We can create vaccines faster today, too: where we once developed them in live animals, we now use cell cultures; where we once had to weaken or inactivate the virus itself, we can now produce vaccines based on the virus's proteins. And even though we don't yet have a treatment, the last century-plus of pharmaceutical research has given us a vast catalog of candidate drugs, already proven safe. Even now, over 50 candidate vaccines and almost 100 candidate treatments are in the research pipeline.
It's not just our knowledge that has advanced, but our methods. When smallpox raged in the 1700s, even the idea of calculating a case-fatality rate was an innovation. When the polio vaccine was trialed in the 1950s, the use of placebo-controlled trials was still controversial. The crucial measure of contagiousness, "R0", was not developed in epidemiology until the 1980s. And today, all of these methods are made orders of magnitude faster and more powerful by statistical and data visualization software.
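To make that last point concrete, here is a minimal sketch, in Python, of the kind of calculation such software now makes trivial: a naive case-fatality rate and a back-of-the-envelope projection of early spread under an assumed R0. Every number in it is a hypothetical placeholder, not real surveillance data.

```python
# Minimal sketch of two measures mentioned above, using hypothetical
# placeholder numbers (not real surveillance data).

# Naive case-fatality rate: cumulative deaths divided by cumulative confirmed cases.
confirmed_cases = 10_000   # hypothetical cumulative confirmed cases
deaths = 150               # hypothetical cumulative deaths
cfr = deaths / confirmed_cases
print(f"Naive case-fatality rate: {cfr:.1%}")

# Back-of-the-envelope early-epidemic projection: each generation of infections
# is roughly R0 times the previous one (ignoring immunity, interventions,
# and reporting delays).
r0 = 2.5                   # assumed basic reproduction number
serial_interval_days = 5   # assumed days between generations of infection
initial_cases = 100        # hypothetical starting case count

for generation in range(6):
    day = generation * serial_interval_days
    new_cases = initial_cases * r0 ** generation
    print(f"Day {day:2d}: ~{new_cases:,.0f} new cases in this generation")
```

The toy numbers are beside the point; what matters is that arithmetic which once had to be done by hand from slowly collected case reports can now be run, refined, and visualized over millions of records in seconds.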
If you're seeking to avoid COVID-19, the hand sanitizer gel you carry in a pocket or purse did not exist until the 1960s. If you start to show symptoms, the pulse oximeter that tests your blood oxygenation was not developed until the 1970s. If your case worsens, the mechanical ventilator that keeps you alive was invented in the 1950s—in fact, no form of artificial respiration was widely available until the "iron lung" used to treat polio patients in the 1930s. Even the modern emergency medical system did not exist until recently: if during the 1918 flu pandemic you became seriously ill, there was no 911 hotline to call, and any ambulance that showed up would likely have been a modified van or hearse, with no equipment or trained staff.
As many of us "shelter in place", we are far more able to communicate and collaborate, to maintain some semblance of normal life, than we ever would have been. To compare again to 1918: long-distance telephone service barely existed at that time, and only about a third of homes in the US even had electricity; now we can videoconference over Zoom and Skype. And the enormous selection and availability provided by online retail and food delivery have kept us stocked and fed, even when we don't want to venture out to the store.
"Black swan" calamities can strike without warning at any time. Indeed, humanity has always been subject to them—drought and frost, fire and flood, war and plague. But we are better equipped now to deal with them than ever before. And the more progress we make, the better prepared we'll be for the next one. The accumulation of knowledge, technology, industrial infrastructure, and surplus wealth is the best buffer against any shock—whether a viral pandemic, a nuclear war, or an asteroid impact. In fact, the more worried we are about future crises, the more energetically we should accelerate science, technology and industry.
In this sense, we have grown complacent. We take the modern world for granted, so much so that some question whether further progress is even still needed. The new virus proves how much we do need it, and how far we still have to go. Imagine how different things would be if we had broad-spectrum antiviral drugs, or a way to enhance the immune system to react faster to infection, or a way to detect infection even before symptoms appear. These technologies may seem to belong to a Star Trek future—but so, at one time, did cell phones.
The virus reminds us that nature is indifferent to us, leaving us to fend entirely for ourselves. As we go to war against it, let us not take the need for such a war as reason for despair. Instead, let it push us to redouble our efforts to make scientific, technological, and industrial progress on all fronts. No matter the odds, applied intelligence is our best weapon against disaster.