The U.S. must fund more biotech innovation – or other countries will catch up faster than you think
The U.S. has approximately 58 percent of the market share in the biotech sector, followed by China with 11 percent. However, this market share is the result of years of prior research and development (R&D) – it is a present-day picture of past investment. In the future, this market share will decline unless the federal government makes investments to improve the quality and quantity of U.S. research in biotech.
The effectiveness of current R&D can be evaluated in a variety of ways, such as monies invested and the number of patents filed. According to the UNESCO Institute for Statistics, the U.S. spends approximately 2.7 percent of GDP on R&D (about $476 billion), whereas China spends 2 percent (about $346 billion). However, investment levels do not necessarily translate into products that end up contributing to innovation.
Patents are a better indicator of innovation. The biotech industry relies on patents to protect its investments, making patenting a key tool in translating scientific discoveries into treatments that can ultimately benefit patients. In 2020, China filed 1,497,159 patent applications, up 6.9 percent from the previous year. In contrast, the U.S. filed 597,172, a 3.9 percent decline. When it comes to patents filed, China holds approximately 45 percent of the world share, compared with 18 percent for the U.S.
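As a quick illustration of how these two metrics compare, the short Python sketch below uses only the figures quoted above; the "filings per R&D dollar" ratio and the implied world total are back-of-the-envelope derivations for illustration, not official statistics.

```python
# Back-of-the-envelope check using only the figures quoted in the article.
# The per-dollar ratio and the implied world total are illustrative derivations.

rd_spending_musd = {      # R&D spending, in millions of U.S. dollars (UNESCO figures)
    "U.S.": 476_459.0,
    "China": 346_266.3,
}
patent_filings_2020 = {   # patent applications filed in 2020
    "U.S.": 597_172,
    "China": 1_497_159,
}

for country in rd_spending_musd:
    # Rough "output per input": filings per million dollars of R&D spending.
    ratio = patent_filings_2020[country] / rd_spending_musd[country]
    print(f"{country}: {ratio:.2f} filings per $1M of R&D spending")

# Implied global total, taking China's reported ~45 percent share at face value.
world_total = patent_filings_2020["China"] / 0.45
us_share = patent_filings_2020["U.S."] / world_total
print(f"Implied world total: ~{world_total:,.0f} filings")
print(f"Implied U.S. share: ~{us_share:.0%}")  # lands at roughly 18 percent, matching the figure above
```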
So how did we get here? The nature of academic science allows scientists to specialize, dedicating several years to advancing discovery research and developing new inventions that can then be licensed by biotech companies. This makes academic science critical to innovation in the U.S. and abroad.
Academic scientists rely on government and foundation grants to pay for R&D, which includes salaries for faculty, investigators and trainees, as well as monies for infrastructure, support personnel and research supplies. To cover these costs, academic scientists look in particular to government support such as Research Project Grants, also known as R01 grants, the oldest grant mechanism at the National Institutes of Health. Unfortunately, this funding mechanism is extremely competitive: applications have a success rate of only about 20 percent. To maximize their chances of getting funded, investigators tend to limit how innovative their applications are, since a project that seems overambitious is discouraged by grant reviewers.
This approach affects the future success of the U.S. R&D enterprise. Pursuing less innovative work tends to produce scientific results that are more obvious than groundbreaking, and when a discovery is obvious, it cannot be patented, resulting in fewer inventions that go on to benefit patients. Even though there are government funding options for academic scientists focused on more groundbreaking and translational projects, those options are less coveted by academic scientists trying to obtain tenure and the long-term funding needed to cover salaries and other laboratory expenses. And because only a small percentage of projects gets funded, the number of scientists interested in pursuing academic science – or research in general – keeps declining over time.
Efforts to raise the number of individuals who pursue a scientific education are paying off. However, the number of job openings for those trainees to carry out independent scientific research once they graduate has proved harder to increase. These limitations lie not just in the number of faculty openings in academic science, which are partly tied to grant funding, but also in the low salaries available to scientists after they obtain their doctoral degree, which range from $53,000 to $65,000 depending on years of experience.
Thus, considering the difficulty of obtaining funding, the limited number of opportunities for scientists to become independent investigators capable of leading their own scientific projects, and the salaries available to scientists with a doctoral degree, it is not surprising that the U.S. is progressively losing its innovation workforce, which results in fewer patents filed.
Perhaps instead of encouraging scientists to propose less innovative projects in order to increase their chances of getting grants, the U.S. government should give serious consideration to funding investigators based on their potential for success – or the success they have already achieved in contributing to the advancement of science. Such a funding approach should be tiered by career stage or years of experience, considering that the median age at which scientists obtain their first R01 is 42. This suggests that after finishing their training, scientists spend 10 years establishing themselves as independent academic investigators with the funds needed to train the next generation of scientists who will help the U.S. maintain or even expand its biotech market share for years to come. Patenting should also be given more weight in academic promotion decisions, or governmental investment in research funding should be increased to support more than just 20 percent of projects.
Remaining at the forefront of biotech innovation will not just generate more jobs; it will also allow us to attract the brightest scientists from all over the world. This talented workforce will go on to train future U.S. scientists and will improve our standard of living by producing the next generation of therapies to improve human health.
No single solution will fix this problem, but what is certain is that unless funding approaches for academic scientists change in more creative ways, we may eventually find ourselves asking, “Remember when the U.S. was at the forefront of biotech innovation?”
23andMe Is Using Customers’ Genetic Data to Develop Drugs. Is This Brilliant or Dubious?
Leading direct-to-consumer (DTC) genetic testing companies are continuously unveiling novel ways to leverage their vast stores of genetic data.
"23andMe will tell you what diseases you have and then sell you the drugs to treat them."
As reported last week, 23andMe's latest concept is to develop and license new drugs using the data of consumers who have opted in to let their information be used for research. To date, over 10 million people have used the service and around 80 percent have opted in, making its database one of the largest in the world.
Culture researcher Dr. Julia Creet is one of the foremost experts on the DTC genetic testing industry, and in her forthcoming book, The Genealogical Sublime, she bluntly examines whether such companies' motives and interests are in sync with those of consumers.
Leapsmag caught up with Creet about the latest news and the wider industry's implications for health and privacy.
23andMe has just announced that it plans to license a newly developed anti-inflammatory drug, the first one created using its customers' genetic data, to Almirall, a pharma company in Spain. What's your take?
I think this development is the next step in the evolution of the company and its "double-sided" marketing model. In the past, as it enticed customers to give it their DNA, it sold the results and the medical information divulged by customers to other drug companies. Now it is positioning itself to reap the profits of a new model by developing treatments itself.
Given that there are many anti-inflammatory drugs on the market already, whatever Almirall produces might not have much of an impact. We might see this canny move as a "proof of concept," showing that 23andMe has learned how to "leverage" its genetic data without having to sell them to a third party. In a way, the privacy provisions will be much less complicated, and the company stands to attract investment as it turns itself into [a pseudo pharmaceutical company], a "pharma-pseudocal" company.
Emily Drabant Conley, the president of business development, has said that 23andMe is pursuing other drug compounds and may conduct its own clinical trials rather than licensing them out to its existing research partners. The end goal, it seems, is to turn the path from direct-to-consumer DNA testing to drug production – and sales back to that same consumer base – into a seamless and lucrative circle. You have to admit it's a brilliant business model. 23andMe will tell you what diseases you have and then sell you the drugs to treat them.
In your new book, you describe how DTC genetic testing companies have capitalized on our innate human desire to connect with our ancestors and each other. I quote you: "This industry has taken that potent, spiritual, all-too-human need to belong... and monetized it in a particularly exploitative way." But others argue that DTC genetic testing companies are merely providing a service in exchange for fair-market compensation. So where does exploitation come into the picture?
Yes, the industry provides a service for a fee, but that's only part of the story. The rest of the story reveals a pernicious industry that hides its business model behind the larger science project of health and heredity. All of the major testing companies play on the idea of "lack," that we can't know who we are unless we buy information about ourselves. When you really think about it, "Who do you think you are?" is a pernicious question that suggests we don't or can't know who we are, or to whom we are related, without advanced data searches and testing. This existential question used to be a philosophical one; now the answers are provided by databases that acquire more valuable information than they provide in the exchange.
"It's a brilliant business model that exploits consumer naiveté."
As you've said before, consumers are actually paying to be the product because the companies are likely to profit more from selling their genetic data. Could you elaborate?
The largest databases, AncestryDNA and 23andMe, have signed lucrative agreements with biotech companies that pay them for the de-identified data of their customers. What's so valuable is the DNA combined with the family relationships. Consumers provide the family relationships and the companies link and extrapolate the results to larger and larger family trees. Combined with the genetic markers for certain diseases, or increased susceptibility to certain diseases, these databases are very valuable for biotech research.
None of that value will ever be returned to consumers except in the form of for-profit drugs. Ancestry, in particular, has removed all information about its "research partners" from its website, making it very difficult to see how it is profiting from its third-party sales. 23andMe is more open about its "two-sided business model," but encourages consumers to donate their information to science. It's a brilliant business model that exploits consumer naiveté.
A WIRED journalist wrote that "23andMe has been sharing insights gleaned from consented customer data with GSK and at least six other pharmaceutical and biotechnology firms for the past three and a half years." Is this a consumer privacy risk?
I don't see that 23andMe did anything to which consumers didn't consent, albeit through arguably unreadable terms and conditions. The part that worries me more is the 300 phenotype data points that the company has collected on its consumers through longitudinal surveys designed, as Anne Wojcicki, CEO and Co-founder of 23andMe, put it, "to circumvent medical records and just self-report."
Everyone is focused on the DNA, but it's the combination of genetic samples, genealogical information and health records that is the most potent dataset, and 23andMe has figured out a way to extract all three from consumers.
Edible Silverware Is the Next Big Thing in Sustainable Eating
Sure, you may bring a reusable straw when you go out to eat. But what about digesting your silverware at the restaurant? The future is already here.
Air New Zealand just added the new edible coffee cup Twiice to its in-flight service. Made from wheat flour, sugar, egg and vanilla essence, the Twiice cups will be standard issue for the international airline.
On the ground, the new, award-winning startup IncrEDIBLESpoon has shipped more than a quarter million edible scoopers. The spoons are all-natural, vegan, and made from wheat, oat, corn, chickpea and barley.
The technological breakthrough is in creating a tasty, mass-market material durable enough for delivery in an assembly-line environment like airplane service, as well as stable enough to hold a hot cup of coffee or a freezing scoop of ice cream. Twiice cups can last several hours after hot coffee is added, while IncrEDIBLESpoon cutlery holds up for about 45 minutes.
"We already caught the interest of a couple major ice cream chains," says Dinesh Tadepalli, co-founder of the IncrEDIBLESpoon parent company Planeteer. "If all goes well, one of them will test out our spoons at their scoop shop early this year."
Next Up
Edible cutlery feels like a natural progression post-reusable straw. And more is already on the menu.
The coffee cup company Twiice is already planning on expanding. Co-founder Jamie Cashmore says other serving items are coming later this year.
IncrEDIBLESpoon is also getting into more utensils. "We plan to mass produce the complete set by year's end: Edible straws, edible forks and edible coffee stirrers," Tadepalli says.
Most notably, Twiice's partner Air New Zealand sees the coffee cup as just the start of other sustainable solutions. The airline estimates it currently serves eight million cups of coffee annually. It's even suggesting customers bring their own reusable cup to the plane – though that isn't as ergonomic or as attractive as eating everything you are served.
Open Questions
Making everything edible has a few challenges. First is cultural acceptance: despite early successes, changing eating habits will require winning over more than eco-focused and curious eaters.
Second, it's unclear whether the short-term economics will add up in favor of airlines and other companies. As with alternative fuels, organizations are more likely to adopt new science when it doesn't require retrofitting or an expensive change to their current business model – even if it creates long-term benefits.
The changes will likely be lopsided, influencing cultures at different times. Airplanes are a great start, as passengers are a captive audience interested in removing waste as soon as possible.
"Imagine eating a black pepper spoon after your soup or a chocolate spoon after your ice cream?"
We can expect edible cutlery to catch on more easily with certain cultures or foods. For instance, injera, the spongy Ethiopian bread, has served as an African plate of sorts for years. It makes sense that IncrEDIBLESpoon's four flavors – Salt, Masala, Spinach and Root – all fit another bread-as-plate-friendly culture: Indian.
Coffee and desserts sound like a good bet for now, though, especially for foodies. "People are curious to try edible spoons as they never heard or experienced them before," Tadepalli says. "Imagine eating a black pepper spoon after your soup or a chocolate spoon after your ice cream?"