Where Are the Lab-Grown Replacement Organs?
The headline blared from newspapers all the way back in 2006: "First Lab-Grown Organs Implanted in Humans!" A team from Wake Forest University had biopsied cells from the bladders of patients with spina bifida and used them to create brand new full-size bladders, which they then implanted. Although the bladders had to be emptied via catheter, they were still functioning a few years after implantation, and the public grew confident that doctors had climbed an intermediate step on the way to the medicine of science fiction. Ten years later, though, more than 20 people a day are still dying while waiting for an organ transplant, which leads to a simple question: Where are our fake organs?
"We can make small organs and tissues but we can't make larger ones."
Not coming anytime soon, unfortunately. The company that was created to transition Wake Forest's bladders to the market failed. And while there are a few simple bioengineered skins and cartilages already on the market, they are hardly identical to the real thing. Something like a liver could take another 20 to 25 years, says Shay Soker, professor at Wake Forest's Institute for Regenerative Medicine. "The first barrier is the technology: We can make small organs and tissues but we can't make larger ones," he says. "Also there are several cell types or functions that you can reliably make from stem cells, but not all of them, so the technology of stem cells has to catch up with what the body can do." Finally, he says, you have to support the new organ inside the body, providing it with a circulatory and nervous system and integrating it with the immune system.
While these are all challenging problems, circulation appears to be the most intractable. "Tissue's not able to survive if the cells don't have oxygen, and the bigger it gets, the more complex vasculature you need to keep that alive," says Chiara Ghezzi, research professor in the Tufts University Department of Biomedical Engineering. "Vasculature is highly organized in the body. It has a hierarchical structure, with different branches that have different roles depending on where they are." So far, she says, researchers have had trouble scaling up from capillaries to larger vessels that could be grafted onto blood vessels in a patient's body.
"The FDA is still getting its hands and minds around the field of tissue engineering."
Last, but hardly least, is the question of FDA approval. Lab-grown organs are neither drugs nor medical devices, and the agency is not set up to quickly or easily approve new technologies that don't fit into current categories. "The FDA is still getting its hands and minds around the field of tissue engineering," says Soker. "They were not used to that… so it requires the regulatory and financial federal agencies to really help and support these initiatives."
A pencil eraser-size model of the human brain is now being used for drug development and research.
If all of this sounds discouraging, it's worth mentioning some of the incredible progress the field has made since the first strides toward lab-grown organs began nearly 30 years ago: Though full-size replacement organs are still decades away, many labs have diverted their resources into what they consider an intermediate step, developing miniature organs and systems that can be used for drug development and research. This platform will yield more relevant results (Imagine! Testing cardiovascular drugs on an actual human heart!) and require the deaths of far fewer animals. And it's already here: Two years ago, scientists at Ohio State University developed a pencil eraser-size model of the human brain they intend to use for this exact purpose.
Perhaps the most exciting line of research these days is one that at first doesn't seem to have anything to do with bioengineered organs at all. Along with his colleagues, Chandan Sen, Director of the Center for Regenerative Medicine and Cell-based Therapies at Ohio State University, has developed a nanoscale chip that can turn any cell in the body into any other kind of cell—reverting fully differentiated adult cells into, essentially, stem cells, which can then grow into any tissue you want. Sen has used his chip to reprogram skin cells in the bodies of mice into neurons to help them recover from strokes, and into blood vessels to save severely injured legs. "There's this concept of a bioreactor, where you convince an organ to grow outside the body. They're getting more and more sophisticated over time. But to my mind it will never match the sophistication or complexity of the human body," Sen says. "I believe that in order to have an organ that behaves the way you want it to in the live body, you must use the body itself as a bioreactor, not a bunch of electronic gadgetry." There you have it: the next step in artificial organ manufacture is as crazy as it is intuitive. Grow it back where it was in the first place.
The Good, the Bad, and the Ugly in Personalized Medicine
Is the value of "personalized medicine" over-promised? Why is the quality of health care declining for many people despite the pace of innovation? Do patients and doctors have conflicting priorities? What is the best path forward?
"How do we generate evidence for value, which is what everyone is asking for?"
Some of the country's leading medical experts recently debated these questions at the prestigious annual Personalized Medicine Conference, held at Harvard Medical School in Boston, and LeapsMag was there to bring you the inside scoop.
Personalized Medicine: Is It Living Up to the Hype?
The buzzworthy phrase "personalized medicine" has been touted for years as the way of the future—customizing care to patients based on their predicted responses to treatments given their individual genetic profiles or other analyses. Since the initial sequencing of the human genome around fifteen years ago, the field of genomics has exploded as the costs have dramatically come down – from $2.7 billion to $1,000 or less today. Given cheap access to such crucial information, the medical field has been eager to embrace an ultramodern world in which preventing illnesses is the status quo, and treatments can be tailored for maximum effectiveness. But whether that world has finally arrived remains debatable.
"I've been portrayed as an advocate for genomics, because I'm excited about it," said Robert C. Green, Director of the Genomes2People Research Program at Harvard Medical School, the Broad Institute, and Brigham and Women's Hospital. He qualified his advocacy by saying that he tries to remain 'equipoised' or balanced in his opinions about the future of personalized medicine, and expressed skepticism about some aspects of its rapid commercialization.
"I have strong feelings about some of the [precision medicine] products that are rushing out to market in both the physician-mediated space and the consumer space," Green said, and challenged the value and sustainability of these products, such as their clinical utility and ability to help produce favorable health outcomes. He asked what most patients and providers want to know, which is, "What are the medical, behavioral, and economic outcomes? How do we generate evidence for value, which is what everyone is asking for?" He later questioned whether the use of 'sexy' and expensive diagnostic technologies is necessarily better than doing things the old-fashioned way. For instance, it is much easier and cheaper to ask a patient directly about their family history of disease, instead of spending thousands of dollars to obtain the same information with pricey diagnostic tests.
"Our mantra is to try to do data-driven health...to catch disease when it occurs early."
Michael Snyder, Professor & Chair of the Department of Genetics and Director of the Center for Genomics and Personalized Medicine at Stanford University, called himself more of an 'enthusiast' about precision medicine products like wearable devices that can digitally track vital signs, including heart rate and blood oxygen levels. "I'm certainly not equipoised," he said, adding, "Our mantra is to try to do data-driven health. We are using this to try to understand health and catch disease when it occurs early."
Snyder then shared his personal account about how his own wearable device alerted him to seek treatment while he was traveling in Norway. "My blood oxygen was low and my heart rate was high, so that told me something was up," he shared. After seeing a doctor, he discovered he was suffering from Lyme disease. He then shared other similar success stories about some of the patients in his department. Using wearable health sensors, he said, could significantly reduce health care costs: "$245 billion is spent every year on diabetes, and if we reduce that by ten percent we just saved $24 billion."
From left, Robert Green, Michael Snyder, Sandro Galea, and Thomas Miller.
(Courtesy Rachele Hendricks-Sturrup)
A Core Reality: Unresolved Societal Issues
Sandro Galea, Dean and Professor at Boston University's School of Public Health, described himself as a 'skeptic' but also an 'enormous fan' of new technologies. He said, "I want to make sure that you all [the audience] have the best possible treatment for me when I get sick," but added, "In our rush and enthusiasm to embrace personalized and precision medicine approaches, we have done that at the peril of forgetting a lot of core realities."
"There's no one to pay for health care but all of us."
Galea stressed the need to first address certain difficult societal issues because failing to do so will deter precision medicine cures in the future. "Unless we pay attention to domestic violence, housing, racism, poor access to care, and poverty… we are all going to lose," he said. He then cited recent statistics about the country's growing gap in both health and wealth, which could potentially erode patient and provider interest in personalized medicine.
Thomas Miller, the founder and partner of a venture capital firm dedicated to advancing precision medicine, agreed with Galea and said that "there's no one to pay for health care but all of us." He recalled witnessing 'abuse' of diagnostic technologies that he had previously invested in. "They were often used as mechanisms to provide unnecessary care rather than appropriate care," he said. "The trend over my 30-year professional career has been that of sensitivity over specificity."
In other words: doctors rely too heavily on diagnostic tools that are sensitive enough to detect signs of a disease, but not accurate enough to confirm the presence of a specific disease. "You will always find that you're sick from something," Miller said. He lamented the counter-productivity and waste brought on by such 'abuse' and added, "That's money that could be used to address some of the problems that you [Galea] just talked about."
Do Patients and Providers Have Conflicting Priorities?
Distrust in the modern health care system is not new in the United States. The fact that medical errors were the third leading cause of death in 2016 may have fueled this mistrust even more. And the level of mistrust appears correlated with race; a recent survey of 118 adults between 18 and 75 years old showed that black respondents were less likely to trust their doctors than the non-Hispanic white respondents. The black respondents were also more concerned about personal privacy and potentially harmful hospital experimentation.
"The vast majority of physicians in this country are incentivized to keep you sick."
As if this context weren't troubling enough, some of the panelists suggested that health care providers and patients have misaligned goals, which may be financially driven.
For instance, Galea stated that health care is currently 'curative' even though money would be better spent on prevention than on cures. "The vast majority of physicians in this country are incentivized to keep you sick," he declared. "They are paid by sick patient visits. Hospital CEOs are paid by the number of sick people they have in their beds." He highlighted this issue as a national priority and mentioned case studies showing that hospital CEOs' behavior quickly changes when payment is based on the number of patients kept out of beds rather than the number in them. Green lauded Galea's comment as "good sense."
Green also cautioned the audience about potential financial conflicts of interest held by proponents of precision medicine technologies. "Many of the people who are promoting genomics and personalized medicine are people who have financial interests in that arena," he warned. He emphasized that those who are perhaps curbing the over-enthusiasm do not have financial interests at stake.
What is the Best Path Forward for Personalized Medicine?
As useful as personalized medicine may be for selecting the best course of treatment, there is also the flip side: It can allow doctors to predict who will not respond well—and this painful reality must be acknowledged.
Miller argued, "We have a duty to call out therapies that won't work, that will not heal, that need to be avoided, and that will ultimately lead to you saying to a patient, 'There is nothing for you that will work.'"
Although that may sound harsh, it captures the essence of this emerging paradigm, which is to maximize health by using tailored methods that are based on comparative effectiveness, evidence of outcomes, and patient preferences. After all, as Miller pointed out, it wouldn't do much good to prescribe someone a regimen with little reason to think it might help.
For the hype around personalized medicine to be fully realized, Green concluded, "We have to prove to people that [the value of it] is true."
The rise of remote work is a win-win for people with disabilities and employers
Disability advocates see remote work as a silver lining of the pandemic, a win-win for adults with disabilities and the business world alike.
Any corporate leader would jump at the opportunity to increase their talent pool of potential employees by 15 percent, with all these new hires belonging to an underrepresented minority. That’s especially true given tight labor markets and CEO desires to increase headcount. Yet, too few leaders realize that people with disabilities are the largest minority group in this country, numbering 50 million.
Some executives may dread the extra investments in accommodating people’s disabilities. Yet, providing full-time remote work could suffice, according to a new study by the Economic Innovation Group think tank. The authors found that the employment rate for people with disabilities did not simply reach the pre-pandemic level by mid-2022, but far surpassed it, to the highest rate in over a decade. “Remote work and a strong labor market are helping [individuals with disabilities] find work,” said Adam Ozimek, who led the research and is chief economist at the Economic Innovation Group.
Disability advocates see this development as a silver lining of the pandemic, a win-win for adults with disabilities and the business world alike. For decades before the pandemic, employers had refused requests from workers with disabilities to work remotely, according to Thomas Foley, executive director of the National Disability Institute. During the pandemic, "we all realized that...many of us could work remotely,” Foley says. “[T]hat was disproportionately positive for people with disabilities."
Charles-Edouard Catherine, director of corporate and government relations for the National Organization on Disability, said that advocates had pushed for remote-work options to accommodate disabilities for many years. “It’s a little frustrating that for decades corporate America was saying it’s too complicated, we’ll lose productivity, and now suddenly it’s like, sure, let’s do it.”
The pandemic opened doors for people with disabilities
Early in the pandemic, employment rates dropped for everyone, including people with disabilities, according to Ozimek’s research. However, these rates recovered quickly. In the second quarter of 2022, people with disabilities aged 25 to 54, the prime working age, were 3.5 percent more likely to be employed than before the pandemic.
What about people without disabilities? They were still 1.1 percent less likely to be employed.
These numbers suggest that remote work has enabled a substantial number of people with disabilities to find and retain employment.
“We have a last-in, first-out labor market, and [people with disabilities] are often among the last in and the first out,” Ozimek says. However, this dynamic has changed, with adults with disabilities seeing employment rates recover much faster. Now, the question is whether the new trend will endure, Ozimek adds. “And my conclusion is that not only is it a permanent thing, but it’s going to improve.”
Gene Boes, president and chief executive of the Northwest Center, a Seattle organization that helps people with disabilities become more independent, confirms this finding. “The new world we live in has opened the door a little bit more…because there’s just more demand for labor.”
Long COVID disabilities put a premium on remote work
Remote work can help mitigate the impact of long COVID. The U.S. Centers for Disease Control and Prevention reports that about 19 percent of those who had COVID developed long COVID. Recent Census Bureau data indicates that 16 million working-age Americans suffer from it, with economic costs estimated at $3.7 trillion.
Certainly, many of these so-called long-haulers experience relatively mild symptoms - such as loss of smell - which, while troublesome, are not disabling. But other symptoms are serious enough to be disabilities.
According to a recent study from the Federal Reserve Bank of Minneapolis, about a quarter of those with long COVID changed their employment status or working hours. That means long COVID was serious enough to interfere with work for 4 million people. For many, the issue was serious enough to qualify them as disabled.
Indeed, the Federal Reserve Bank of New York found in a just-released study that the number of individuals with disabilities in the U.S. grew by 1.7 million. That growth stemmed mainly from long COVID conditions such as fatigue and brain fog, meaning difficulties with concentration or memory, with 1.3 million people reporting an increase in brain fog since mid-2020.
Many had to drop out of the labor force due to long COVID. Yet, about 900,000 people who are newly disabled have managed to continue working. Without remote work, they might have lost these jobs.
For example, a software engineer at one of my client companies has struggled with brain fog related to long COVID. With remote work, this employee can work during the hours when she feels most mentally alert and focused, even if that means short bursts of productivity throughout the day. With flexible scheduling, she can take rests, meditate, or engage in activities that help her regain focus and energy. Without the need to commute to the office, she can save energy and time and reduce stress, which is crucial when dealing with brain fog.
In fact, the author of the Federal Reserve Bank of New York study notes that long COVID can be considered a disability under the Americans with Disabilities Act, depending on the specifics of the condition. That means the law can require private employers with fifteen or more staff, as well as government agencies, to make reasonable accommodations for those with long COVID. Richard Deitz, the author of this study, writes in the paper that “telework and flexible scheduling are two accommodations that can be particularly beneficial for workers dealing with fatigue and brain fog.”
The current drive to return to the office, led by many C-suite executives, may need to be reconsidered in light of legal and HR considerations. Arlene S. Kanter, director of the disability law and policy program at the Syracuse University College of Law, said that the question should depend on whether people with disabilities can perform their work well at home, as they did during COVID outbreaks. “[T]hen people with disabilities, as a matter of accommodation, shouldn’t be denied that right,” Kanter said.
Diversity benefits
But companies shouldn’t need legal pressure to act. It simply makes dollars and sense to expand their talent pool by 15 percent with members of an underrepresented minority. After all, extensive research shows that improving diversity boosts both decision-making and financial performance.
Companies that are offering more flexible work options have already gained significant benefits in terms of diverse hires. In its efforts to adapt to the post-pandemic environment, Meta, the owner of Facebook and Instagram, decided to offer permanent fully remote work options to its entire workforce. And according to Meta chief diversity officer Maxine Williams, the candidates who accepted job offers for remote positions were “substantially more likely” to come from diverse communities: people with disabilities, Black, Hispanic, Alaskan Native, Native American, veterans, and women. The numbers bear out these claims: people with disabilities increased from 4.7 to 6.2 percent of Meta’s employees.
Having consulted for 21 companies to help them transition to hybrid work arrangements, I can confirm that Meta’s numbers aren’t a fluke. The more my clients proved willing to offer remote work, the more staff with disabilities they recruited - and retained. That includes employees with mobility challenges. But it also includes employees with less visible disabilities, such as people with long COVID and immunocompromised people who feel reluctant to put themselves at risk of getting COVID by coming into the office.
Unfortunately, many leaders fail to see the benefits of remote work for underrepresented groups, such as those with disabilities. Some even say the opposite is true, with JP Morgan CEO Jamie Dimon claiming that returning to the office will aid diversity.
What explains this poor executive decision making? Part of the answer comes from a mental blind spot called the in-group bias. Our minds tend to favor and pay attention to the concerns of those in the group of people who seem to look and think like us. Dimon and other executives without disabilities don’t perceive people with disabilities to be part of their in-group. They are thus blind to the concerns of those with disabilities, which leads to misperceptions such as Dimon’s that returning to the office will aid diversity.
In-group bias is one of many dangerous judgment errors known as cognitive biases. They impact decision making in all life areas, ranging from the future of work to relationships.
Another relevant cognitive bias is the empathy gap. This term refers to our difficulty empathizing with those outside of our in-group. The lack of empathy combines with the blindness from the in-group bias, causing executives to ignore the feelings of employees with disabilities and prospective hires.
Omission bias also plays a role. This dangerous judgment error causes us to perceive failure to act as less problematic than acting. Consequently, executives perceive a failure to support the needs of those with disabilities as a minor matter.
Conclusion
The failure to empower people with disabilities through remote work options will prove costly to companies’ bottom lines. Not only are they limiting their talent pool by 15 percent, they’re harming their ability to recruit and retain diverse candidates. And as their lawyers and HR departments will tell them, by violating the ADA, they are putting themselves in legal jeopardy.
By contrast, companies like Meta - and my clients - that offer remote work opportunities are seizing a competitive advantage by recruiting these underrepresented candidates. They’re lowering costs of labor while increasing diversity. The future belongs to the savvy companies that offer the flexibility that people with disabilities need.