Hours after a baby is born, its heel is pricked with a lancet. Drops of the infant's blood are collected on a porous card, which is then mailed to a state laboratory. The dried blood spots are screened for around thirty conditions, including phenylketonuria (PKU), the metabolic disorder that kick-started this kind of newborn screening over 60 years ago. In the U.S., parents are not asked for permission to screen their child. Newborn screening programs are public health programs, and the assumption is that no good parent would refuse a screening test that could identify a serious yet treatable condition in their baby.
Today, with the introduction of genome sequencing into clinical medicine, some are asking whether newborn screening goes far enough. As the cost of sequencing falls, should parents take a more expansive look at their children's health, learning not just whether they have a rare but treatable childhood condition, but also whether they are at risk for untreatable conditions or for diseases that, if they occur at all, will strike only in adulthood? Should genome sequencing be a part of every newborn's care?
It's an idea that appeals to Anne Wojcicki, the founder and CEO of the direct-to-consumer genetic testing company 23andMe, who in a 2016 interview with The Guardian newspaper predicted that having newborns tested would soon be considered standard practice—"as critical as testing your cholesterol"—and a new responsibility of parenting. Wojcicki isn't the only one excited to see everyone's genes examined at birth. Francis Collins, director of the National Institutes of Health and perhaps the most prominent advocate of genomics in the United States, has written that he is "almost certain … that whole-genome sequencing will become part of new-born screening in the next few years." Whether that would happen through state-mandated screening programs, or as part of routine pediatric care—or perhaps as a direct-to-consumer service that parents purchase at birth or receive as a baby-shower gift—is not clear.
Learning as much as you can about your child's health might seem like a natural obligation of parenting. But it's an assumption that I think needs to be much more closely examined, both because the results that genome sequencing can return are more complex and more uncertain than one might expect, and because parents are not actually responsible for their child's lifelong health and well-being.
Existing newborn screening tests look for the presence of rare conditions that, if identified early in life, before the child shows any symptoms, can be effectively treated. Sequencing could identify many of these same kinds of conditions (and it might be a good tool if it could be targeted to those conditions alone), but it would also identify gene variants that confer an increased risk rather than a certainty of disease. Occasionally that increased risk will be significant. About 12 percent of women in the general population will develop breast cancer during their lives, while those who have a harmful BRCA1 or BRCA2 gene variant have around a 70 percent chance of developing the disease. But for many—perhaps most—conditions, the increased risk associated with a particular gene variant will be very small. Researchers have identified over 600 genes that appear to be associated with schizophrenia, for example, but any one of those confers only a tiny increase in risk for the disorder. What is a parent supposed to do about such a risk except worry?
Sequencing results are uncertain in other important ways as well. While we now have the ability to map the genome—to create a read-out of the pairs of genetic letters that make up a person's DNA—we are still learning what most of it means for a person's health and well-being. Researchers even have a name for gene variants they think might be associated with a disease or disorder, but for which they don't have enough evidence to be sure. They are called "variants of unknown (or uncertain) significance" (VUS), and they pop up in most people's sequencing results. In cancer genetics, where much research has been done, about 1 in 5 gene variants is reclassified over time. Most are downgraded, which means that a good number of VUS are eventually designated benign.
Then there's the puzzle of what to do about results that show increased risk or even certainty for a condition that we have no idea how to prevent. Some genomics advocates argue that even if a result is not "medically actionable," it might have "personal utility" because it allows parents to plan for their child's future needs, to enroll them in research, or to connect with other families whose children carry the same genetic marker.
Finding a certain gene variant in one child might inform parents' decisions about whether to have another—and if they do, about whether to use reproductive technologies or prenatal testing to select against that variant in a future child. I have no doubt that for some parents these personal utility arguments are persuasive, but notice how far we've now strayed from the serious yet treatable conditions that motivated governments to set up newborn screening programs, and to mandate such testing for all.
Which brings me to the other problem with the call for sequencing newborn babies: the idea that even if it's not what the law requires, it's what good parents should do. That idea is very compelling when we're talking about sequencing results that show a serious threat to the child's health, especially when interventions are available to prevent or treat that condition. But as I have shown, many sequencing results are not of this type.
While one parent might reasonably decide to learn about their child's risk for a condition about which nothing can be done medically, a different, yet still thoroughly reasonable, parent might prefer to remain ignorant so that they can enjoy the time before their child is afflicted. This parent might decide that the worry—and the hypervigilance it could inspire in them—is not in their child's best interest, or indeed in their own. This parent might also think that it should be up to the child, when he or she is older, to decide whether to learn about his or her risk for adult-onset conditions, especially given that many adults at high familial risk for conditions like Alzheimer's or Huntington's disease choose never to be tested. This parent will value the child's future autonomy and right not to know more than they value the chance to prepare for a health risk that won't strike the child until 40 or 50 years in the future.
Contemporary understandings of parenting are famously demanding. We are asked to do everything within our power to advance our children's health and well-being—to act always in our children's best interests. Against that backdrop, the need to sequence every newborn baby's genome might seem obvious. But we should be skeptical. Many sequencing results are complex and uncertain. Parents are not obligated to learn about their children's risk for a condition that cannot be prevented, has a small risk of occurring, or that would appear only in adulthood. To suggest otherwise is to stretch parental responsibilities beyond the realm of childhood and beyond factors that parents can control.
The Brave New World of Using DNA to Store Data
Netscape co-founder turned billionaire venture capitalist Marc Andreessen once posited that software was eating the world. He was right, and software's takeover produced many things. One of them is data. Lots and lots and lots of data. In the past two years, humanity created more data than it had during its entire previous existence, and the amount will only increase. Think about it: the hundreds of 50KB emails you write a day, the dozens of 10MB photos, and the minute-long, 350MB 4K video you shoot on your iPhone X add up to vast quantities of information. All that information needs to be stored. And that's becoming a problem as data volume outpaces storage space.
"There won't be enough silicon to store all the data we need. It's unlikely that we can make flash memory smaller. We have reached the physical limits," says Victor Zhirnov, chief scientist at the Semiconductor Research Corporation. "We are facing a crisis that's comparable to the oil crisis in the 1970s. By 2050, we're going to need to store 10 to the 30th power bits, compared to 10 to the 23rd power bits in 2016." That amount of storage is equivalent to each of the world's seven billion people owning roughly 70 million iPhone Xs with 256GB of storage apiece.
The race is on to find another medium capable of storing massive amounts of information in as small a space as possible. Zhirnov and other scientists are looking to the human body, and specifically to DNA. "Nature has nailed it," Luis Ceze, a professor in the Department of Computer Science and Engineering at the University of Washington, says. "DNA is a molecular storage medium that is remarkable. It's incredibly dense, many, many thousands of times denser than the densest technology that we have today. And DNA is remarkably general. Any information you can map in bits you can store in DNA." It's so dense -- able to store a theoretical maximum of 215 petabytes (215 million gigabytes) in a single gram -- that all the data ever produced could be stored in the back of a tractor trailer truck.
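As a rough sanity check, here is a back-of-envelope calculation (a sketch only, combining the quoted 215-petabytes-per-gram theoretical maximum with Zhirnov's projected bit counts):

```python
# Back-of-envelope: how much DNA would the quoted storage demands
# require at the theoretical maximum density of 215 PB per gram?
DENSITY_BYTES_PER_GRAM = 215e15   # ~215 petabytes per gram (quoted above)

projected_bits = 1e30             # Zhirnov's 2050 projection
bits_2016 = 1e23                  # the 2016 figure he cites

grams_needed = projected_bits / 8 / DENSITY_BYTES_PER_GRAM
grams_2016 = bits_2016 / 8 / DENSITY_BYTES_PER_GRAM

print(f"2050 projection: ~{grams_needed / 1e6:,.0f} metric tons of DNA")
print(f"2016's data:     ~{grams_2016 / 1e3:,.0f} kilograms of DNA")
```

By this arithmetic, everything stored worldwide in 2016 would fit in roughly 58 kilograms of DNA at the theoretical maximum, while the 2050 projection would still require hundreds of thousands of tons, which gives a sense of how aggressive that 10-to-the-30th-power figure is.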
Writing DNA can be an energy-efficient process, too. Consider how the human body is constantly writing and rewriting DNA, and does so on a couple thousand calories a day. And all it needs for storage is a cool, dark place, a significant energy savings when compared to server farms that require huge amounts of energy to run and even more energy to cool.
Researchers first succeeded in encoding data onto DNA in 2012, when Harvard University geneticists George Church and Sri Kosuri wrote a 52,000-word book in A, C, G, and T bases. Their method achieved a density of only 1.28 petabytes per gram of DNA, however, a figure exceeded the next year when a group encoded all 154 Shakespeare sonnets and a 26-second clip of Martin Luther King's "I Have a Dream" speech. In 2017, Columbia University researchers Yaniv Erlich and Dina Zielinski made the process 60 percent more efficient.
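The basic idea behind all of these schemes is mapping binary data onto the four bases. The simplest possible version packs two bits into each base; the sketch below shows that naive mapping only, not the actual codes used by Church's or Erlich's teams, which add error correction and constraints on base composition:

```python
# Naive 2-bits-per-base encoding (illustrative only; real schemes add
# error correction and avoid problematic base compositions).
TO_BASE = {"00": "A", "01": "C", "10": "G", "11": "T"}
FROM_BASE = {v: k for k, v in TO_BASE.items()}

def encode(data: bytes) -> str:
    # Turn each byte into 8 bits, then each bit pair into one base.
    bits = "".join(f"{byte:08b}" for byte in data)
    return "".join(TO_BASE[bits[i:i + 2]] for i in range(0, len(bits), 2))

def decode(strand: str) -> bytes:
    # Reverse the mapping: each base back to 2 bits, each 8 bits to a byte.
    bits = "".join(FROM_BASE[b] for b in strand)
    return bytes(int(bits[i:i + 8], 2) for i in range(0, len(bits), 8))

strand = encode(b"Hi")
print(strand)                     # "CAGACGGC": each byte becomes 4 bases
assert decode(strand) == b"Hi"    # round-trips losslessly
```

At two bits per base, this is where the density claims come from: every base adds information while weighing only a few hundred daltons.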
The limiting factor today is cost. Erlich said the work his team did cost $7,000 to encode and decode two megabytes of data. To become useful in a widespread way, the price per megabyte needs to plummet. Even advocates concede this point. "Of course it is expensive," Zhirnov says. "But look how much magnetic storage cost in the 1980s. What you store today in your iPhone for virtually nothing would cost many millions of dollars in 1982." There's reason to think the price will continue to fall. Genome readers are improving, getting cheaper, faster, and smaller, and genome sequencing becomes cheaper every year, too. Picture it: tiny specks of inert DNA made from silicon or another material, stored in cool, dark, dry areas, preserved for all time.
Plus, DNA has another advantage over more traditional forms of storage: It's very easy to reproduce. "If you want a second copy of a hard disk drive, you need components for a disk drive, hook both drives up to a computer, and copy. That's a pain," Nick Goldman, a researcher at the European Bioinformatics Institute, says. "DNA, once you have that first sample, it's a process that is absolutely routine in thousands of laboratories around the world to multiply that using polymerase chain reaction [which uses temperature changes or other processes]. It just takes a few minutes to double a sample. A few more minutes, you double it again. Very quickly, you have thousands or millions of new copies."
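The doubling Goldman describes compounds quickly, since each PCR cycle ideally doubles the copy count; a tiny sketch of the idealized arithmetic:

```python
# Idealized PCR amplification: each thermal cycle roughly doubles the
# number of copies (real-world efficiency is somewhat below 2x per cycle).
def copies_after(cycles: int, start: int = 1) -> int:
    return start * 2 ** cycles

print(copies_after(10))   # 1,024 copies after ten cycles
print(copies_after(30))   # over a billion copies after thirty
```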
This ability to duplicate quickly and easily is a positive trait. But, of course, there's also the potential for danger. Does encoding on DNA, the very basis for life, present ethical issues? Could it get out of control and fundamentally alter life as we know it?
The chance is there, but it's remote. The first reason is that storage could be done with only two bases, which would serve as replacements for the 0 and 1 digits that make up all digital data. While doing so would decrease the possible density of the storage, it would virtually eliminate the risk that the sequences would be compatible with life.
But even if scientists and researchers choose to use four base pairs, other safeguards are in place that will prevent trouble. According to Ceze, the computer science professor, the snippets of DNA that they write are very short, around 150 nucleotides. This includes the title, the information that's being encoded, and tags to help organize where the snippet should fall in the larger sequence. Furthermore, they generally avoid repeated letters, which dramatically reduces the chance that a protein could be synthesized from the snippet.
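One common way to avoid repeated letters is a rotating code, in the spirit of the scheme Goldman's group used in 2013: the input is expressed in base 3, and each trit selects one of the three bases that differ from the previously written base. The details below are a sketch under that assumption, not the published code:

```python
# Rotating ternary code: each input trit (0, 1, or 2) picks one of the
# three bases that differ from the previous base, so the output strand
# can never contain a run of repeated letters.
BASES = "ACGT"

def encode_trits(trits, prev="A"):
    out = []
    for t in trits:
        choices = [b for b in BASES if b != prev]  # the 3 legal next bases
        prev = choices[t]
        out.append(prev)
    return "".join(out)

def decode_trits(strand, prev="A"):
    trits = []
    for b in strand:
        choices = [c for c in BASES if c != prev]
        trits.append(choices.index(b))             # which of the 3 was picked
        prev = b
    return trits

strand = encode_trits([0, 0, 2, 1])
assert all(a != b for a, b in zip(strand, strand[1:]))  # no adjacent repeats
assert decode_trits(strand) == [0, 0, 2, 1]             # round-trips
```

The trade-off is density: each base now carries log2(3) ≈ 1.58 bits instead of 2, the price paid for ruling out homopolymer runs.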
Inevitably, some DNA will get spilled. "But it's so unlikely that anything that gets created for storage would have a biological interpretation that could interfere with the mechanisms going on in a living organism that it doesn't worry me in the slightest," Goldman says. "We're not of concern for the people who are worried about the ethical issues of synthetic DNA. They are much more concerned about people deliberately engineering anthrax. In the future, we'll know enough about someone from a sample of their DNA that we could make a specific poison. That's the danger, not those of us who want to encode DNA for storage."
In the end, the reality of and risks surrounding encoding on DNA are the same as any scientific advancement: It's another system that is vulnerable to people with bad intentions but not one that is inherently unethical.
"Every human action has some ethical implications," Zhirnov says. "I can use a hammer to build a house or I can use it to harm another person. I don't see why DNA is in any way more or less ethical."
If that house can store all the knowledge in human history, it's worth learning how to build it.
Editor's Note: In response to readers' comments that silicon is one of the earth's most abundant materials, we reached back out to our source, Dr. Victor Zhirnov. He stands by his statement about a coming shortage of silicon, citing this research. The silicon oxide found in beach sand is unsuitable for semiconductors, he says, because the cost of purifying it would be prohibitive. For use in circuit-making, silicon must be refined to a purity of 99.9999999 percent. So the process begins by mining for pure quartz, which can only be found in relatively few places around the world.
The rise of remote work is a win-win for people with disabilities and employers
Any corporate leader would jump at the opportunity to increase their talent pool of potential employees by 15 percent, with all these new hires belonging to an underrepresented minority. That’s especially true given tight labor markets and CEO desires to increase headcount. Yet, too few leaders realize that people with disabilities are the largest minority group in this country, numbering 50 million.
Some executives may dread the extra investments in accommodating people’s disabilities. Yet, providing full-time remote work could suffice, according to a new study by the Economic Innovation Group think tank. The authors found that the employment rate for people with disabilities did not simply reach the pre-pandemic level by mid-2022, but far surpassed it, to the highest rate in over a decade. “Remote work and a strong labor market are helping [individuals with disabilities] find work,” said Adam Ozimek, who led the research and is chief economist at the Economic Innovation Group.
Disability advocates see this development as a silver lining of the pandemic, a win-win for adults with disabilities and the business world alike. For decades before the pandemic, employers had refused requests from workers with disabilities to work remotely, according to Thomas Foley, executive director of the National Disability Institute. During the pandemic, "we all realized that...many of us could work remotely,” Foley says. “[T]hat was disproportionately positive for people with disabilities."
Charles-Edouard Catherine, director of corporate and government relations for the National Organization on Disability, said that remote-work options had been advocated for many years to accommodate disabilities. “It’s a little frustrating that for decades corporate America was saying it’s too complicated, we’ll lose productivity, and now suddenly it’s like, sure, let’s do it.”
The pandemic opened doors for people with disabilities
Early in the pandemic, employment rates dropped for everyone, including people with disabilities, according to Ozimek's research. However, these rates recovered quickly. In the second quarter of 2022, people with disabilities aged 25 to 54, the prime working age, were 3.5 percent more likely to be employed than before the pandemic.
What about people without disabilities? They were still 1.1 percent less likely to be employed.
These numbers suggest that remote work has enabled a substantial number of people with disabilities to find and retain employment.
“We have a last-in, first-out labor market, and [people with disabilities] are often among the last in and the first out,” Ozimek says. However, this dynamic has changed, with adults with disabilities seeing employment rates recover much faster. Now, the question is whether the new trend will endure, Ozimek adds. “And my conclusion is that not only is it a permanent thing, but it’s going to improve.”
Gene Boes, president and chief executive of the Northwest Center, a Seattle organization that helps people with disabilities become more independent, confirms this finding. “The new world we live in has opened the door a little bit more…because there’s just more demand for labor.”
Long COVID disabilities put a premium on remote work
Remote work can help mitigate the impact of long COVID. The U.S. Centers for Disease Control and Prevention reports that about 19 percent of those who had COVID developed long COVID. Recent Census Bureau data indicates that 16 million working-age Americans suffer from it, with economic costs estimated at $3.7 trillion.
Certainly, many of these so-called long-haulers experience relatively mild symptoms - such as loss of smell - which, while troublesome, are not disabling. But other symptoms are serious enough to be disabilities.
According to a recent study from the Federal Reserve Bank of Minneapolis, about a quarter of those with long COVID changed their employment status or working hours. That means long COVID was serious enough to interfere with work for 4 million people. For many, the issue was serious enough to qualify them as disabled.
Indeed, the Federal Reserve Bank of New York found in a just-released study that the number of individuals with disabilities in the U.S. grew by 1.7 million. That growth stemmed mainly from long COVID conditions such as fatigue and brain fog, meaning difficulties with concentration or memory, with 1.3 million people reporting an increase in brain fog since mid-2020.
Many had to drop out of the labor force due to long COVID. Yet, about 900,000 people who are newly disabled have managed to continue working. Without remote work, they might have lost these jobs.
For example, a software engineer at one of my client companies has struggled with brain fog related to long COVID. With remote work, this employee can work during the hours when she feels most mentally alert and focused, even if that means short bursts of productivity throughout the day. With flexible scheduling, she can take rests, meditate, or engage in activities that help her regain focus and energy. Without the need to commute to the office, she can save energy and time and reduce stress, which is crucial when dealing with brain fog.
In fact, the author of the Federal Reserve Bank of New York study notes that long COVID can be considered a disability under the Americans with Disabilities Act, depending on the specifics of the condition. That means the law can require private employers with fifteen or more staff, as well as government agencies, to make reasonable accommodations for those with long COVID. Richard Deitz, the author of this study, writes in the paper that “telework and flexible scheduling are two accommodations that can be particularly beneficial for workers dealing with fatigue and brain fog.”
The current drive to return to the office, led by many C-suite executives, may need to be reconsidered in light of legal and HR considerations. Arlene S. Kanter, director of the disability law and policy program at the Syracuse University College of Law, said that the question should depend on whether people with disabilities can perform their work well at home, as they did during Covid outbreaks. “[T]hen people with disabilities, as a matter of accommodation, shouldn’t be denied that right,” Kanter said.
Diversity benefits
But companies shouldn’t need the threat of legal liability to act. It simply makes dollars and sense to expand their talent pool by 15 percent with members of an underrepresented minority. After all, extensive research shows that improving diversity boosts both decision-making and financial performance.
Companies that offer more flexible work options have already gained significant benefits in terms of diverse hires. In its efforts to adapt to the post-pandemic environment, Meta, the owner of Facebook and Instagram, decided to offer permanent fully remote work options to its entire workforce. And according to Meta chief diversity officer Maxine Williams, the candidates who accepted job offers for remote positions were “substantially more likely” to come from diverse communities: people with disabilities; Black, Hispanic, Alaska Native, and Native American people; veterans; and women. The numbers bear out these claims: people with disabilities increased from 4.7 to 6.2 percent of Meta’s employees.
Having consulted for 21 companies to help them transition to hybrid work arrangements, I can confirm that Meta’s numbers aren’t a fluke. The more my clients proved willing to offer remote work, the more staff with disabilities they recruited - and retained. That includes employees with mobility challenges. But it also includes employees with less visible disabilities, such as people with long COVID and immunocompromised people who feel reluctant to put themselves at risk of getting COVID by coming into the office.
Unfortunately, many leaders fail to see the benefits of remote work for underrepresented groups, such as those with disabilities. Some even say the opposite is true, with JP Morgan CEO Jamie Dimon claiming that returning to the office will aid diversity.
What explains this poor executive decision making? Part of the answer comes from a mental blind spot called the in-group bias. Our minds tend to favor and pay attention to the concerns of those in the group of people who seem to look and think like us. Dimon and other executives without disabilities don’t perceive people with disabilities to be part of their in-group. They thus are blind to the concerns of those with disabilities, which leads to misperceptions such as Dimon’s that returning to the office will aid diversity.
In-group bias is one of many dangerous judgment errors known as cognitive biases. They impact decision making in all life areas, ranging from the future of work to relationships.
Another relevant cognitive bias is the empathy gap. This term refers to our difficulty empathizing with those outside of our in-group. The lack of empathy combines with the blindness from the in-group bias, causing executives to ignore the feelings of employees with disabilities and prospective hires.
Omission bias also plays a role. This dangerous judgment error causes us to perceive failure to act as less problematic than acting. Consequently, executives perceive a failure to support the needs of those with disabilities as a minor matter.
Conclusion
The failure to empower people with disabilities through remote work options will prove costly to the bottom lines of companies. Not only are they limiting their talent pool by 15 percent, they’re also harming their ability to recruit and retain diverse candidates. And as their lawyers and HR departments will tell them, by violating the ADA they are putting themselves in legal jeopardy.
By contrast, companies like Meta - and my clients - that offer remote work opportunities are seizing a competitive advantage by recruiting these underrepresented candidates. They’re lowering costs of labor while increasing diversity. The future belongs to the savvy companies that offer the flexibility that people with disabilities need.