Some companies claim remote work hurts wellbeing. Research shows the opposite.
Many leaders at top companies are trying to get workers to return to the office. They say remote and hybrid work are bad for their employees’ mental well-being and lead to a sense of social isolation, meaninglessness, and lack of work-life boundaries, so we should just all go back to office-centric work.
One example is Google, where the company’s leadership is defending its requirement of mostly in-office work for all staff as necessary to protect social capital, meaning people’s connections to and trust in one another. That’s despite a survey of over 1,000 Google employees showing that two-thirds feel unhappy about being forced to work in the office three days per week. In internal meetings and public letters, many have threatened to leave, and some are already quitting to go to other companies with more flexible options.
Last month, GM rolled out a policy similar to Google’s, but had to backtrack because of intense employee opposition. The same is happening in some places outside of the U.S. For instance, three-fifths of all Chinese employers are refusing to offer permanent remote work options, according to a survey this year from The Paper.
For their claims that remote work hurts well-being, some of these office-centric traditionalists cite a number of prominent articles. For example, Arthur Brooks claimed in an essay that “aggravation from commuting is no match for the misery of loneliness, which can lead to depression, substance abuse, sedentary behavior, and relationship damage, among other ills.” An article in Forbes reported that over two-thirds of employees who work from home at least part of the time had trouble getting away from work at the end of the day. And Fast Company has a piece about how remote work can “exacerbate existing mental health issues” like depression and anxiety.
For his part, author Malcolm Gladwell has also championed a swift return to the office, saying there is a “core psychological truth, which is we want you to have a feeling of belonging and to feel necessary…I know it’s a hassle to come into the office, but if you’re just sitting in your pajamas in your bedroom, is that the work life you want to live?”
These arguments may sound logical to some, but they fly in the face of research and my own experience as a behavioral scientist and as a consultant to Fortune 500 companies. In these roles, I have seen the pitfalls of in-person work, which can be just as problematic, if not more so. Remote work is not without its own challenges, but I have helped 21 companies implement a series of simple steps to address them.
Research finds that remote work is actually better for you
The trouble with the articles described above, and with the claims of traditionalist business leaders and gurus, stems from a sneaky misdirection. They decry the negative impact of remote and hybrid work on wellbeing, yet they gloss over the damage to wellbeing caused by the alternative: office-centric work.
It’s like comparing remote and hybrid work to a state of leisure. Sure, people would feel less isolated if they could hang out and have a beer with their friends instead of working. They could tend to their existing mental health issues if they could visit a therapist. But that’s not in the cards. What’s in the cards is office-centric work: the frustration of a long commute, at least eight hours at a desk in an often uncomfortable and oppressive open office, a sad desk lunch and unhealthy snacks (sometimes at an insanely high cost), and, as a reward for making it through this series of insults, more frustration on the commute back home.
So what happens when we compare apples to apples? Then we need to hear it straight from the horse’s mouth: namely, from surveys of employees themselves, who experienced in-office work before the pandemic and hybrid or remote work after COVID struck.
Consider a 2022 survey by Cisco of 28,000 full-time employees around the globe. Nearly 80 percent of respondents say that remote and hybrid work improved their overall well-being: that applies to 83 percent of Millennials, 82 percent of Gen Z, 76 percent of Gen X, and 66 percent of Baby Boomers. The vast majority of respondents felt that working remotely improved their work-life balance.
Much of that improvement stemmed from saving time due to not needing to commute and having a more flexible schedule: 90 percent saved 4 to 8 hours or more per week. What did they do with that extra time? The top choice for almost half was spending more time with family, friends and pets, which certainly helped address the problem of isolation from the workplace. Indeed, three-quarters of them report that working from home improved their family relationships, and 51 percent strengthened their friendships. Twenty percent used the freed up hours for self-care.
Of the small number who report that their work-life balance has not improved or has even worsened, the number one reason given is difficulty disconnecting from work. Still, 82 percent of all respondents report that working from anywhere has made them happier, and over half say that remote work decreased their stress levels.
Other surveys back up Cisco’s findings. For example, a 2022 Future Forum survey compared knowledge workers who worked full-time in the office, in a hybrid modality, and fully remote. It found that full-time in-office workers felt the least satisfied with work-life balance, hybrid workers were in the middle, and fully remote workers felt most satisfied. The same distribution applied to questions about stress and anxiety. A mental health website called Tracking Happiness found in a 2022 survey of over 12,000 workers that fully remote employees report a happiness level about 20 percent greater than office-centric ones. Another survey by CNBC in June found that fully remote workers are more often very satisfied with their jobs than workers who are fully in-person.
Academic peer-reviewed research provides further support. Consider a 2022 study published in the International Journal of Environmental Research and Public Health of bank workers who worked on the same tasks of advising customers either remotely or in-person. It found that fully remote workers experienced higher meaningfulness, self-actualization, happiness, and commitment than in-person workers. Another study, published by the National Bureau of Economic Research, reported that hybrid workers, compared to office-centric ones, experienced higher satisfaction with work and had 35 percent more job retention.
What about the supposed burnout crisis associated with remote work? Indeed, burnout is a concern. A survey by Deloitte finds that 77 percent of workers experienced burnout at their current job. Gallup came up with a slightly lower number of 67 percent in its survey. But guess what? Both of those surveys are from 2018, long before the era of widespread remote work.
By contrast, in a Gallup survey in late 2021, a smaller share of respondents, 58 percent, reported burnout. An April 2021 McKinsey survey found burnout in 54 percent of Americans and 49 percent globally. A September 2021 survey by The Hartford reported 61 percent burnout. Arguably, the increase in full- or part-time remote opportunities during the pandemic helped to address feelings of burnout rather than increasing them. Indeed, that finding aligns with the earlier surveys and peer-reviewed research suggesting remote and hybrid work improves wellbeing.
Remote work isn’t perfect – here’s how to fix its shortcomings
Still, burnout is a real problem for hybrid and remote workers, as it is for in-office workers. Employers need to offer mental health benefits with online options to help employees address these challenges, regardless of where they’re working.
Moreover, while remote and hybrid arrangements are better overall for wellbeing, they do have specific disadvantages around work-life separation. To address those issues, I advise the clients I have helped transition to hybrid and remote work to establish norms and policies built on clear expectations and boundaries.
Some people expect their Slack or Microsoft Teams messages to be answered within an hour, while others check Slack once a day. Some believe email requires a response within three hours, and others feel three days is fine. As a result of such uncertainty and lack of clarity about what’s appropriate, too many people feel uncomfortable disconnecting and not replying to messages or doing work tasks after hours. That might stem from a fear of not meeting their boss’s expectations or not wanting to let their colleagues down.
To solve this problem, companies need to establish and incentivize clear expectations and boundaries. They should develop policies and norms around response times for different channels of communication. They also need to clarify work-life boundaries – for example, the frequency and types of unusual circumstances that will require employees to work outside of regular hours.
Moreover, when working at home and collaborating with others, there’s sometimes an unhealthy expectation that once you start your workday in your home office chair, you’ll work continuously while sitting there (except for your lunch break). That’s not how things work in the office, where physical and mental breaks are built in throughout the day: you take 5-10 minutes to walk from one meeting to another, or you go to pick up your copies from the printer and chat with a coworker on the way.
Those and similar physical and mental breaks, research shows, decrease burnout, improve productivity, and reduce mistakes. That’s why companies should strongly encourage employees to take at least a 10-minute break every hour during remote work. At least half of those breaks should involve physical activity, such as stretching or walking around, to counteract the dangerous effects of prolonged sitting. The rest should be restorative mental activities, such as meditation, brief naps, walking outdoors, or whatever else helps you recharge.
To facilitate such breaks, my client organizations such as the University of Southern California’s Information Sciences Institute shortened hour-long meetings to 50 minutes and half-hour meetings to 25 minutes, to give everyone – both in-person and remote workers – a mental and physical break and transition time.
Few people will object to shorter meetings. Once that change takes hold, move on to other aspects of setting boundaries and expectations. Doing so will help team members get on the same page and reduce conflicts and tensions. By setting clear expectations, you’ll address the biggest wellbeing challenge of remote and hybrid work: establishing clear work-life boundaries.
Technology is Redefining the Age of 'Older Mothers'
In October 2021, a woman from Gujarat, India, stunned the world when it was revealed she had her first child through in vitro fertilization (IVF) at age 70. She had actually been preceded by a compatriot of hers who, two years before, gave birth to twins at the age of 73, again with the help of IVF treatment. The oldest known mother to conceive naturally lived in the UK; in 1997, Dawn Brooke conceived a son at age 59.
These women may seem extreme outliers, almost freaks of nature; in the US, for example, the average age of first-time mothers is 26. A few decades from now, though, the sight of 70-year-old first-time mothers may not even raise eyebrows, say futurists.
“We could absolutely have more 70-year-old mothers because we are learning how to regulate the aging process better,” says Andrew Hessel, a microbiologist and geneticist, who cowrote "The Genesis Machine," a book about “rewriting life in the age of synthetic biology,” with Amy Webb, the futurist who recently wondered why 70-year-old women shouldn’t give birth.
Technically, we're already doing this, says Hessel, pointing to a technique known as in vitro gametogenesis (IVG). IVG refers to turning adult cells into sperm or egg cells. “You can think of it as the upgrade to IVF,” Hessel says. These vanguard stem cell research technologies can take even skin cells and turn them into induced pluripotent stem cells (iPSCs), which are basically master cells capable of maturing into any human cell, be it kidney cells, liver cells, brain cells or gametes, aka eggs and sperm, says Henry T. “Hank” Greely, a Stanford law professor who specializes in ethical, legal, and social issues in biosciences.
In 2016, Greely wrote "The End of Sex," a book in which he described in detail the science of making gametes out of iPSCs. Greely says science will indeed enable 70-year-old new mums to mingle with mothers several decades younger at kindergartens in the not-so-distant future. And it won’t be that big of a deal.
“An awful lot of children all around the world have been raised by grandmothers for millennia. To have 70-year-olds and 30-year-olds mingling in maternal roles is not new,” he says. That said, he doubts that many women will want to have a baby in the eighth decade of their life, even if science allows it. “Having a baby and raising a child is hard work. Even if 1% of all mothers are over 65, they aren’t going to change the world,” Greely says. Mothers over 70 will be a minor blip, statistically speaking, he predicts. But one thing is certain: the technology is here.
And more technologies for the same purpose could be on the way. In March 2021, researchers from Monash University in Melbourne, Australia, published research in Nature in which they successfully reprogrammed skin cells into a three-dimensional cellular structure that was morphologically and molecularly similar to a human embryo: the iBlastoid. In compliance with Australian law and international guidelines referencing the “primitive streak rule,” which bans the use of embryos older than 14 days in scientific research, the Monash scientists stopped growing their iBlastoids in vitro on day 11.
“The research was both cutting-edge and controversial, because it essentially created a new human life, not for the purpose of a patient who's wanting to conceive, but for basic research,” says Lindsay Wu, a senior lecturer in the School of Medical Sciences at the University of New South Wales (UNSW) in Kensington, Australia. If you really want to be sure that what you are creating is an embryo, you need to let it develop into a viable baby. “This is the real proof in the pudding,” says Wu, who runs UNSW’s Laboratory for Ageing Research. But then you reach a stage where, for ethical reasons, you have to abort it. “Fiddling here a bit too much?” he asks. Wu believes there are less morally troubling approaches to tackling the decline in fertility that comes with age.
He is actually working on them. Why do women in their mid- to late-thirties, who are at peak physical health in almost every other regard, have trouble conceiving? Wu and his team posed that question in a research paper published in 2020 in Cell Reports. The simple answer is the egg cell. An average girl at puberty has between 300,000 and 400,000 eggs; by around age 37, the same woman has only 25,000 left. Things only go downhill from there. So, what torments the egg cells?
The UNSW team found that the levels of key molecules called NAD+ precursors, which are essential to the metabolism and genome stability of egg cells, decline with age. The team proceeded to add these vitamin-like substances back into the drinking water of reproductively aged, infertile lab mice, which then had babies.
“It's an important proof of concept,” says Wu. He is investigating how safe it is to replicate the experiment with humans in two ongoing studies. The ultimate goal is to restore the quality of egg cells that are left in patients in their late 30s and early- to mid-40s, says Wu. He sees the goal of getting pregnant for this age group as less ethically troubling, compared to 70-year-olds.
But what is ethical, anyway? “It is a tricky word,” says Hessel. He differentiates between ethics, which represent a personal position and may, thus, be more transient, and morality, longer lasting principles embraced across society such as, “Thou shalt not kill.” Unprecedented advances often bring out fear and antagonism until time passes and they just become…ordinary. When IVF pioneer Landrum Shettles tried to perform IVF in 1973, the chairman of Columbia’s College of Physicians and Surgeons interdicted the procedure at the last moment. Almost all countries in the world have IVF clinics today, and the global IVF services market is clearly a growth industry.
Besides, you don’t have a baby at 70 by accident: you really want it, Greely and Hessel agree. And by that age, mothers may be wiser and more financially secure, Hessel says (though he is quick to add that even the pregnancy of his own wife, who had her child at 40, was a high-risk one).
As a research question, figuring out whether older mothers are better than younger ones, or vice versa, entails too many confounding variables, says Greely. And why should we focus on who’s the better mother anyway? “We've had 70-year-old and 80-year-old fathers forever–why should people have that much trouble getting used to mothers doing the same?” Greely wonders. For some women, having a child at an old(er) age would be comforting; maybe that’s what matters.
And the technology to enable older women to have children is already here or coming very soon. That, perhaps, matters even more. Researchers have already created mice–and their offspring–entirely from scratch in the lab. “Doing this to produce human eggs is similar," says Hessel. "It is harder to collect tissues, and the inducing cocktails are different, but steady advances are being made." He predicts that the demand for fertility treatments will keep financing research and development in the area. He says that big leaps will be made if ethical concerns don’t block them: it is not far-fetched to believe that the first baby produced from lab-grown eggs will be born within the next decade.
In a 2020 op-ed in Stat, Greely argued that we’ve already overcome the technical barriers to human cloning, but no one is really talking about it. Likewise, scientists are also working on enabling 70-year-old women to have babies, says Hessel, but most commentators are keeping really quiet about it. At least so far.
New Cell Therapies Give Hope to Diabetes Patients
For nearly four decades, George Huntley has thought constantly about his diabetes. Diagnosed in 1983 with Type 1 (insulin-dependent) diabetes, Huntley began managing his condition with daily finger sticks to check his blood glucose levels and doses of insulin that he injected into his abdomen. Even now, with an insulin pump and a device that continuously monitors his glucose, he must consider how every meal will affect his blood sugar, checking his monitor multiple times each hour.
Like many of those who depend on insulin injections, Huntley is simultaneously grateful for the technology that makes his condition easier to manage and tired of thinking about diabetes. If he could wave a magic wand, he says, he would make his diabetes disappear. So when he read about biotechs like ViaCyte and Vertex Pharmaceuticals developing new cell therapies that have the potential to cure Type 1 diabetes, Huntley was excited.
But you won’t see him signing up any time soon. The therapies under development by both companies would require a lifelong regimen of immunosuppressive drugs to prevent the body from rejecting the foreign cells, a problem also seen in transplants of the pancreas’s insulin-producing cells, called islet cells, from deceased donors. To Howard Foyt, chief medical officer at ViaCyte, a San Diego-based biotech specializing in cell therapies for diabetes, the tradeoff is worth it.
“A lot of the symptoms of diabetes are not something that you wear on your arm, so to speak. You’re not necessarily conscious of them until you’re successfully treated, and you feel better,” Foyt says.
For many with diabetes, managing these symptoms is a constant game of Whack-a-Mole. “Any form of treatment that gets someone closer to feeling good is a victory,” he says.
But not everyone is convinced. What’s more, it’s likely that the availability of these cell therapies will be limited to those with life-threatening diabetes symptoms, such as hypoglycemia unawareness. To Huntley, these therapies remain a bit of a Faustian bargain.
“Am I going to be trading diabetes for cancer? That’s not a chance I want to take,” he says.
The discovery of insulin in 1921 transformed Type 1 diabetes from a death sentence into a potentially manageable condition. Even as better versions of insulin hit the market—ones that weren’t derived from pigs and wouldn’t provoke an allergic response, longer-acting insulin, insulin pens—they didn’t change the reality that people with Type 1 diabetes remained dependent on insulin. Even the most advanced continuous glucose monitors (which test blood sugar levels every few minutes, 24/7) and insulin pumps don’t perform as well as a healthy pancreas.
Whether by injection or pump, someone with diabetes needs to administer the insulin their body no longer makes. With advances in organ transplantation, the concept of transplanting insulin-producing pancreatic beta cells seemed obvious. After more than a decade of painstaking work, James Shapiro, who directs the Islet Transplant Program at the University of Alberta, honed a process for islet transplantation called the Edmonton Protocol. For a few patients who couldn’t control their blood sugars any other way, the Edmonton Protocol became a lifesaver. Some of these patients were even able to stop insulin completely, Shapiro says. But the high cost of transplantation and a chronic shortage of donor organs meant that only a small handful of patients could benefit.
Stem cells, however, can be grown in vats, meaning that supply would never be an issue. “We would be going from a very successful treatment of today to a potential cure tomorrow,” Shapiro says.
In 2014, spurred by his own children’s diagnoses with Type 1 Diabetes, stem cell biologist Doug Melton of Harvard University figured out a way to differentiate embryonic stem cells into functional pancreatic beta cells. It was a long process, explains immunoengineer Alice Tomei at the University of Miami, because “the islet is not one cell, it's like a mini-organ that has its own needs.”
Add on the risk of rejection and autoimmunity, and Tomei says that scientists soon realized that chronic and systemic immunosuppression was the only way forward. Over the next several years, Melton improved his approach to yield more cells with fewer impurities. Melton partnered with Boston-based Vertex Pharmaceuticals to create a cell therapy called VX-880.
The first patient received his dose earlier in 2021. In October, Vertex released 90-day results from the Phase 1/2 trial, which revealed the patient was able to reduce his insulin usage from an average of 34 units per day to just 2.9 units per day. The tradeoff is a lifelong need for immunosuppressive drugs to prevent the body from attacking both foreign cells and pancreatic beta cells. It’s what recipients of ViaCyte’s first-gen PEC-Direct will also need. For Foyt, it’s an easy choice.
“At this point in time, immunosuppression is the necessary evil,” he says. “For parents, would you like to worry about going into your child’s bedroom every morning and not knowing if they’re going to be alive or dead? It’s uncommon, but it does occur.”
Not everyone, however, finds the trade-off easy to swallow. Especially with COVID-19 cases reaching record highs, the prospect of reducing his immune function at a time when he needs it most doesn’t sit well with Huntley. The risks of immunosuppression also mean that diabetes cell therapies are limited to those patients with life-threatening complications.
It’s why ViaCyte has created two new iterations of its cellular therapies that would eliminate this need. PEC-Encap contains the cells in a permeable pouch that allows oxygen, insulin, and nutrients to flow freely but blocks access by the immune system. The latest model, PEC-QT, which just began safety trials with Shapiro’s lab at the University of Alberta, uses gene editing to eliminate the cellular markers that would trigger an immune response.
Sanjoy Dutta, vice president of research at JDRF International, a nonprofit that funds the study of diabetes, is thrilled with the progress that’s been made around cell therapies, but he cautions it’s still early days. “We have proven that these cells can be made. What we haven’t seen is are they going to work for six months, two years, five years? It’s a challenge we still need to overcome,” he says.
Iowa social worker Jodi Lynn’s concerns echo Dutta’s. Lynn, who was diagnosed with diabetes in 1998 at age 14 after a bout of severe influenza, spends each day inventorying supplies, planning her food intake, and maintaining her insulin pump and glucose monitor. These newer technologies have dramatically improved her blood sugar control, but, like everyone with diabetes, Lynn remains at high risk for complications such as diabetic ketoacidosis, heart disease, vision loss, and kidney failure. Already considered immunocompromised because of medications she takes for another autoimmune condition, Lynn is less concerned with immune suppression than with the untested nature of these therapies.
“I want to know that they will work long-term,” she says.