Clever Firm Predicts Patients Most at Risk, Then Tries to Intervene Before They Get Sicker
The diabetic patient hit the danger zone.
Ideally, blood sugar, measured by an A1C test, rests at 5.9 or less. A 7 is elevated, according to the Diabetes Council. Over 10, and you're into the extreme danger zone, at risk of every diabetic crisis from kidney failure to blindness.
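The thresholds above amount to a simple bucketing rule. As a minimal sketch (the function name and the exact cutoffs between the named values are illustrative assumptions, not clinical guidance):

```python
def a1c_category(a1c: float) -> str:
    """Bucket an A1C reading using the thresholds the article
    attributes to the Diabetes Council."""
    if a1c <= 5.9:
        return "normal"
    if a1c >= 10.0:
        return "extreme danger"
    return "elevated"

print(a1c_category(10.0))  # Jen's reading at the start of the story
```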
This patient's A1C was 10. Let's call her Jen for the sake of this story. (Although the facts of her case are real, the patient's actual name wasn't released due to privacy laws.)
Jen happens to live in Pennsylvania's Lehigh Valley, home of the nonprofit Lehigh Valley Health Network, which has eight hospital campuses and various clinics and other services. This network has invested more than $1 billion in IT infrastructure and founded Populytics, a spin-off firm that tracks and analyzes patient data, and makes care suggestions based on that data.
When Jen left the doctor's office, the Populytics data machine started churning, weighing her data against a wealth of historical information: the hospital visits likely in her future if she did not comply with recommendations, as well as the potential positive impact of outreach and early intervention.
About a month after Jen received the dangerous blood test results, a community outreach specialist with psychological training called her. She was on a list generated by Populytics of follow-up patients to contact.
"It's a very gentle conversation," says Cathryn Kelly, who manages a care coordination team at Populytics. "The case manager provides them understanding and support and coaching." The goal, in this case, was small behavioral changes that would actually stick, like dietary ones.
In three months of working with a case manager, Jen's blood sugar had dropped to 7.2, a much safer range. The odds of her cycling back to the hospital ER or veering into kidney failure, or worse, had dropped significantly.
While the health network is extremely localized to one area of one state, using data to inform precise medical decision-making appears to be the wave of the future, says Ann Mongovern, the associate director of Health Care Ethics at the Markkula Center for Applied Ethics at Santa Clara University in California.
"Many hospitals and hospital systems don't yet try to do this at all, which is striking given where we're at in terms of our general technical ability in this society," Mongovern says.
How It Happened
While many hospitals make money by filling beds, the Lehigh Valley Health Network, as a nonprofit, accepts many patients on Medicaid and other government insurance that doesn't cover the full cost of a hospitalization. The area's population is both poorer and older than national averages, according to U.S. Census data, meaning more people with higher medical needs who may not have the support to care for themselves. They end up in the ER, or worse, again and again.
In the early 2000s, LVHN CEO Dr. Brian Nester started wondering if his health network could develop a way to predict who is most likely to land themselves a pricey ICU stay -- and offer support before those people end up needing serious care.
"There was an early understanding, even if you go back to the (federal) balanced budget act of 1997, that we were just kicking the can down the road to having a functional financial model to deliver healthcare to everyone with a reasonable price," Nester says. "We've got a lot of people living longer without more of an investment in the healthcare trust."
Populytics, founded in 2013, was the result of years of planning and agonizing over those population numbers and cost concerns.
"We looked at our own health plan," Nester says. Out of all the employees and dependents on the LVHN's own insurance network, "roughly 1.5 percent of our 25,000 people -- under 400 people -- drove $30 million of our $130 million in insurance costs -- about 25 percent."
"You don't have to boil the ocean to take cost out of the system," he says. "You just have to focus on that 1.5%."
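The arithmetic behind Nester's point is straightforward. A quick check of the figures he cites (the member count of 375 is "roughly 1.5 percent" of 25,000; his "about 25 percent" rounds the cost ratio up slightly):

```python
# Back-of-the-envelope check of the LVHN health-plan numbers
# quoted in the article.
members = 25_000
high_cost_members = 375          # "roughly 1.5 percent ... under 400 people"
total_cost = 130_000_000         # annual insurance costs, in dollars
high_cost_spend = 30_000_000     # spend driven by that small group

member_share = high_cost_members / members
cost_share = high_cost_spend / total_cost
print(f"{member_share:.1%} of members drive {cost_share:.0%} of cost")
# 1.5% of members drive 23% of cost
```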
Take Jen, the diabetic patient. High blood sugar can lead to kidney failure, which can mean expensive weekly dialysis for 20 years. Investing in the data and staff to reach patients, he says, is "pennies compared to $100 bills."
For most doctors, "there's no awareness for providers to know who they should be seeing vs. who they are seeing. There's no incentive, because the incentive is to see as many patients as you can," he says.
To change that, the LVHN first invested in Epic, a widely used electronic health records system. Then it negotiated with the top 18 insurance companies that cover patients in the region for access to their patient care data, which means it has reams of patient history to feed the analytics machine in order to make predictions about outcomes. Nester admits not every hospital could do that -- with 52 percent of the market share, LVHN had a very strong negotiating position.
Third-party services take that data and churn out analytics that feed models and care management plans. All identifying information is stripped from the data.
"We can do predictive modeling in patients," says Populytics President and CEO Gregory Kile. "We can identify care gaps. Those care gaps are noted as alerts when the patient presents at the office."
Kile uses himself as a hypothetical patient.
"I pull up Gregory Kile, and boom, I see a flag or an alert. I see he hasn't been in for his last blood test. There is a care gap there we need to complete."
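The care-gap alert Kile describes can be sketched as a simple rule over patient records. This is an illustrative assumption about the mechanism, not Populytics' actual implementation: the data shape, field names, and the 180-day interval are all hypothetical.

```python
from datetime import date, timedelta

# Hypothetical recommended interval between blood tests.
BLOOD_TEST_INTERVAL = timedelta(days=180)

# Toy records standing in for de-identified patient data.
patients = [
    {"name": "Gregory Kile", "last_blood_test": date(2019, 1, 15)},
    {"name": "Jen",          "last_blood_test": date(2019, 11, 2)},
]

def care_gaps(patients, today):
    """Return names of patients overdue for a blood test."""
    return [p["name"] for p in patients
            if today - p["last_blood_test"] > BLOOD_TEST_INTERVAL]

print(care_gaps(patients, today=date(2020, 1, 1)))
# ['Gregory Kile']
```

In a real system, a flag like this would surface as the alert Kile describes when the patient presents at the office.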
"There's just so much more you can do with that information," he says, envisioning a future where follow-up for, say, knee replacement surgery and outcomes could be tracked, and either validated or changed.
Ethical Issues at the Forefront
Of course, embracing data use in such specific ways also brings up issues of security and patient safety. For example, says medical ethicist Mongovern, there are many touchpoints where breaches could occur. The public has a growing awareness of how data used to personalize their experiences, such as social media analytics, can also be monetized and sold in ways that benefit a company, but not the user. That's not to say data supporting medical decisions is a bad thing, she says, just one with potential for public distrust if not handled thoughtfully.
"You're going to need to do this to stay competitive," she says. "But there's obviously big challenges, not the least of which is patient trust."
Among the ways the LVHN uses the data is monthly reports it calls registries, which include patients who have recently come in contact with the health network, either through a hospital or a doctor who works with it. The community outreach team members at Populytics take the names from the list, pull their records, and start calling. So far, a majority of the patients targeted -- 62 percent -- appear to embrace the effort.
Says Nester: "Most of these are vulnerable people who are thrilled to have someone care about them. So they engage, and when a person engages in their care, they take their insulin shots. It's not rocket science. The rocket science is in identifying who the people are — the delivery of care is easy."
As countries around the world combat the coronavirus outbreak, governments that already operated sophisticated surveillance programs are ramping up the tracking of their citizens.
Countries like China, South Korea, Israel, Singapore and others are closely monitoring citizens to track the spread of the virus and prevent further infections, and policymakers in the United States have proposed similar steps. These shifts in policy have civil liberties defenders alarmed, as history has shown increases in surveillance tend to stick around after an emergency is over.
In China, where the virus originated and surveillance is already ubiquitous, the government has taken measures like having people scan a QR code and answer questions about their health and travel history to enter their apartment building. The country has also increased the tracking of cell phones, encouraged citizens to report people who appear to be sick, utilized surveillance drones, and developed facial recognition that can identify someone even if they're wearing a mask.
In Israel, the government has begun tracking people's cell phones without a court order under a program that was initially meant to counter terrorism. Singapore has also been closely tracking people's movements using cell phone data. In South Korea, the government has been monitoring citizens' credit card and cell phone data and has heavily utilized facial recognition to combat the spread of the coronavirus.
Here at home, the United States government and state governments have been using cell phone data to determine where people are congregating. White House senior adviser Jared Kushner's task force to combat the coronavirus outbreak has proposed using cell phone data to track coronavirus patients. Cities around the nation are also using surveillance drones to enforce social distancing orders. Companies like Apple and Google that work closely with the federal government are currently developing systems to track Americans' cell phones.
All of this might sound acceptable if you're worried about containing the outbreak and getting back to normal life, but as we saw when the Patriot Act was passed in 2001 in the wake of the 9/11 terrorist attacks, expansions of the surveillance state can persist long after the emergency that seemed to justify them.
Jay Stanley, senior policy analyst with the ACLU Speech, Privacy, and Technology Project, says that this public health emergency requires bold action, but he worries that actions may be taken that will infringe on our privacy rights.
"This is an extraordinary crisis that justifies things that would not be justified in ordinary times, but we, of course, worry that any such things would be made permanent," Stanley says.
Stanley notes that 9/11 differed from the current situation: we still face the threat of terrorism today, and always will, which is why the Patriot Act, extreme a response as it was, has endured. With this pandemic, it's quite possible we won't face something like it again for some time.
"We know that for the last seven or eight decades, we haven't seen a microbe this dangerous become a pandemic, and it's reasonable to expect it's not going to be happening for a while afterward," Stanley says. "We do know that when a vaccine is produced and is produced widely enough, the COVID crisis will be over. This does, unlike 9/11, have a definitive ending."
The ACLU released a white paper last week outlining the problems with using location data from cell phones and how policymakers should proceed when they discuss the usage of surveillance to combat the outbreak.
"Location data contains an enormously invasive and personal set of information about each of us, with the potential to reveal such things as people's social, sexual, religious, and political associations," they wrote. "The potential for invasions of privacy, abuse, and stigmatization is enormous. Any uses of such data should be temporary, restricted to public health agencies and purposes, and should make the greatest possible use of available techniques that allow for privacy and anonymity to be protected, even as the data is used."
Sara Collins, policy counsel at the digital rights organization Public Knowledge, says that one of the problems with the current administration is that there's not much transparency, so she worries surveillance could be increased without the public realizing it.
"You'll often see the White House come out with something—that they're going to take this action or an agency just says they're going to take this action—and there's no congressional authorization," Collins says. "There's no regulation. There's nothing there for the public discourse."
Collins says it's almost impossible to protect against infringements on people's privacy rights if you don't actually know what kind of surveillance is being done and at what scale.
"I think that's very concerning when there's no accountability and no way to understand what's actually happening," Collins says. "The first thing you need to combat pervasive surveillance is to know that it's occurring."
We should also be worried about corporate surveillance, Collins says, because the tech companies that keep track of our data work closely with the government and do not have a good track record when it comes to protecting people's privacy. She suspects these companies could use the coronavirus outbreak to defend the kind of data collection they've been engaging in for years.
Collins stresses that any increase in surveillance should be transparent and short-lived, and that there should be a limit on how long people's data can be kept. Otherwise, she says, we're risking an indefinite infringement on privacy rights. Her organization will be keeping tabs as the crisis progresses.
It's not that we shouldn't avail ourselves of modern technology to fight the pandemic. Indeed, once lockdown restrictions are gradually lifted, public health officials must increase their ability to isolate new cases and trace, test, and quarantine contacts.
But tracking the entire populace "Big Brother"-style is not the ideal way out of the crisis. Last week, for instance, a group of policy experts -- including former FDA Commissioner Scott Gottlieb -- published recommendations for how to achieve containment. They emphasized the need for widespread diagnostic and serologic testing as well as rapid case-based interventions, among other measures -- and they, too, were wary of pervasive measures to follow citizens.
The group wrote: "Improved capacity [for timely contact tracing] will be most effective if coordinated with health care providers, health systems, and health plans and supported by timely electronic data sharing. Cell phone-based apps recording proximity events between individuals are unlikely to have adequate discriminating ability or adoption to achieve public health utility, while introducing serious privacy, security, and logistical concerns."
The bottom line: Any broad increases in surveillance should be carefully considered before we go along with them out of fear. The Founders knew that privacy is integral to freedom; that's why they wrote the Fourth Amendment to protect it, and that right shouldn't be thrown away because we're in an emergency. Once you lose a right, you don't tend to get it back.
Kira Peikoff was the editor-in-chief of Leaps.org from 2017 to 2021. As a journalist, her work has appeared in The New York Times, Newsweek, Nautilus, Popular Mechanics, The New York Academy of Sciences, and other outlets. She is also the author of four suspense novels that explore controversial issues arising from scientific innovation: Living Proof, No Time to Die, Die Again Tomorrow, and Mother Knows Best. Peikoff holds a B.A. in Journalism from New York University and an M.S. in Bioethics from Columbia University. She lives in New Jersey with her husband and two young sons. Follow her on Twitter @KiraPeikoff.