New device can diagnose concussions using AI
For a long time after Mary Smith hit her head, she was unable to function. Test after test came back normal, so her doctors ruled out a concussion, but she knew something was wrong. Finally, when she took a test with a novel device called the EyeBOX, recently approved by the FDA, she learned she had indeed been dealing with the aftermath of a concussion.
“I felt like even my husband and doctors thought I was faking it or crazy,” recalls Smith, who preferred not to disclose her real name. “When I took the EyeBOX test it showed that my eyes were not moving together and my BOX score was abnormal.” To her diagnosticians, scientists at the Minneapolis-based company Oculogica who developed the EyeBOX, these markers were concussion signs. “I cried knowing that finally someone could figure out what was wrong with me and help me get better,” she says.
Concussion affects around 42 million people worldwide. While it’s increasingly common in the news because of sports injuries, anything that causes damage to the head, from a fall to a car accident, can result in a concussion. The sudden blow or jolt can disrupt the normal way the brain works. In the immediate aftermath, people may suffer from headaches, lose consciousness and experience dizziness, confusion and vomiting. Some recover but others have side effects that can last for years, particularly affecting memory and concentration.
There is no simple standard-of-care test to confirm or rule out a concussion, and concussions don’t appear on MRI or CT scans. Instead, medical professionals use more indirect approaches that probe the symptoms of concussion: assessments of patients’ learning and memory, ability to concentrate and problem-solving, along with balance and coordination. Most of these tests take the form of questionnaires or symptom checklists. Consequently, they have limitations, can be biased, and may miss a concussion or produce a false positive. Some people suspected of having a concussion may ordinarily struggle with literacy and problem-solving tests because of language challenges or education levels.
Another problem with current tests is that patients, particularly soldiers who want to return to combat and athletes who would like to keep competing, may try to hide their symptoms to avoid being diagnosed with a brain injury. Trauma physicians who work with concussion patients need a tool that is more objective and consistent.
“This type of assessment doesn’t rely on the patient's education level, willingness to follow instructions or cooperation. You can’t game this.” -- Uzma Samadani, founder of Oculogica
“Having an objective measurement tool for the diagnosis of concussion is of great importance,” says Douglas Powell, associate professor of biomechanics at the University of Memphis, whose research interests include sports injury and concussion. “While there are a number of promising systems or metrics, we have yet to develop a system that is portable, accessible and objective for use on the sideline and in the clinic. The EyeBOX may be able to address these issues, though time will be the ultimate test of performance.”
The EyeBOX as a window inside the brain
Using eye movements to diagnose a concussion has emerged as a promising technique since around 2010. Oculogica combined eye tracking with AI to develop the EyeBOX, an unbiased, objective diagnostic tool.
“What’s so great about this type of assessment is it doesn’t rely on the patient's education level, willingness to follow instructions or cooperation,” says Uzma Samadani, a neurosurgeon and brain injury researcher at the University of Minnesota, who founded Oculogica. “You can’t game this. It assesses functions that are prompted by your brain.”
In 2010, Samadani was working on a clinical trial to improve the outcome of brain injuries. The team needed a way to measure whether seriously brain-injured patients were improving. One thing patients could do was watch TV. So Samadani designed and patented an AI-based algorithm that tracks the relationship between eye movement and concussion.
The EyeBOX test requires patients to watch movie or music clips for 220 seconds. An eye-tracking camera records subconscious eye movements, capturing eye positions 500 times per second as patients watch the video and collecting over 100,000 data points. The device then uses AI to assess whether there are any disruptions from the way the eyes normally move.
Cranial nerves are responsible for transmitting information between the brain and the body. Many are involved in eye movement. Pressure caused by a concussion can affect how these nerves work. So tracking how the eyes move can indicate if there’s anything wrong with the cranial nerves and where the problem lies.
If someone is healthy, their eyes should be able to focus on an object, follow movement and stay coordinated with each other. The EyeBOX can detect abnormalities in each of these functions. For example, if a patient’s eyes are coordinated but not moving as they should, that indicates issues in the central brain stem, while only one eye moving abnormally suggests that a particular nerve section is affected.
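To make those numbers concrete, here is a toy sketch in Python of the kind of signal such a system works with. It is not Oculogica’s proprietary algorithm: the correlation metric and the 0.9 threshold are illustrative assumptions, while the sampling rate and test duration come from the figures above.

```python
import numpy as np

SAMPLE_RATE_HZ = 500   # eye positions recorded 500 times per second
TEST_DURATION_S = 220  # patients watch clips for 220 seconds

# 220 s x 500 samples/s = 110,000 samples per eye -- the "over 100,000
# data points" the article mentions.
N_SAMPLES = SAMPLE_RATE_HZ * TEST_DURATION_S

def eyes_disconjugate(left_x: np.ndarray, right_x: np.ndarray,
                      threshold: float = 0.9) -> bool:
    """Flag poor coordination: in a healthy pair of eyes, the two
    horizontal traces should track each other closely. The correlation
    measure and cutoff here are illustrative, not Oculogica's."""
    r = np.corrcoef(left_x, right_x)[0, 1]
    return r < threshold
```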
Uzma Samadani with the EyeBOX device
Courtesy Oculogica
“The EyeBOX is a monitor for cranial nerves,” says Samadani. “Essentially it’s a form of digital neurological exam.” Several other eye-tracking techniques already exist, but they rely on subjective self-reported symptoms. Many also require a baseline, a measure of how patients reacted when they were healthy, which often isn’t available.
The Vestibular/Ocular Motor Screening (VOMS) is one of the most accurate diagnostic tests used in clinics in combination with other tests, but it is subjective. It involves a therapist asking patients to move their heads or eyes as they focus on or follow a particular object. Patients then report their symptoms.
The King-Devick test measures how fast patients can read numbers and compares it to a baseline. Since it is mainly used for athletes, the initial test is completed before the season starts. But participants can manipulate it. It also cannot be used in emergency rooms because the majority of patients wouldn’t have prior baseline tests.
Unlike these tests, the EyeBOX doesn’t use a baseline, and it is objective because it doesn’t rely on patients’ answers. “It shows great promise,” says Thomas Wilcockson, a senior lecturer in psychology at Loughborough University and an expert in applying eye-tracking techniques to neurological disorders. “Baseline testing of eye movements is not always possible. Alternative measures of concussion currently in development, including work with VR headsets, seem to currently require it. Therefore the EyeBOX may have an advantage.”
A technology that’s still evolving
In its most recent clinical trial, Oculogica used the EyeBOX to test 46 patients who had a concussion and 236 patients who did not. The device’s sensitivity, the probability of correctly identifying a patient’s concussion, was 80.4 percent. Meanwhile, the test accurately ruled out a concussion in 66.1 percent of the patients who didn’t have one, a measure known as its specificity.
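For readers who want to check the arithmetic, here is a short sketch that recovers the reported rates. The true-positive and true-negative counts are back-calculated from the percentages, so they are approximations rather than figures from the trial itself.

```python
concussed, healthy = 46, 236  # trial groups reported in the article

# Back-calculated from the reported rates; treat as approximations.
true_positives = round(0.804 * concussed)  # ~37 concussions caught
true_negatives = round(0.661 * healthy)    # ~156 correctly ruled out

sensitivity = true_positives / concussed   # share of real concussions detected
specificity = true_negatives / healthy     # share of healthy patients cleared
print(f"sensitivity={sensitivity:.1%}, specificity={specificity:.1%}")
# -> sensitivity=80.4%, specificity=66.1%
```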
While the team is working on improving the numbers, experts who treat concussion patients find the device promising. “I strongly support their use of eye tracking for diagnostic decision making,” says Powell. “But for diagnostic tests, we would prefer at least one of the sensitivity or specificity values to be greater than 90 percent.” Powell compares the EyeBOX with the Buffalo Concussion Treadmill Test, which has sensitivity and specificity values of 73 and 78 percent, respectively. The VOMS has also shown greater accuracy than the EyeBOX, at least for now. Still, the EyeBOX is competitive with the best diagnostic testing available for concussion, and Powell hopes that its detection prowess will improve. “I anticipate that the algorithms being used by Oculogica will be under continuous revision and expect the results will improve within the next several years.”
“The color of your skin can have a huge impact in how quickly you are triaged and managed for brain injury. People of color have significantly worse outcomes after traumatic brain injury than people who are white.” -- Uzma Samadani, founder of Oculogica
Powell thinks the EyeBOX could be an important complement to other concussion assessments.
“The Oculogica product is a viable diagnostic tool that supports clinical decision making. However, concussion is an injury that can present with a wide array of symptoms, and the use of technology such as the Oculogica should always be a supplement to patient interaction.”
Ioannis Mavroudis, a consultant neurologist at Leeds Teaching Hospitals, agrees that the EyeBOX has promise but cautions that concussions are too complex for clinicians to rely on the device alone. Not all concussions, for example, affect how the eyes move. “I believe that it can definitely help, however not all concussions show changes in eye movements. I believe that if this could be combined with a cognitive assessment the results would be impressive.”
The Oculogica team submitted their clinical data to the FDA and received approval in 2018. Now, they’re working to bring the test to the commercial market and using the device clinically to help diagnose concussions. They also want to look at other areas of brain health in the next few years. Samadani believes the EyeBOX could eventually be used to detect diseases like multiple sclerosis or other neurological conditions. “It’s a completely new way of figuring out what someone’s neurological exam is and we’re only beginning to realize the potential,” says Samadani.
One of Samadani’s biggest aspirations is to help reduce inequalities in healthcare stemming from skin color and other factors such as income or language barriers. From that perspective, the EyeBOX’s greatest potential could be in emergency rooms, where it can supplement the questionnaires, assessments and symptom checklists currently in use. Unlike those more subjective tests, the EyeBOX can produce an objective, AI-based analysis of brain injury when patients are admitted and assessed, regardless of their socioeconomic status, education or language abilities. Studies suggest that there are racial disparities in how patients with brain injuries are treated, such as how quickly they’re assessed and given a treatment plan.
“The color of your skin can have a huge impact in how quickly you are triaged and managed for brain injury,” says Samadani. “As a result of that, people of color have significantly worse outcomes after traumatic brain injury than people who are white. The EyeBOX has the potential to reduce inequalities,” she explains.
“If you had a digital neurological tool that you could screen and triage patients on admission to the emergency department you would potentially be able to make sure that everybody got the same standard of care,” says Samadani. “My goal is to change the way brain injury is diagnosed and defined.”
Your phone could show if a bridge is about to collapse
In summer 2017, Thomas Matarazzo, then a postdoctoral researcher at the Massachusetts Institute of Technology, landed in San Francisco with a colleague. They rented two cars, drove up to the Golden Gate Bridge, timing it to the city’s rush hour, and crossed to the other side in heavy traffic. Once they reached the other end, they turned around and did it again. And again. And again.
“I drove over that bridge 100 times over five days, back and forth,” says Matarazzo, now an associate director of High-Performance Computing in the Center for Innovation in Engineering at the United States Military Academy, West Point. “It was surprisingly stressful, I never anticipated that. I had to maintain the speed of about 30 miles an hour when the speed limit is 45. I felt bad for everybody behind me.”
Matarazzo had to drive slowly because the quality of the data they were collecting depended on it. The pair was designing and testing a new smartphone app that could gather data about the bridge’s structural integrity—a low-cost, citizen-scientist alternative to current industrial methods, which aren’t always feasible, partly because they’re expensive and complex. In an era of aging infrastructure, when some bridges in the United States and other countries are structurally unsound to the point of collapsing, such an app could alert authorities to the need for urgent repairs, or at least prompt the closure of the most dangerous structures.
There are 619,588 bridges in the U.S., and some of them are very old. For example, the Benjamin Franklin Bridge connecting Philadelphia to Camden, N.J., is 96 years old, while the Brooklyn Bridge is 139. So it’s hardly surprising that many could use some upgrades. “In the U.S., a lot of them were built in the post-World War II period to accommodate the surge of motorization,” says Carlo Ratti, an architect and engineer who directs the Senseable City Lab at the Massachusetts Institute of Technology. “They are beginning to reach the end of their life.”
According to the American Road & Transportation Builders Association’s 2022 report, one in three U.S. bridges needs repair or replacement. The Department of Transportation’s (DOT) National Bridge Inventory (NBI) database reveals concerning numbers: 36 percent of U.S. bridges need repair work, and more than 78,000 should be replaced. More than 43,500 bridges are rated in poor condition and classified as “structurally deficient,” an alarming description. Yet people drive over them 167.5 million times a day. The Pittsburgh bridge that collapsed in January this year, only hours before President Biden arrived to discuss the new infrastructure law, was on the “poor” rating list.
Assessing the structural integrity of a bridge is not an easy endeavor. Most of the time, inspections are visual, Matarazzo explains. Engineers check for cracks, rust and other signs of wear and tear. They also check for wildlife: birds may build nests, and small animals can make homes inside bridge structures, slowly chipping away at them. However, visual inspections may not tell the whole story. A more sophisticated, and significantly more expensive, inspection requires placing special sensors on the bridge that essentially listen to how it vibrates.
“Some bridges can afford expensive sensors to do the job, but that comes at a very high cost—hundreds of thousands of dollars per bridge per year,” Ratti says.
We may think of bridges as immovable steel and concrete monoliths, but they naturally vibrate, oscillating slightly. That movement can be influenced by the traffic that passes over them, and even by wind. Bridges of different types vibrate differently: some oscillate at lower frequencies and others at higher ones. A good way to visualize this phenomenon is to place a ruler over the edge of a desk and flick it slightly. If the ruler protrudes far off the desk, it will vibrate slowly. But if you shorten the end that hangs off, it will vibrate much faster. It works similarly with bridges, except there are more factors at play, including not only the length but also the design and the materials used.
Long suspension bridges such as the Golden Gate or the Verrazzano-Narrows, which hang on a series of cables, are more flexible, and their vibration periods are longer. The Golden Gate Bridge can vibrate at 0.106 Hertz, where one Hertz is one oscillation per second. “Think about standing on the bridge for about 10 seconds—that's how long it takes for it to move all the way up and all the way down in one oscillation,” Matarazzo says.
By contrast, concrete span bridges that rest on multiple columns are “stiffer” and have higher vibrational frequencies. A concrete bridge can have a frequency of 10 Hertz, moving through 10 oscillations in one second—like that shorter stretch of a ruler.
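The arithmetic behind those figures is simply that a bridge’s oscillation period is the reciprocal of its frequency, as this small sketch shows using the numbers quoted above.

```python
def period_seconds(frequency_hz: float) -> float:
    """One full oscillation takes 1/f seconds."""
    return 1.0 / frequency_hz

print(period_seconds(0.106))  # Golden Gate Bridge: ~9.4 s per oscillation
print(period_seconds(10.0))   # stiff concrete span: 0.1 s per oscillation
```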
The special devices that can pick up and record these vibrations over time are called accelerometers. A network of them for a single bridge can cost $20,000 to $50,000 or more, and requires trained personnel to install. The sensors also must stay on the bridge for some time to establish a healthy vibrational baseline for that particular span, and maintaining them adds to the cost. “Some bridges can afford expensive sensors to do the job, but that comes at a very high cost—hundreds of thousands of dollars per bridge per year,” Ratti says.
Making sense of the readouts is another challenge, one that requires a high level of technical expertise. “You generally need somebody, some type of expert capable of doing the analysis to translate that data into information,” says Matarazzo. That drives up the price, so visual inspections often prove the more economical choice for state-level DOTs with tight budgets. “The existing systems work well, but have downsides,” Ratti says. The team thought the old method could use some modernizing.
Smartphones, which are carried by millions of people, contain dozens of sensors, including accelerometers capable of picking up a bridge’s vibrations. That’s why Matarazzo and his colleague drove over the bridge 100 times: they were trying to collect enough data. Timing the trips to rush hour supported that goal because traffic caused more “excitation,” Matarazzo explains. “Excitation is a big word we use when we talk about what drives the vibration,” he says. “When there's a lot of traffic, there's more excitation and more vibration.” They also collaborated with Uber, whose drivers made 72 trips across the bridge to gather data in different cars.
The next step was to scrub the data of “noise”: various vibrations that came not from the bridge but from the cars themselves. “It could be jumps in speed, it could be potholes, it could be a bunch of other things,” Matarazzo says. But as the team gathered more data, it became easier to tell the bridge’s vibrational frequencies from all the others, because the noise generated by cars, traffic and other sources tends to “cancel out.”
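As a rough illustration of why more trips help, here is a minimal sketch of that averaging idea. It is not the study’s actual pipeline; the 100 Hz sampling rate, the segment length and the normalization are assumptions made for the example. Averaging each trip’s power spectrum lets a frequency shared by every crossing stand out while car-specific noise is diluted.

```python
import numpy as np
from scipy.signal import welch

FS = 100  # assumed smartphone accelerometer sampling rate, in Hz

def bridge_frequency(trips: list[np.ndarray]) -> float:
    """Estimate the dominant frequency shared across many crossings.
    Assumes every trip array holds vertical-acceleration samples of
    the same length, recorded at FS."""
    spectra = []
    for accel in trips:
        # Power spectral density of one crossing.
        freqs, psd = welch(accel, fs=FS, nperseg=4096)
        spectra.append(psd / psd.sum())  # normalize so no trip dominates
    mean_psd = np.mean(spectra, axis=0)  # car noise averages down here;
    return freqs[np.argmax(mean_psd)]    # the bridge's peak stays put
```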
The team picked the Golden Gate Bridge specifically because the civil structural engineering community had studied it extensively over the years, collecting a host of vibrational data with traditional sensors. When the researchers compared their app-collected frequencies with those gathered by 240 accelerometers previously placed on the Golden Gate, the results matched: the smartphone-collected data converged with the readings from the bridge’s sensors and proved just as good as those from industry devices.
The study authors estimate that officials could use crowdsourced data to make key improvements that would help new bridges to last about 14 years longer.
The team also tested their method on a different type of bridge: not a suspension bridge like the Golden Gate, but a concrete span bridge in Ciampino, Italy. There they compared 280 car trips over the bridge with data from six sensors that had been placed on it for seven months. The results matched slightly less closely, but a larger volume of trips would fix the divergence, the researchers wrote in their study, “Crowdsourcing bridge dynamic monitoring with smartphone vehicle trips,” published last month in Nature Communications Engineering.
Although the smartphones proved effective, the app is not quite ready to be rolled out commercially for people to start using. “It is still a pilot version,” so there’s room for improvement, says Ratti, who co-authored the study. “But on a more optimistic note, it has really low barriers to entry—all you need is smartphones on cars—so that makes the system easy to reach a global audience.” And the study authors estimate that the use of crowdsourced data would result in a new bridge lasting about 14 years longer.
Matarazzo hopes that the app could be eventually accessible for your average citizen scientist to collect the data and supply it to their local transportation authorities. “I hope that this idea can spark a different type of relationship with infrastructure where people think about the data they're collecting as some type of contribution or investment into their communities,” he says. “So that they can help their own department of transportation, their own municipality to support that bridge and keep it maintained better, longer and safer.”
Lina Zeldovich has written about science, medicine and technology for Popular Science, Smithsonian, National Geographic, Scientific American, Reader’s Digest, the New York Times and other major national and international publications. A Columbia J-School alumna, she has won several awards for her stories, including the ASJA Crisis Coverage Award for Covid reporting, and has been a contributing editor at Nautilus Magazine. In 2021, Zeldovich released her first book, The Other Dark Matter, published by the University of Chicago Press, about the science and business of turning waste into wealth and health. You can find her on http://linazeldovich.com/ and @linazeldovich.
The Friday Five: Sugar could help catch cancer early
The Friday Five covers five stories in research that you may have missed this week. There are plenty of controversies and troubling ethical issues in science – and we get into many of them in our online magazine – but this news roundup focuses on scientific creativity and progress to give you a therapeutic dose of inspiration headed into the weekend.
Listen on Apple | Listen on Spotify | Listen on Stitcher | Listen on Amazon | Listen on Google
Here are the promising studies covered in this week's Friday Five:
- Catching cancer early could depend on sugar
- How to boost memory in a flash
- This is your brain on books
- A tiny sandwich cake could help the heart
- Meet the top banana for fighting Covid variants