New device finds breast cancer the way engineers detect earthquakes
Mammograms are necessary breast cancer checks for women once they reach the recommended screening age, between 40 and 50 years. Yet, many find the procedure uncomfortable. “I have large breasts, and to be able to image the full breast, the radiographer had to manipulate my breast within the machine, which took time and was quite uncomfortable,” recalls Angela, who preferred not to disclose her last name.
Breast cancer is the most widespread cancer in the world, affecting 2.3 million women in 2020. Screening exams such as mammograms can help find breast cancer early, leading to timely diagnosis and treatment. If this type of cancer is detected before the disease has spread, the 5-year survival rate is 99 percent. But some women forgo mammograms due to concerns about radiation or painful compression of breasts. Other issues, such as low income and a lack of access to healthcare, can also serve as barriers, especially for underserved populations.
Researchers at the University of Canterbury and startup Tiro Medical in Christchurch, New Zealand are hoping their new device—which doesn’t involve any radiation or compression of the breasts—could increase the accuracy of breast cancer screening, broaden access and encourage more women to get checked. They’re digging into clues from the way buildings move in an earthquake to help detect more cases of this disease.
Earthquake engineering inspires new breast cancer screening tech
What’s underneath a surface affects how it vibrates. Earthquake engineers look at the vibrations of swaying buildings to identify the underlying soil properties. “As the vibration wave travels, it reflects the stiffness of the material between that wave and the surface,” says Geoff Chase, professor of engineering at the University of Canterbury in Christchurch, New Zealand.
Chase is applying this same concept to breasts. Analyzing the surface motion of the breast as it vibrates could reveal the stiffness of the tissues underneath. Regions of high stiffness could point to cancer, given that cancerous breast tissue can be up to 20 times stiffer than normal tissue. “If in essence every woman’s breast is soft soil, then if you have some granite rocks in there, we’re going to see that on the surface,” explains Chase.
The earthquake-inspired device exceeds the 87 percent sensitivity of a 3D mammogram.
That notion underpins a new breast screening device, the brainchild of Chase. Women lie face down, with their breast being screened inside a circular hole and the nipple resting on a small disc called an actuator. The actuator moves up and down, between one and two millimeters, so there’s a small vibration, “almost like having your phone vibrate on your nipple,” says Jessica Fitzjohn, a postdoctoral fellow at the University of Canterbury who collaborated on the device design with Chase.
Cameras surrounding the device take photos of the breast surface motion as it vibrates. The photos are fed into image processing algorithms that convert them into data points. Then, diagnostic algorithms analyze those data points to find any differences in the breast tissue. “We’re looking for that stiffness contrast which could indicate a tumor,” Fitzjohn says.
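The team’s actual algorithms aren’t published in detail here, but the core idea, flagging regions whose surface vibration differs from the rest of the breast, can be sketched. In this toy Python example (the region layout, the amplitude measure and the threshold are illustrative assumptions, not Tiro Medical’s method), a stiff inclusion is assumed to damp the surface motion above it:

```python
import numpy as np

def stiffness_contrast(displacements, threshold=0.5):
    """Flag regions whose vibration amplitude deviates from the rest.

    displacements: array of shape (n_regions, n_frames) holding the tracked
    surface displacement of each breast-surface region across camera frames.
    Assumption: a stiff inclusion damps the surface motion above it.
    """
    # Peak-to-peak vibration amplitude of each region
    amplitude = displacements.max(axis=1) - displacements.min(axis=1)
    baseline = np.median(amplitude)
    # How far each region's amplitude falls below the typical (median) motion
    contrast = (baseline - amplitude) / baseline
    return contrast, contrast > threshold

# Six regions vibrating at 5 Hz; region 3 is damped as if over stiff tissue
t = np.linspace(0.0, 1.0, 200)
motion = np.outer(np.ones(6), np.sin(2 * np.pi * 5 * t))
motion[3] *= 0.2
contrast, flagged = stiffness_contrast(motion)
print(np.flatnonzero(flagged))  # → [3]
```

The damped region stands out because its peak-to-peak motion falls well below the median of its neighbors; a real system would work from camera-tracked surface points rather than simulated sinusoids.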
A nascent yet promising technology
The device has been tested in a clinical trial of 14 women: one with healthy breasts and 13 with a tumor in one breast. The cohort was small but diverse, varying in age, breast volume and tumor size.
Results from the trial yielded a sensitivity rate, or the likelihood of correctly detecting breast cancer, of 85 percent. Meanwhile, the device’s specificity rate, or the probability of correctly identifying healthy breasts as cancer-free, was 77 percent. By combining and optimizing certain diagnostic algorithms, the device reached between 92 and 100 percent sensitivity and between 80 and 86 percent specificity, which is comparable to the latest 3D mammogram technology. Called tomosynthesis, these 3D mammograms take a number of sharper, clearer and more detailed images compared to the single 2D image of a conventional mammogram; they have a sensitivity of 87 percent and a specificity of 92 percent. Although the earthquake-inspired device’s specificity is lower, its sensitivity exceeds that of a 3D mammogram.
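Sensitivity and specificity follow directly from counts of true and false results. A minimal Python sketch (the counts are illustrative round numbers, not the trial’s actual tallies):

```python
def screening_rates(tp, fn, tn, fp):
    """Sensitivity and specificity from a screening confusion matrix.

    tp: cancers correctly flagged     fn: cancers missed
    tn: healthy cases cleared         fp: healthy cases wrongly flagged
    """
    sensitivity = tp / (tp + fn)  # share of cancers the test catches
    specificity = tn / (tn + fp)  # share of healthy cases the test clears
    return sensitivity, specificity

# Illustrative: 85 of 100 cancers caught, 77 of 100 healthy cases cleared
print(screening_rates(85, 15, 77, 23))  # → (0.85, 0.77)
```

The trade-off the article describes is visible here: tuning an algorithm to flag more cancers (raising sensitivity) typically flags more healthy cases too (lowering specificity).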
The team hopes that cameras with better resolution can help improve the numbers. And with a limited amount of data in the first trial, the researchers are looking into funding for another clinical trial to validate their results on a larger cohort size.
Additionally, during the trial, the device correctly identified one woman’s breast as healthy, while her prior mammogram had given a false positive. It was also able to capture the smallest tumor in the trial, at 7 millimeters, around a third of an inch or half as long as an aspirin tablet.
Diagnostic findings from the device are immediate.
When using the earthquake-inspired device, women lie face down, with their breast being screened inside circular holes.
University of Canterbury.
But more testing is needed to “prove the device’s ability to pick up small breast cancers less than 10 to 15 millimeters in size, as we know that finding cancers when they are small is the best way of improving outcomes,” says Richard Annand, a radiologist at Pacific Radiology in New Zealand. He explains that mammography already detects most precancerous lesions, so if the device will only be able to find large masses or lumps it won’t be particularly useful. While not directly involved in administering the clinical trial for the device, Annand was a director at the time for Canterbury Breastcare, where the trial occurred.
Meanwhile, Monique Gary, a breast surgical oncologist and medical director of the Grand View Health Cancer program in Pennsylvania, U.S., is excited to see new technologies advancing breast cancer screening and early detection. But she notes that the device may be challenging for “patients who are unable to lay prone, such as pregnant women as well as those who are differently abled, and this machine might exclude them.” She adds that it would also be interesting to explore how breast implants would impact the device’s vibrational frequency.
Diagnostic findings from the device are immediate, with the results available “before you put your clothes back on,” Chase says. The absence of any radiation is another benefit, though Annand considers it a minor edge “as we know the radiation dose used in mammography is minimal, and the advantages of having a mammogram far outweigh the potential risk of radiation.”
The researchers also conducted a separate ergonomic trial with 40 women to assess the device’s comfort, safety and ease of use. Angela was part of that trial and described the experience as “easy, quick, painless and required no manual intervention from an operator.” And if a person is uncomfortable being topless or having their breasts touched by someone else, “this type of device would make them more comfortable and less exposed,” she says.
While mammograms remain “the ‘gold standard’ in breast imaging,” physicians need an option that can be used in combination with mammography.
Fitzjohn acknowledges that “at the moment, it’s quite a crude prototype—it’s just a block that you lie on.” The team prioritized function over form initially, but they’re now planning a few design improvements, including more cushioning for the breasts and the surface the women lie on.
While mammograms remain “the ‘gold standard’ in breast imaging, particularly screening, physicians need an option that is good at excluding breast cancer when used in combination with mammography, has good availability, is easy to use and is affordable. There is the possibility that the device could fill this role,” Annand says.
Indeed, the researchers envision their new breast screening device as complementary to mammograms—a prescreening tool that could make breast cancer checks widely available. As the device is portable and doesn’t require specialized knowledge to operate, it can be used in clinics, pop-up screening facilities and rural communities. “If it was easily accessible, particularly as part of a checkup with a [general practitioner] or done in a practice the patient is familiar with, it may encourage more women to access this service,” Angela says. For those who find regular mammograms uncomfortable or can’t afford them, the earthquake-inspired device may be a better option.
Broadening access could prompt more women to go for screenings, particularly younger women at higher risk of getting breast cancer because of a family history of the disease or specific gene mutations. “If we can provide an option for them then we can catch those cancers earlier,” Fitzjohn says. “By taking screening to people, we’re increasing patient-centric care.”
With the team aiming to lower the device’s cost to between one-fifth and one-eighth that of mammography equipment, it would also be valuable for low-to-middle-income nations that struggle to afford the infrastructure for mammograms or may not have enough skilled radiologists.
For Fitzjohn, the ultimate goal is to “increase equity in breast screening and catch cancer early so we have better outcomes for women who are diagnosed with breast cancer.”
In today’s podcast episode, Leaps.org Deputy Editor Lina Zeldovich speaks about the health and ecological benefits of farming crickets for human consumption with Bicky Nguyen, who joins Lina from Vietnam. Bicky and her business partner Nam Dang operate an insect farm named CricketOne. Motivated by the idea of sustainable and healthy protein production, they started their unconventional endeavor a few years ago, despite numerous naysayers who didn’t believe that humans would ever consider munching on bugs.
Yet, making creepy crawlers part of our diet offers many health and planetary advantages. Food production needs to match the rise in global population, estimated to reach 10 billion by 2050. One challenge is that some of our current practices are inefficient, polluting and wasteful. According to nonprofit EarthSave.org, it takes 2,500 gallons of water, 12 pounds of grain, 35 pounds of topsoil and the energy equivalent of one gallon of gasoline to produce one pound of feedlot beef, although exact statistics vary between sources.
Meanwhile, insects are easy to grow, high in protein and low in fat. When roasted with salt, they make crunchy snacks. When chopped up, they transform into delicious pâtés, says Bicky, who invents her own cricket recipes and serves them at industry and public events. Maybe that’s why some research predicts that the edible insect market may grow to almost $10 billion by 2030. Tune in for a delectable chat on this alternative and sustainable protein.
Listen on Apple | Listen on Spotify | Listen on Stitcher | Listen on Amazon | Listen on Google
Further reading:
More info on Bicky Nguyen
https://yseali.fulbright.edu.vn/en/faculty/bicky-n...
The environmental footprint of beef production
https://www.earthsave.org/environment.htm
https://www.watercalculator.org/news/articles/beef-king-big-water-footprints/
https://www.frontiersin.org/articles/10.3389/fsufs.2019.00005/full
https://ourworldindata.org/carbon-footprint-food-methane
Insect farming as a source of sustainable protein
https://www.insectgourmet.com/insect-farming-growing-bugs-for-protein/
https://www.sciencedirect.com/topics/agricultural-and-biological-sciences/insect-farming
Cricket flour is taking the world by storm
https://www.cricketflours.com/
https://talk-commerce.com/blog/what-brands-use-cricket-flour-and-why/
Lina Zeldovich has written about science, medicine and technology for Popular Science, Smithsonian, National Geographic, Scientific American, Reader’s Digest, the New York Times and other major national and international publications. A Columbia J-School alumna, she has won several awards for her stories, including the ASJA Crisis Coverage Award for Covid reporting, and has been a contributing editor at Nautilus Magazine. In 2021, Zeldovich released her first book, The Other Dark Matter, published by the University of Chicago Press, about the science and business of turning waste into wealth and health. You can find her on http://linazeldovich.com/ and @linazeldovich.
Autonomous, indoor farming gives a boost to crops
The glass-encased cabinet looks like a display meant to hold reasonably priced watches, or drugstore beauty creams shipped from France. But instead of this stagnant merchandise, each of its five shelves is overgrown with leaves — moss-soft pea sprouts, spikes of Lollo rosso lettuces, pale bok choy, dark kale, purple basil or red-veined sorrel or green wisps of dill. The glass structure isn’t a cabinet, but rather a “micro farm.”
The gadget is on display at the Richmond, Virginia headquarters of Babylon Micro-Farms, a company that aims to make indoor farming in the U.S. more accessible and sustainable. Babylon’s soilless hydroponic growing system, which feeds plants via nutrient-enriched water, allows chefs on cruise ships, in cafeterias and elsewhere to provide home-grown produce to patrons, just seconds after it’s harvested. Currently, there are over 200 functioning systems, either sold or leased to customers, and more of them are on the way.
The chef-farmers choose from among 45 types of herb and leafy-greens seeds, plop them into grow trays, and a few weeks later they pick and serve. While success depends on at least a small amount of care from these humans, the systems are autonomously surveilled round-the-clock from Babylon’s base of operations. And artificial intelligence is helping to run the show.
Babylon piloted the use of specialized cameras that take pictures in different spectrums to gather some less-obvious visual data about plants’ wellbeing and alert people if something seems off.
Imagine consistently perfect greens and tomatoes and strawberries, grown hyper-locally, using less water, without chemicals or environmental contaminants. This is the hefty promise of controlled environment agriculture (CEA) — basically, indoor farms that can be hydroponic, aeroponic (plant roots are suspended and fed through misting), or aquaponic (where fish play a role in fertilizing vegetables). But whether they grow 4,160 leafy-green servings per year, like one Babylon farm, or millions of servings, like some of the large, centralized facilities starting to supply supermarkets across the U.S., they seek to minimize failure as much as possible.
Babylon’s soilless hydroponic growing system
Courtesy Babylon Micro-Farms
Here, AI is starting to play a pivotal role. CEA growers use it to help “make sense of what’s happening” to the plants in their care, says Scott Lowman, vice president of applied research at the Institute for Advanced Learning and Research (IALR) in Virginia, a state that’s investing heavily in CEA companies. And although these companies say they’re not aiming for a future with zero human employees, AI is certainly poised to take a lot of human farming intervention out of the equation — for better and worse.
Most of these companies are compiling their own data sets to identify anything that might block the success of their systems. Babylon had already integrated sensor data into its farms to measure heat and humidity, the nutrient content of water, and the amount of light plants receive. Last year, they got a National Science Foundation grant that allowed them to pilot the use of specialized cameras that take pictures in different spectrums to gather some less-obvious visual data about plants’ wellbeing and alert people if something seems off. “Will this plant be healthy tomorrow? Are there things…that the human eye can't see that the plant starts expressing?” says Amandeep Ratte, the company’s head of data science. “If our system can say, Hey, this plant is unhealthy, we can reach out to [users] preemptively about what they’re doing wrong, or is there a disease at the farm?” Ratte says. The earlier the better, to avoid crop failures.
Natural light accounts for 70 percent of Greenswell Growers’ energy use on a sunny day.
Courtesy Greenswell Growers
IALR’s Lowman says that other CEA companies are developing their AI systems to account for the different crops they grow — lettuces come in all shapes and sizes, after all, and each has different growing needs than, for example, tomatoes. The ways they run their operations differ also. Babylon is unusual in its decentralized structure. But centralized growing systems with one main location have variabilities, too. AeroFarms, which recently declared bankruptcy but will continue to run its 140,000-square-foot vertical operation in Danville, Virginia, is entirely enclosed and reliant on the intense violet glow of grow lights to produce microgreens.
Different companies have different data needs. What data is essential to AeroFarms isn’t quite the same as for Greenswell Growers, located in Goochland County, Virginia. Greenswell raises four kinds of lettuce in a 77,000-square-foot automated hydroponic greenhouse, so the vagaries of naturally available light, which accounts for 70 percent of the company’s energy use on a sunny day, affect operations. Their tech needs to account for “outside weather impacts,” says president Carl Gupton. “What adjustments do we have to make inside of the greenhouse to offset what's going on outside environmentally, to give that plant optimal conditions? When it's 85 percent humidity outside, the system needs to do X, Y and Z to get the conditions that we want inside.”
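Gupton doesn’t spell out the “X, Y and Z,” but the kind of adjustment he describes can be sketched as a simple proportional controller. This is a hedged illustration only; the target, gain and function names are assumptions, not Greenswell’s actual control logic:

```python
def humidity_adjustment(inside_rh, target_rh=65.0, gain=2.0):
    """Proportional-control sketch for a greenhouse humidity loop.

    Returns a signed effort value: positive means increase venting or
    dehumidification, negative means add misting. All numbers here are
    illustrative assumptions.
    """
    error = inside_rh - target_rh  # positive when the air is too humid
    return gain * error

# Outside weather has pushed the greenhouse to 85% RH against a 65% target
print(humidity_adjustment(85.0))  # → 40.0
```

Real greenhouse controllers layer many such loops (temperature, light, CO2, nutrients) and increasingly let AI tune their setpoints, which is where the data sets described below come in.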
AI will help identify diseases, as well as when a plant is thirsty or overly hydrated, when it needs more or less calcium, phosphorus, nitrogen.
Nevertheless, every CEA system has the same core needs — consistent yield of high quality crops to keep up year-round supply to customers. Additionally, “Everybody’s got the same set of problems,” Gupton says. Pests may come into a facility with seeds. A disease called pythium, one of the most common in CEA, can damage plant roots. “Then you have root disease pressures that can also come internally — a change in [growing] substrate can change the way the plant performs,” Gupton says.
AI will help identify diseases, as well as when a plant is thirsty or overly hydrated, when it needs more or less calcium, phosphorus, nitrogen. So, while companies amass their own hyper-specific data sets, Lowman foresees a time within the next decade “when there will be some type of [open-source] database that has the most common types of plant stress identified” that growers will be able to tap into. Such databases will “create a community and move the science forward,” says Lowman.
In fact, IALR is working on assembling images for just such a database now. On so-called “smart tables” inside an Institute lab, a team is growing greens and subjecting them to various stressors. Then, they administer treatments while taking images of every plant every 15 minutes, says Lowman. Some experiments generate 80,000 images; the challenge lies in analyzing and annotating the vast trove, marking each image to reflect the outcome, for example, an increase in phosphate delivery and the plant’s response to it. Eventually, the images will be fed into AI systems to help them learn.
For all the enthusiasm surrounding this technology, it’s not without downsides. Training just one AI system can emit over 250,000 pounds of carbon dioxide, according to MIT Technology Review. AI could also be used “to enhance environmental benefit for CEA and optimize [its] energy consumption,” says Rozita Dara, a computer science professor at the University of Guelph in Canada, specializing in AI and data governance, “but we first need to collect data to measure [it].”
The chef-farmers can choose from 45 types of herb and leafy-greens seeds.
Courtesy Babylon Micro-Farms
Any system connected to the Internet of Things is also vulnerable to hacking; if CEA grows to the point where “there are many of these similar farms, and you're depending on feeding a population based on those, it would be quite scary,” Dara says. And there are privacy concerns, too, in systems where imaging is happening constantly. It’s partly for this reason, says Babylon’s Ratte, that the company’s in-farm cameras all “face down into the trays, so the only thing [visible] is pictures of plants.”
Tweaks to improve AI for CEA are happening all the time. Greenswell made its first harvest in 2022 and now has annual data points they can use to start making more intelligent choices about how to feed, water, and supply light to plants, says Gupton. Ratte says he’s confident Babylon’s system can already “get our customers reliable harvests. But in terms of how far we have to go, it's a different problem,” he says. For example, if AI could detect whether the farm is mostly empty—meaning the farm’s user hasn’t planted a new crop of greens—it can alert Babylon to check “what's going on with engagement with this user?” Ratte says. “Do they need more training? Did the main person responsible for the farm quit?”
Lowman says more automation is coming, offering greater ability for systems to identify problems and mitigate them on the spot. “We still have to develop datasets that are specific, so you can have a very clear control plan, [because] artificial intelligence is only as smart as what we tell it, and in plant science, there's so much variation,” he says. He believes AI’s next level will be “looking at those first early days of plant growth: when the seed germinates, how fast it germinates, what it looks like when it germinates.” Imaging all that and pairing it with AI, “can be a really powerful tool, for sure.”