Food Poisoning Outbreaks Are Still A Problem. Powerful Tech Is Fighting Back.
With the pandemic at the forefront of everyone's minds, many people have wondered if food could be a source of coronavirus transmission. Luckily, that "seems unlikely," according to the CDC, but foodborne illnesses still sicken a whopping 48 million people in the United States every year.
In normal times, when there isn't a historic global health crisis infecting millions and affecting the lives of billions, foodborne outbreaks are real and frightening, potentially deadly, and can cause widespread fear of particular foods. Think of romaine lettuce spreading E. coli last year—an outbreak that sickened 167 people—or peanut butter spreading salmonella in 2008 and 2009, which sickened more than 700 people and killed nine.
The technologies available to detect and prevent the next foodborne disease outbreak have improved greatly over the past 30-plus years, particularly during the past decade, and better, more nimble technologies are being developed, according to experts in government, academia, and private industry. The key to advancing detection of harmful foodborne pathogens, they say, is increasing the speed, portability, and precision of that detection.
Getting to Rapid Results
Researchers at Purdue University have recently developed a lateral flow assay that, with the help of a laser, can detect toxins and pathogenic E. coli. Lateral flow assays are cheap and easy to use; a good example is a home pregnancy test. You place a liquid or liquefied sample on a piece of paper designed to detect a single substance and soon after you get the results in the form of a colored line: yes or no.
"They're a great portable tool for us for food contaminant detection," says Carmen Gondhalekar, a fifth-year biomedical engineering graduate student at Purdue. "But one of the areas where paper-based lateral flow assays could use improvement is in multiplexing capability and their sensitivity."
J. Paul Robinson, a professor in Purdue's Colleges of Veterinary Medicine and Engineering, and Gondhalekar's advisor, agrees. "One of the fundamental problems that we have in detection is that it is hard to identify pathogens in complex samples," he says.
When it comes to foodborne disease outbreaks, you don't always know what substance you're looking for, so an assay made to detect only a single substance isn't always effective. The goal of the project at Purdue is to make assays that can detect multiple substances at once.
These assays would be more complex than a pregnancy test. As detailed in Gondhalekar's recent paper, a laser pulse helps create a spectral signal from the sample on the assay paper, and the spectral signal is then used to determine if any unique wavelengths associated with one of several toxins or pathogens are present in the sample. Though the handheld technology has yet to be built, the idea is that the results would be given on the spot. So someone in the field trying to track the source of a Salmonella infection could, for instance, put a suspected lettuce sample on the assay and see if it has the pathogen on it.
"What our technology is designed to do is to give you a rapid assessment of the sample," says Robinson. "The goal here is speed."
Seeing the Pathogen in "High-Def"
"One in six Americans will get a foodborne illness every year," according to Dr. Heather Carleton, a microbiologist at the Centers for Disease Control and Prevention's Enteric Diseases Laboratory Branch. But not every foodborne outbreak makes the news. In 2017 alone, the CDC monitored between 18 and 37 foodborne poison clusters per week and investigated 200 multi-state clusters. Hardboiled eggs, ground beef, chopped salad kits, raw oysters, frozen tuna, and pre-cut melon are just a taste of the foods that were investigated last year for different strains of listeria, salmonella, and E. coli.
At the heart of the CDC investigations is PulseNet, a national network of laboratories that uses DNA fingerprinting to detect outbreaks at local and regional levels. This is how it works: When a patient gets sick—with symptoms like vomiting and fever, for instance—they will go to a hospital or clinic for treatment. Since we're talking about foodborne illnesses, a clinician will likely take a stool sample from the patient and send it off to a laboratory to see if it contains a foodborne pathogen, like salmonella or E. coli. If it does contain a potentially harmful pathogen, a bacterial isolate of that identified sample is sent to a regional public health lab so that whole genome sequencing can be performed.
Whole genome sequencing is a method for reading the entire genome of a bacterial isolate (or from any organism, for that matter). Instead of working with a couple dozen data points, now you're working with millions of base pairs. Carleton likes to describe it as "going from an eight-bit image—maybe like what you would see in Minecraft—to a high definition image," she says. "It's really an evolution of how we detect foodborne illnesses and identify outbreaks."
If the bacterial isolate matches another in the CDC's database, this means there could be a potential outbreak and an investigation may be started, with the goal of tracking the pathogen to its source.
Whole genome sequencing has been a relatively recent shift in foodborne disease detection. For more than 20 years, the standard technique for analyzing pathogens in foodborne disease outbreaks was pulsed-field gel electrophoresis. This method creates a DNA fingerprint for each sample in the form of a pattern of about 15-30 "bands," with each band representing a piece of DNA. Researchers like Carleton can use this fingerprint to see if two samples are from the same bacteria. The problem is that 15-30 bands are not enough to differentiate all isolates. Some isolates whose bands look very similar may actually come from different sources and some whose bands look different may be from the same source. But if you can see the entire DNA fingerprint, then you don't have that issue. That's where whole genome sequencing comes in.
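A toy example makes the limitation concrete. In the sketch below, two hypothetical isolates share an identical band pattern yet differ at two positions in their genomes; the band sizes and sequences are invented and drastically shortened, and real whole genome comparisons involve millions of base pairs and far more sophisticated analysis than this position-by-position count.

```python
# Toy comparison (not PulseNet's actual analysis): two isolates can share a
# coarse PFGE-style band pattern yet still differ at the genome level.

def snp_distance(genome_a, genome_b):
    """Count positions where two equal-length sequences differ."""
    return sum(1 for a, b in zip(genome_a, genome_b) if a != b)


# Hypothetical PFGE fingerprints: a list of band sizes (in kilobases).
bands_isolate_1 = [20, 54, 97, 145, 210, 290, 340, 415, 500, 610, 700, 825, 910, 1005, 1100]
bands_isolate_2 = [20, 54, 97, 145, 210, 290, 340, 415, 500, 610, 700, 825, 910, 1005, 1100]

# Hypothetical short fragments standing in for millions of base pairs.
genome_isolate_1 = "ATGGCGTACGTTAGCCTAGGATCCGTAACGTA"
genome_isolate_2 = "ATGGCGTACGATAGCCTAGGATCTGTAACGTA"

print(bands_isolate_1 == bands_isolate_2)                 # True: the bands can't tell them apart
print(snp_distance(genome_isolate_1, genome_isolate_2))   # 2: the genomes can
```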
Although the PulseNet team had piloted whole genome sequencing as early as 2013, it wasn't until July of last year that the transition to using whole genome sequencing for all pathogens was complete. Though whole genome sequencing requires far more computing power to generate, analyze, and compare those millions of data points, the payoff is huge.
Stopping Outbreaks Sooner
The U.S. Food and Drug Administration (FDA) acquired its first whole genome sequencers in 2008, according to Dr. Eric Brown, the Director of the Division of Microbiology in the FDA's Office of Regulatory Science. Since then, through its GenomeTrakr program, a network of more than 60 domestic and international labs, the FDA has sequenced and publicly shared more than 400,000 isolates. "The impact of what whole genome sequencing could do to resolve a foodborne outbreak event was no less impactful than when NASA turned on the Hubble Telescope for the first time," says Brown.
Whole genome sequencing has helped identify strains of Salmonella that prior methods were unable to differentiate. In fact, whole genome sequencing can differentiate "virtually all" strains of foodborne pathogens, no matter the species, according to the FDA. This means it takes fewer clinical cases—fewer sick people—to detect and end an outbreak.
And perhaps the largest benefit of whole genome sequencing is that these detailed sequences—the millions of base pairs—can imply geographic location. The genomic information of bacterial strains can be different depending on the area of the country, helping these public health agencies eventually track the source of outbreaks—a restaurant, a farm, a food-processing center.
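Conceptually, that tracing works like a nearest-match lookup: compare the outbreak strain against previously sequenced isolates whose collection sites are known, and see which it most closely resembles. The sketch below shows only the bare idea; the sequences, place names, and distance measure are invented for illustration, and real source tracking relies on large shared databases such as GenomeTrakr and full phylogenetic analysis rather than a simple mismatch count.

```python
# Illustrative nearest-match lookup (not the FDA's or CDC's actual method):
# find the geographically labeled reference isolate most similar to an
# outbreak strain.

def snp_distance(a, b):
    """Count differing positions between two equal-length sequences."""
    return sum(1 for x, y in zip(a, b) if x != y)


# Hypothetical reference isolates with the region where each was collected.
REFERENCE_ISOLATES = {
    "farm_A_2019":   ("ATGGCGTACGTTAGCC", "Central Valley, CA"),
    "packer_B_2018": ("ATGTCGTACGTTAGCC", "Yuma, AZ"),
    "pond_C_2019":   ("ATGGCGTTCGTTAACC", "Salinas, CA"),
}


def likely_source(outbreak_genome):
    """Return the reference isolate (and its region) with the fewest differences."""
    return min(
        REFERENCE_ISOLATES.items(),
        key=lambda item: snp_distance(outbreak_genome, item[1][0]),
    )


if __name__ == "__main__":
    outbreak = "ATGGCGTACGTTATCC"
    name, (genome, region) = likely_source(outbreak)
    print(name, region, snp_distance(outbreak, genome))  # farm_A_2019 Central Valley, CA 1
```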
Coming Soon: "Lab in a Backpack"
Now that whole genome sequencing has become the go-to technology for analyzing foodborne pathogens, the next step is making the process nimbler and more portable—putting "the lab in a backpack," as Brown says.
The CDC's Carleton agrees. "Right now, the sequencer we use is a fairly big box that weighs about 60 pounds," she says. "We can't take it into the field."
A company called Oxford Nanopore Technologies is developing handheld sequencers. Their devices are meant to "enable the sequencing of anything by anyone anywhere," according to Dan Turner, the VP of Applications at Oxford Nanopore.
"The sooner that we can see linkages…the sooner the FDA gets in action to mitigate the problem and put in some kind of preventative control."
"Right now, sequencing is very much something that is done by people in white coats in laboratories that are set up for that purpose," says Turner. Oxford Nanopore would like to create a new, democratized paradigm.
The FDA is currently testing these types of portable sequencers. "We're very excited about it. We've done some pilots, to be able to do that sequencing in the field. To actually do it at a pond, at a river, at a canal. To do it on site right there," says Brown. "This, of course, is huge because it means we can have real-time sequencing capability to stay in step with an actual laboratory investigation in the field."
"The timeliness of this information is critical," says Marc Allard, a senior biomedical research officer and Brown's colleague at the FDA. "The sooner that we can see linkages…the sooner the FDA gets in action to mitigate the problem and put in some kind of preventative control."
At the moment, the world is rightly focused on COVID-19. But as the danger of one virus subsides, it's only a matter of time before another pathogen strikes. Hopefully, with new and advancing technology like whole genome sequencing, we can stop the next deadly outbreak before it really gets going.
Is there a robot nanny in your child's future?
From ROBOTS AND THE PEOPLE WHO LOVE THEM: Holding on to Our Humanity in an Age of Social Robots by Eve Herold. Copyright © 2024 by the author and reprinted by permission of St. Martin’s Publishing Group.
Could the use of robots take some of the workload off teachers, add engagement among students, and ultimately invigorate learning by taking it to a new level that is more consonant with the everyday experiences of young people? Do robots have the potential to become full-fledged educators and further push human teachers out of the profession? The preponderance of opinion on this subject is that, just as AI and medical technology are not going to eliminate doctors, robot teachers will never replace human teachers. Rather, they will change the job of teaching.
A 2017 study led by Google executive James Manyika suggested that skills like creativity, emotional intelligence, and communication will always be needed in the classroom and that robots aren’t likely to provide them at the same level that humans naturally do. But robot teachers do bring advantages, such as a depth of subject knowledge that teachers can’t match, and they’re great for student engagement.
The teacher and robot can complement each other in new ways, with the teacher facilitating interactions between robots and students. So far, this is the case with teaching “assistants” being adopted now in China, Japan, the U.S., and Europe. In this scenario, the robot (usually the SoftBank child-size robot NAO) is a tool for teaching mainly science, technology, engineering, and math (the STEM subjects), but the teacher is very involved in planning, overseeing, and evaluating progress. The students get an entertaining and enriched learning experience, and some of the teaching load is taken off the teacher. At least, that’s what researchers have been able to observe so far.
To be sure, there are some powerful arguments for having robots in the classroom. A not-to-be-underestimated one is that robots “speak the language” of today’s children, who have been steeped in technology since birth. These children are adept at navigating a media-rich environment that is highly visual and interactive. They are plugged into the Internet 24-7. They consume music, games, and huge numbers of videos on a weekly basis. They expect to be dazzled because they are used to being dazzled by more and more spectacular displays of digital artistry. Education has to compete with social media and the entertainment vehicles of students’ everyday lives.
Another compelling argument for teaching robots is that they help prepare students for the technological realities they will encounter in the real world, where robots will be ubiquitous. From childhood on, they will be interacting and collaborating with robots in every sphere of their lives, from the jobs they do to dealing with retail robots and helper robots in the home. Including robots in the classroom is one way of making sure that children of all socioeconomic backgrounds will be better prepared for a highly automated age, when successfully using robots will be as essential as reading and writing. We’ve already crossed this threshold with computers and smartphones.
Students need multimedia entertainment with their teaching. This is something robots can provide through their ability to connect to the Internet and act as a centralized host for videos, music, and games. Children also need interaction, something robots can deliver up to a point, but which humans can surpass. The education of a child is not just intended to make them technologically functional in a wired world; it’s meant to help them grow in intellectual, creative, social, and emotional ways. When education is viewed from this perspective, it opens the door to questions about just how far robots should go. Robots don’t just teach and engage children; they’re designed to tug at their heartstrings.
It’s no coincidence that many toy makers and manufacturers are designing cute robots that look and behave like real children or animals, says MIT professor Sherry Turkle. “When they make eye contact and gesture toward us, they predispose us to view them as thinking and caring,” she has written in The Washington Post. “They are designed to be cute, to provide a nurturing response” from the child. As mentioned previously, this nurturing experience is a powerful vehicle for drawing children in and promoting strong attachment. But should children really love their robots?
The problem, once again, is that a child can be lulled into thinking that she’s in an actual relationship, when a robot can’t possibly love her back. If adults have these vulnerabilities, what might such asymmetrical relationships do to the emotional development of a small child? Turkle notes that while we tend to ascribe a mind and emotions to a socially interactive robot, “simulated thinking may be thinking, but simulated feeling is never feeling, and simulated love is never love.”
Always a consideration is the fact that in the first few years of life, a child’s brain is undergoing rapid growth and development that will form the foundation of their lifelong emotional health. These formative experiences are literally shaping the child’s brain, their expectations, and their view of the world and their place in it. In Alone Together, Turkle asks: What are we saying to children about their importance to us when we’re willing to outsource their care to a robot? A child might be superficially entertained by the robot while his self-esteem is systematically undermined.
Still, in the case of robot nannies in the home, is active, playful engagement with a robot for a few hours a day any more harmful than several hours in front of a TV or with an iPad? Some, like Xiong, regard interacting with a robot as better than mere passive entertainment. iPal’s manufacturers say that their robot can’t replace parents or teachers and is best used by three- to eight-year-olds after school, while they wait for their parents to get off work. But as robots become ever-more sophisticated, they’re expected to perform more of the tasks of day-to-day care and to be much more emotionally advanced. There is no question children will form deep attachments to some of them. And research has emerged showing that there are clear downsides to child-robot relationships.
Some studies, performed by Turkle and her MIT colleague Cynthia Breazeal, have revealed a darker side to the child-robot bond. Turkle has reported extensively on these studies in The Washington Post and in her book Alone Together. Most children love robots, but some act out their inner bully on the hapless machines, hitting and kicking them and otherwise trying to hurt them. The trouble is that the robot can’t fight back, teaching children that they can bully and abuse without consequences. As in any other robot relationship, such harmful behavior could carry over into the child’s human relationships.
And, ironically, it turns out that communicative machines don’t actually teach kids good communication skills. It’s well known that parent-child communication in the first three years of life sets the stage for a very young child’s intellectual and academic success. Verbal back-and-forth with parents and caregivers is like fuel for a child’s growing brain. One article that examined several types of play and their effect on children’s communication skills, published in JAMA Pediatrics in 2015, showed that babies who played with electronic toys—like the popular robot dog Aibo—showed a decrease in both the quantity and quality of their language skills.
Anna V. Sosa of the Child Speech and Language Lab at Northern Arizona University studied twenty-six ten- to sixteen-month-old infants to compare the growth of their language skills after they played with three types of toys: electronic toys like a baby laptop and talking farm; traditional toys like wooden puzzles and building blocks; and books read aloud by their parents. The play that produced the most growth in verbal ability was having books read to them by a caregiver, followed by play with traditional toys. Language gains after playing with electronic toys came dead last. This form of play involved the least use of adult words, the least conversational turn-taking, and the fewest verbalizations from the children. While the study sample was small, it’s not hard to extrapolate that no electronic toy, or even a more capable robot, could supply the intimate responsiveness of a parent reading stories to a child, explaining new words, answering the child’s questions, and modeling the kind of back-and-forth interaction that promotes empathy and reciprocity in relationships.
***
Most experts acknowledge that robots can be valuable educational tools. But they can’t make a child feel truly loved, validated, and valued. That’s the job of parents, and when parents abdicate this responsibility, it’s not only the child who misses out on one of life’s most profound experiences.
We really don’t know how the tech-savvy children of today will ultimately process their attachments to robots, or whether they will be excessively predisposed to choosing robot companionship over that of humans. It’s possible that their technological literacy will draw a bold line for them between real life and a quasi-imaginary history with a robot. But it will be decades before we see long-term studies culminating in sufficient data to help scientists, and the rest of us, parse out the effects of a lifetime spent with robots.
This is an excerpt from ROBOTS AND THE PEOPLE WHO LOVE THEM: Holding on to Our Humanity in an Age of Social Robots by Eve Herold. The book will be published on January 9, 2024.
Story by Big Think
In rare cases, a woman’s heart can start to fail in the months before or after giving birth. The all-important muscle weakens as its chambers enlarge, reducing the amount of blood pumped with each beat. Peripartum cardiomyopathy can threaten the lives of both mother and child. Viral illness, nutritional deficiency, the bodily stress of pregnancy, or an abnormal immune response could all play a role, but the causes aren’t concretely known.
If there is a silver lining to peripartum cardiomyopathy, it’s that it is perhaps the most survivable form of heart failure. A remarkable 50% of women recover spontaneously. And there’s an even more remarkable explanation for that glowing statistic: The fetus’ stem cells migrate to the heart and regenerate the beleaguered muscle. In essence, the developing or recently born child saves its mother’s life.
Saving mama
While this process has not been observed directly in humans, it has been witnessed in mice. In a 2015 study, researchers tracked stem cells from fetal mice as they traveled to mothers’ damaged cardiac cells and integrated themselves into hearts.
Evolutionarily, this function makes sense: It is in the fetus’ best interest that its mother remains healthy.
Scientists also have spotted cells from the fetus within the hearts of human mothers, as well as countless other places inside the body, including the skin, spleen, liver, brain, lung, kidney, thyroid, lymph nodes, salivary glands, gallbladder, and intestine. These cells essentially get everywhere. While most are eliminated by the immune system during pregnancy, some can persist for an incredibly long time — up to three decades after childbirth.
This integration of the fetus’ cells into the mother’s body has been given a name: fetal microchimerism. The process appears to start between the fourth and sixth week of gestation in humans. Scientists are actively trying to suss out its purpose. Fetal stem cells, which can differentiate into all sorts of specialized cells, appear to target areas of injury. So their role in healing seems apparent. Evolutionarily, this function makes sense: It is in the fetus’ best interest that its mother remains healthy.
Sending cells into the mother’s body may also prime her immune system to grow more tolerant of the developing fetus. Successful pregnancy requires that the immune system not see the fetus as an interloper and thus dispatch cells to attack it.
Fetal microchimerism
But fetal microchimerism might not be entirely beneficial. Greater concentrations of the cells have been associated with various autoimmune diseases such as lupus, Sjögren’s syndrome, and even multiple sclerosis. After all, they are foreign cells living in the mother’s body, so it’s possible that they might trigger subtle yet constant inflammation. Fetal cells also have been linked to cancer, although it isn’t clear whether they abet or hinder the disease.
A team of Spanish scientists summarized the apparent give and take of fetal microchimerism in a 2022 review article. “On the one hand, fetal microchimerism could be a source of progenitor cells with a beneficial effect on the mother’s health by intervening in tissue repair, angiogenesis, or neurogenesis. On the other hand, fetal microchimerism might have a detrimental function by activating the immune response and contributing to autoimmune diseases,” they wrote.
Regardless of the net effect of a fetus’ cells, their existence alone is intriguing. In a paper published earlier this year, University of London biologist Francisco Úbeda and University of Western Ontario mathematical biologist Geoff Wild noted that these cells might very well persist within mothers for life.
“Therefore, throughout their reproductive lives, mothers accumulate fetal cells from each of their past pregnancies including those resulting in miscarriages. Furthermore, mothers inherit, from their own mothers, a pool of cells contributed by all fetuses carried by their mothers, often referred to as grandmaternal microchimerism.”
So every mother may carry within her literal pieces of her ancestors.