Is there a robot nanny in your child's future?
From ROBOTS AND THE PEOPLE WHO LOVE THEM: Holding on to Our Humanity in an Age of Social Robots by Eve Herold. Copyright © 2024 by the author and reprinted by permission of St. Martin’s Publishing Group.
Could the use of robots take some of the workload off teachers, boost student engagement, and ultimately invigorate learning by taking it to a level more consonant with the everyday experiences of young people? Do robots have the potential to become full-fledged educators and further push human teachers out of the profession? The preponderance of opinion on this subject is that, just as AI and medical technology are not going to eliminate doctors, robot teachers will never replace human teachers. Rather, they will change the job of teaching.
A 2017 McKinsey Global Institute study led by James Manyika, now a Google executive, suggested that skills like creativity, emotional intelligence, and communication will always be needed in the classroom, and that robots aren’t likely to provide them at the same level that humans naturally do. But robot teachers do bring advantages, such as a depth of subject knowledge that human teachers can’t match, and they’re great for student engagement.
The teacher and robot can complement each other in new ways, with the teacher facilitating interactions between robots and students. So far, this is how the robot teaching “assistants” now being adopted in China, Japan, the U.S., and Europe are used. In this scenario, the robot (usually NAO, SoftBank’s child-size robot) is a tool for teaching mainly science, technology, engineering, and math (the STEM subjects), while the teacher remains closely involved in planning, overseeing, and evaluating progress. The students get an entertaining and enriched learning experience, and some of the teaching load is taken off the teacher. At least, that’s what researchers have been able to observe so far.
To be sure, there are some powerful arguments for having robots in the classroom. A not-to-be-underestimated one is that robots “speak the language” of today’s children, who have been steeped in technology since birth. These children are adept at navigating a media-rich environment that is highly visual and interactive. They are plugged into the Internet 24/7. They consume music, games, and huge numbers of videos every week. They expect to be dazzled, because ever more spectacular displays of digital artistry have taught them to expect it. Education has to compete with the social media and entertainment vehicles of students’ everyday lives.
Another compelling argument for teaching robots is that they help prepare students for a world in which robots will be ubiquitous. From childhood on, today’s students will be interacting and collaborating with robots in every sphere of their lives, from the jobs they do to dealing with retail robots and helper robots in the home. Including robots in the classroom is one way of making sure that children of all socioeconomic backgrounds will be better prepared for a highly automated age, when using robots successfully will be as essential as reading and writing. We’ve already crossed this threshold with computers and smartphones.
Students now expect multimedia entertainment along with their teaching. This is something robots can provide through their ability to connect to the Internet and act as a centralized host for videos, music, and games. Children also need interaction, something robots can deliver up to a point, but which humans can surpass. The education of a child is not just intended to make them technologically functional in a wired world; it’s to help them grow in intellectual, creative, social, and emotional ways. Viewed from this perspective, the question becomes just how far robots should go. Robots don’t just teach and engage children; they’re designed to tug at their heartstrings.
It’s no coincidence that many toy makers and manufacturers are designing cute robots that look and behave like real children or animals, says MIT professor Sherry Turkle. “When they make eye contact and gesture toward us, they predispose us to view them as thinking and caring,” she has written in The Washington Post. “They are designed to be cute, to provide a nurturing response” from the child. As mentioned previously, this nurturing experience is a powerful vehicle for drawing children in and promoting strong attachment. But should children really love their robots?
The problem, once again, is that a child can be lulled into thinking that she’s in an actual relationship, when a robot can’t possibly love her back. If adults have these vulnerabilities, what might such asymmetrical relationships do to the emotional development of a small child? Turkle notes that while we tend to ascribe a mind and emotions to a socially interactive robot, “simulated thinking may be thinking, but simulated feeling is never feeling, and simulated love is never love.”
Always a consideration is the fact that in the first few years of life, a child’s brain is undergoing rapid growth and development that will form the foundation of their lifelong emotional health. These formative experiences are literally shaping the child’s brain, their expectations, and their view of the world and their place in it. In Alone Together, Turkle asks: What are we saying to children about their importance to us when we’re willing to outsource their care to a robot? A child might be superficially entertained by the robot while his self-esteem is systematically undermined.
Still, in the case of robot nannies in the home, is active, playful engagement with a robot for a few hours a day any more harmful than several hours in front of a TV or with an iPad? Some, like Xiong, regard interacting with a robot as better than mere passive entertainment. iPal’s manufacturers say that their robot can’t replace parents or teachers and is best used by three- to eight-year-olds after school, while they wait for their parents to get off work. But as robots become ever-more sophisticated, they’re expected to perform more of the tasks of day-to-day care and to be much more emotionally advanced. There is no question children will form deep attachments to some of them. And research has emerged showing that there are clear downsides to child-robot relationships.
Some studies, performed by Turkle and MIT colleague Cynthia Breazeal, have revealed a darker side to the child-robot bond. Turkle has reported extensively on these studies in The Washington Post and in her book Alone Together. Most children love robots, but some act out their inner bully on the hapless machines, hitting and kicking them and otherwise trying to hurt them. The trouble is that the robot can’t fight back, teaching children that they can bully and abuse without consequences. As in any other robot relationship, such harmful behavior could carry over into the child’s human relationships.
And, ironically, it turns out that communicative machines don’t actually teach kids good communication skills. It’s well known that parent-child communication in the first three years of life sets the stage for a very young child’s intellectual and academic success. Verbal back-and-forth with parents and caregivers is like fuel for a child’s growing brain. One article that examined several types of play and their effect on children’s communication skills, published in JAMA Pediatrics in 2015, showed that babies who played with electronic toys experienced a decrease in both the quantity and quality of their language skills.
Anna V. Sosa of the Child Speech and Language Lab at Northern Arizona University studied twenty-six ten- to sixteen-month-old infants to compare the growth of their language skills after they played with three types of toys: electronic toys like a baby laptop and a talking farm; traditional toys like wooden puzzles and building blocks; and books read aloud by their parents. The play that produced the most growth in verbal ability was having books read to them by a caregiver, followed by play with traditional toys. Language gains after playing with electronic toys came dead last. This form of play involved the least use of adult words, the least conversational turn-taking, and the fewest verbalizations from the children. While the study sample was small, it’s not hard to extrapolate that no electronic toy, or even a more capable robot, could supply the intimate responsiveness of a parent reading stories to a child, explaining new words, answering the child’s questions, and modeling the kind of back-and-forth interaction that promotes empathy and reciprocity in relationships.
***
Most experts acknowledge that robots can be valuable educational tools. But they can’t make a child feel truly loved, validated, and valued. That’s the job of parents, and when parents abdicate this responsibility, it’s not only the child who misses out on one of life’s most profound experiences.
We really don’t know how the tech-savvy children of today will ultimately process their attachments to robots, or whether they will be excessively predisposed to choosing robot companionship over that of humans. It’s possible that their tech literacy will draw a bold line for them between real life and a quasi-imaginary history with a robot. But it will be decades before we see long-term studies with enough data to help scientists, and the rest of us, parse out the effects of a lifetime spent with robots.
This is an excerpt from ROBOTS AND THE PEOPLE WHO LOVE THEM: Holding on to Our Humanity in an Age of Social Robots by Eve Herold. The book will be published on January 9, 2024.
For most of history, artificial intelligence (AI) has been relegated almost entirely to the realm of science fiction. Then, in late 2022, it burst into reality, seemingly out of nowhere, with the wildly popular launch of ChatGPT, the generative AI chatbot that solves tricky problems, designs rockets, has deep conversations with users, and even aces the bar exam.
But the truth is that before ChatGPT nabbed the public’s attention, AI was already here, and it was doing more important things than writing essays for lazy college students. Case in point: It was key to saving the lives of tens of millions of people.
AI-designed mRNA vaccines
As Dave Johnson, chief data and AI officer at Moderna, told MIT Technology Review‘s In Machines We Trust podcast in 2022, AI was integral to creating the company’s highly effective mRNA vaccine against COVID. Moderna and Pfizer/BioNTech’s mRNA vaccines collectively saved between 15 and 20 million lives, according to one estimate from 2022.
Johnson described how AI was hard at work at Moderna, well before COVID arose to infect billions. The pharmaceutical company focuses on finding mRNA therapies to fight off infectious disease, treat cancer, or thwart genetic illness, among other medical applications. Messenger RNA molecules are essentially molecular instructions for cells that tell them how to create specific proteins, which do everything from fighting infection, to catalyzing reactions, to relaying cellular messages.
Johnson and his team put AI and automated robots to work making lots of different mRNAs for scientists to experiment with. Moderna quickly went from making about 30 per month to more than one thousand. They then created AI algorithms to optimize mRNA to maximize protein production in the body — more bang for the biological buck.
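To get a feel for what that kind of optimization involves, consider the simplest possible version of the idea: for each amino acid in a target protein, choose among the synonymous codons that encode it. The Python sketch below is a toy illustration only, not Moderna’s method; the weights are made-up placeholders rather than real codon-usage data, and a production system would use learned models that also account for stability and structure.

```python
# Toy mRNA optimization: for each amino acid, pick the synonymous codon
# with the highest expression weight. Weights are invented placeholders.

CODON_WEIGHTS = {
    "M": {"ATG": 1.00},                                  # methionine (start)
    "F": {"TTT": 0.45, "TTC": 0.55},                     # phenylalanine
    "L": {"CTG": 0.40, "CTC": 0.20, "TTG": 0.13,
          "CTT": 0.13, "TTA": 0.07, "CTA": 0.07},        # leucine
    "K": {"AAA": 0.42, "AAG": 0.58},                     # lysine
}

def optimize(protein: str) -> str:
    """Greedily choose the highest-weight codon for each amino acid."""
    return "".join(
        max(CODON_WEIGHTS[aa], key=CODON_WEIGHTS[aa].get)
        for aa in protein
    )

print(optimize("MFLK"))  # ATG + TTC + CTG + AAG -> "ATGTTCCTGAAG"
```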
For Johnson and his team’s next trick, they used AI to automate science itself. Once Moderna’s scientists have an mRNA to experiment with, they do pre-clinical tests in the lab. They then pore over reams of data to see which mRNAs could progress to the next stage: animal trials. This process is long, repetitive, and soul-sucking, ill-suited to a creative scientist but great for a mindless AI algorithm. With scientists’ input, models were made to automate this tedious process.
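Moderna hasn’t published the details of these models, so the following Python sketch is only a guess at the general shape of such a triage system: a classifier trained on hand-labeled historical outcomes that ranks new candidates for human review. The feature names and numbers are invented for illustration.

```python
# Hypothetical sketch (not Moderna's system): rank candidate mRNAs by the
# predicted probability of advancing to animal trials, trained on labels
# that scientists previously assigned by hand.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Each row: [protein_yield, stability_score, toxicity_flag] (made-up features)
X_train = np.array([
    [0.9, 0.8, 0], [0.2, 0.3, 1], [0.7, 0.9, 0], [0.1, 0.5, 1],
])
y_train = np.array([1, 0, 1, 0])  # 1 = advanced to animal trials

model = LogisticRegression().fit(X_train, y_train)

candidates = np.array([[0.85, 0.75, 0], [0.30, 0.40, 1]])
scores = model.predict_proba(candidates)[:, 1]  # probability of advancing
ranking = np.argsort(scores)[::-1]              # review most promising first
print(scores, ranking)
```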
All these AI systems were put in place over the past decade. Then COVID showed up. So when the genome sequence of the coronavirus was made public in January 2020, Moderna was off to the races, pumping out and testing mRNAs that would tell cells how to manufacture the coronavirus’s spike protein so that the body’s immune system would recognize and destroy it. Within 42 days, the company had an mRNA vaccine ready to be tested in humans. It eventually went into hundreds of millions of arms.
Biotech harnesses the power of AI
Moderna is now turning its attention to other ailments that could be solved with mRNA, and the company is continuing to lean on AI. Scientists are still coming to Johnson with automation requests, which he happily obliges.
“We don’t think about AI in the context of replacing humans,” he told the Me, Myself, and AI podcast. “We always think about it in terms of this human-machine collaboration, because they’re good at different things. Humans are really good at creativity and flexibility and insight, whereas machines are really good at precision and giving the exact same result every single time and doing it at scale and speed.”
Moderna, which was founded as a “digital biotech,” is undoubtedly the poster child of AI use in mRNA vaccines. It recently signed a deal with IBM to use the company’s quantum computers as well as its proprietary generative AI, MoLFormer.
Moderna’s success is encouraging other companies to follow its example. In January, BioNTech, which partnered with Pfizer to make the other highly effective mRNA vaccine against COVID, acquired the company InstaDeep for $440 million to implement its machine learning AI across its mRNA medicine platform. And in May, Chinese technology giant Baidu announced an AI tool that designs super-optimized mRNA sequences in minutes. A nearly countless number of mRNA molecules can code for the same protein, but some are more stable and result in the production of more proteins. Baidu’s AI, called “LinearDesign,” finds these mRNAs. The company licensed the tool to French pharmaceutical company Sanofi.
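To see why the search space is “nearly countless,” multiply the number of synonymous codons for each amino acid in a protein. The Python sketch below does exactly that, using counts from the standard genetic code; the ten-residue peptide is an arbitrary example, and a tool like LinearDesign navigates this space far more cleverly than brute enumeration ever could.

```python
import math

# Synonymous-codon counts from the standard genetic code: how many codons
# encode each amino acid (61 sense codons in total).
SYNONYM_COUNTS = {
    "M": 1, "W": 1,
    "C": 2, "D": 2, "E": 2, "F": 2, "H": 2,
    "K": 2, "N": 2, "Q": 2, "Y": 2,
    "I": 3,
    "A": 4, "G": 4, "P": 4, "T": 4, "V": 4,
    "L": 6, "R": 6, "S": 6,
}

def num_mrnas(protein: str) -> int:
    """Count the distinct coding sequences for a given protein."""
    return math.prod(SYNONYM_COUNTS[aa] for aa in protein)

# An arbitrary 10-residue peptide already has ~3 million encodings...
print(num_mrnas("MLLSVPLLLG"))  # 6**6 * 4**3 = 2,985,984
# ...so the ~1,270-residue spike protein has astronomically many.
```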
Writing in the journal Accounts of Chemical Research in late 2021, Sebastian M. Castillo-Hair and Georg Seelig, computer engineers who focus on synthetic biology at the University of Washington, forecast that AI machine learning models will further accelerate the biotechnology research process, putting mRNA medicine into overdrive to the benefit of all.
This article originally appeared on Big Think, home of the brightest minds and biggest ideas of all time.
Opioid prescription policies may hurt those in chronic pain
Tinu Abayomi-Paul works as a writer and activist, and she has one unwanted extra job: trying to fill her opioid prescription. She says that some pharmacists laugh and tell her that no one needs the amount of pain medication she is seeking. Another pharmacist near her home in Venus, Tex., refused to fill more than seven days of a 30-day prescription.
To get a new prescription—partially filled opioid prescriptions can’t be dispensed later—Abayomi-Paul needed to return to her doctor’s office. But without her medication, she was having too much pain to travel there, much less return to the pharmacy. She rationed out the pills over several weeks, an agonizing compromise that left her unable to work, interact with her children, sleep restfully, or leave the house. “Don’t I deserve to do more than survive?” she says.
Abayomi-Paul’s pain results from a degenerative spine disorder, chronic lymphocytic leukemia, and more than a dozen other diagnoses and disabilities. She is part of a growing group of people with chronic pain who have been negatively impacted by the fallout from efforts to prevent opioid overdose deaths.
Guidelines for dispensing these pills are complicated because many opioids, like codeine, oxycodone, and morphine, are prescribed legally for pain, while others, such as heroin, are used illegally. Meanwhile, deaths from opioids have increased rapidly since 1999 and become a national emergency. The CDC has identified three waves of opioid overdose deaths: an increase tied to opioid prescriptions in the ’90s, a surge of heroin around 2010, and an influx of fentanyl and other powerful synthetic opioids beginning in 2013.
As overdose deaths grew, so did public calls to address them, prompting the CDC to change its prescription guidelines in 2016. The new guidelines suggested limiting medication for acute pain to a seven-day supply, capping daily doses measured in morphine milligram equivalents, and other restrictions. Some statistics suggest that these policies have worked; from 2016 to 2019, prescriptions for opioids fell 44 percent. Physicians also started progressively lowering opioid doses for patients, a practice called tapering. A study tracking nearly 100,000 Medicare beneficiaries on opioids found that about 13 percent of patients were tapering in 2012, a number that increased to about 23 percent by 2017.
But some physicians may be too aggressive with this tapering strategy. Among the tapering patients in that study, about one in four had doses reduced by more than 10 percent per week, a rate faster than the CDC recommends. The approach left people like Abayomi-Paul without the medication they needed. Every year, Abayomi-Paul says, her prescriptions are harder to fill. David Brushwood, a pharmacy professor who specializes in policy and outcomes at the University of Florida in Gainesville, says opioid dosing isn’t one-size-fits-all. “Patients need to be taken care of individually, not based on what some government agency says they need,” he says.
‘This is not survivable’
Health policy and disability rights attorney Erin Gilmer advocated for people with pain, using her own experience with chronic pain and a host of medical conditions as a guidepost. She launched an advocacy website, Healthcare as a Human Right, and shared her struggles on Twitter: “This pain is more than anything I've endured before and I've already been through too much. Yet because it's not simply identified no one believes it's as bad as it is. This is not survivable.”
When her pain dramatically worsened midway through 2021, Gilmer’s posts grew ominous: “I keep thinking it can't possibly get worse but somehow every day is worse than the last.”
The CDC revised its guidelines in 2022 after criticisms that people with chronic pain were being undertreated, enduring dangerous withdrawal symptoms, and suffering psychological distress. (Long-term opioid use can cause physical dependency, an adaptive reaction that is different from the compulsive misuse associated with a substance use disorder.) It was too late for Gilmer. On July 7, 2021, the 38-year-old died by suicide.
Last August, an Ohio district court ruling set forth a new requirement for Walgreens, Walmart, and CVS pharmacists in two counties. These pharmacists must now document opioid prescriptions that are turned down, even for customers who have no previous purchases at that pharmacy, and they’re required to share this information with other locations in the same chain. None of the three pharmacies responded to an interview request from Leaps.org.
In a practice called red flagging, pharmacists may label a prescription suspicious for a variety of reasons, such as if a pharmacist observes an unusually high dose, a long distance from the patient’s home to the pharmacy, or cash payment. Pharmacists may question patients or prescribers to resolve red flags but, regardless of the explanation, they’re free to refuse to fill a prescription.
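As a thought experiment, here is what those heuristics might look like if anyone wrote them down. Every threshold in this Python sketch is an invented placeholder, which is precisely the “standardless” problem Brushwood describes below: no official list or cutoff exists.

```python
# Purely illustrative red-flag heuristics of the kind described above.
# All thresholds are invented placeholders; there is no official standard.
from dataclasses import dataclass

@dataclass
class Prescription:
    daily_mme: float        # daily dose in morphine milligram equivalents
    distance_miles: float   # patient's home to this pharmacy
    paid_cash: bool

def red_flags(rx: Prescription) -> list[str]:
    flags = []
    if rx.daily_mme > 90:           # placeholder dose cutoff
        flags.append("unusually high dose")
    if rx.distance_miles > 50:      # placeholder distance cutoff
        flags.append("long distance from patient's home")
    if rx.paid_cash:
        flags.append("cash payment")
    return flags

print(red_flags(Prescription(daily_mme=120, distance_miles=60, paid_cash=True)))
```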
As the risk of litigation has grown, so has finger-pointing, says Seth Whitelaw, a compliance consultant at Whitelaw Compliance Group in West Chester, Pa., who advises drug, medical device, and biotech companies. Drugmakers accused in the National Prescription Opioid Litigation (NPOL), a complex set of thousands of cases over opioid epidemic deaths that includes the Ohio district case, have argued that they shouldn’t be held responsible for the large supply of opioids and the resulting overdose deaths. Plaintiffs, meanwhile, have alleged that these pharmaceutical companies hid addiction and overdose risks when labeling opioids, while distributors and pharmacists failed to identify suspicious orders or scripts.
Patients and pharmacists fear red flags
The requirements that pharmacists document prescriptions they refuse to fill so far only apply to two counties in Ohio. But Brushwood fears they will spread because of this precedent, and because there’s no way for pharmacists to predict what new legislation is on the way. “There is no definition of a red flag, there are no lists of red flags. There is no instruction on what to do when a red flag is detected. There’s no guidance on how to document red flags. It is a standardless responsibility,” Brushwood says. This adds trepidation for pharmacists—and more hoops to jump through for patients.
“We now have about a dozen studies that show that actually ripping somebody off their medication increases their risk of overdose and suicide by three to five times, destabilizes their health and mental health, often requires some hospitalization or emergency care, and can cause heart attacks,” says Kate Nicolson, founder of the National Pain Advocacy Center based in Boulder, Colorado. “It can kill people.” Nicolson was in pain for decades due to a surgical injury to the nerves leading to her spinal cord before surgeries fixed the problem.
Another issue is that primary care offices may view opioid use as a reason to turn down new patients. In a 2021 study, secret shoppers called primary care clinics in nine states, identifying themselves as long-term opioid users. When callers said their opioids were discontinued because their former physician retired, as opposed to an unspecified reason, they were more likely to be offered an appointment. Even so, more than 40 percent were refused an appointment. The study authors say their findings suggest that some physicians may try to avoid treating people who use opioids.
Abayomi-Paul says red flagging has changed how she fills prescriptions. “Once I go to one place, I try to [continue] going to that same place because of the amount of records that I have and making sure my medications don’t conflict,” Abayomi-Paul says.
Nicolson moved to Colorado from Washington, D.C., in 2015, before the CDC issued its 2016 guidelines. When the guidelines came out, she found the change to be shockingly abrupt. “I went into the doctor one day here and she said, ‘I’m going to stop prescribing opioids to all my patients effective immediately.’” Since then, she’s spoken with dozens of patients who have been red-flagged or simply haven’t been able to access pain medication.
Despite her expertise, Nicolson isn’t positive she could successfully fill an opioid prescription today even if she needed one. At this point, she’s not sure exactly what various pharmacies would view as a red flag. And she’s not confident that these red flags even work. “You can have very legitimate reasons for being 50 miles away or having to go to multiple pharmacies, given that there are drug shortages now, as well as someone refusing to fill [a prescription.] It doesn't mean that you’re necessarily ‘drug seeking.’”
While there’s no easy solution, Whitelaw says clarifying the roles of pharmacists and physicians in patient access to opioids could help people get the medication they need. He is seeking policy changes that focus on the needs of people in pain more than on the number of prescriptions filled. He also advocates standardizing the definition of red flags and the procedures for resolving them. Still, there will never be a single policy that can be applied to all people, explains Brushwood, the University of Florida professor. “You have to make a decision about each individual prescription.”