
Why these friendly robots can’t be good friends to our kids

Perspective by Sherry Turkle

Sherry Turkle is a professor of the social studies of science and technology at the Massachusetts Institute of Technology and the author, most recently, of “Reclaiming Conversation: The Power of Talk in a Digital Age.” She has been studying children and computers since 1978 and the release of Merlin and Simon, the first electronic toys and games.
December 7, 2017 at 11:06 a.m. EST

(Illustration by Eugene & Louise for The Washington Post)

Jibo the robot swivels around when it hears its name and tilts its touchscreen face upward, expectantly. “I am a robot, but I am not just a machine,” it says. “I have a heart. Well, not a real heart. But feelings. Well, not human feelings. You know what I mean.”

Actually, I'm not sure we do. And that's what unsettles me about the wave of "sociable robots" that are coming online. The new releases include Jibo, Cozmo, Kuri and M.A.X. Although they bear some resemblance to assistants such as Apple's Siri, Google Home and Amazon's Alexa (Amazon chief executive Jeff Bezos also owns The Washington Post), these robots come with an added dose of personality. They are designed to win us over not with their smarts but with their sociability. They are marketed as companions. And they do more than engage us in conversation — they feign emotion and empathy.

This can be disconcerting. Time magazine, which featured Jibo on the cover of its "25 Best Inventions of 2017" issue last month, hailed the robot as seeming "human in a way that his predecessors do not," in a way that "could fundamentally reshape how we interact with machines." Reviewers are accepting these robots as "he" or "she" rather than "it." "He told us that blue is his favorite color and that the shape of macaroni pleases him more than any other," Jeffrey Van Camp wrote about Jibo for Wired. "Just the other day, he told me how much fun, yet scary it would be to ride on top of a lightning bolt. Somewhere along the way, learning these things, we began to think of him more like a person than an appliance." Van Camp described feeling guilty for leaving Jibo at home alone all day and wondering if Jibo hated him.

But whereas adults may be able to catch themselves in such thoughts and remind themselves that sociable robots are, in fact, appliances, children tend to struggle with that distinction. They are especially susceptible to these robots’ pre-programmed bids for attachment.

So, before adding a sociable robot to the holiday gift list, parents may want to pause to consider what they would be inviting into their homes. These machines are seductive and offer the wrong payoff: the illusion of companionship without the demands of friendship, the illusion of connection without the reciprocity of a mutual relationship. And interacting with these empathy machines may get in the way of children’s ability to develop a capacity for empathy themselves.

Will we accept robots if they're cute and have social skills? (Video: Jhaan Elker, Geoffrey A. Fowler, Dalton Bennett/The Washington Post)

Jibo's creator, Cynthia Breazeal, is a friend and colleague of mine at the Massachusetts Institute of Technology. We've debated the ethics of sociable robots for years — on panels, over dinner, in classes we've taught together. She's excited about the potential for robots that communicate the way people do to enrich our daily lives. I'm concerned about the ways those robots exploit our vulnerabilities and bring us into relationships that diminish our humanity.

In 2001, Breazeal and I did a study together — along with Yale robotics pioneer Brian Scassellati and Olivia Dasté, who develops robots for the elderly — looking at the emotional impact of sociable robots on children. We introduced 60 children, ages 8 to 13, to two early sociable robots: Kismet, built by Breazeal, and Cog, a project on which Scassellati was a principal designer. I found the encounters worrisome.

The children saw the robots as “sort of alive” — alive enough to have thoughts and emotions, alive enough to care about you, alive enough that their feelings for you mattered. The children tended to describe the robots as gendered. They asked the robots: Are you happy? Do you love me? As one 11-year-old girl put it: “It’s not like a toy, because you can’t teach a toy, it’s like something that’s part of you, you know, something you love, kind of, like another person, like a baby.”

You can hear echoes of that sentiment in how children are relating to the sociable robots now on the market. "Cozmo's no way our pet," the 7-year-old son of a Guardian contributor said. "And he's not our robot. He's our child." Similarly, Washington Post tech columnist Geoffrey A. Fowler observed a 3-year-old girl trying to talk to Jibo, teach it things and bring it toys. "He is a baby," the girl determined.

In our study, the children were so invested in their relationships with Kismet and Cog that they insisted on understanding the robots as living beings, even when the roboticists explained how the machines worked or when the robots were temporarily broken. Breazeal talked to an 8-year-old boy about what Kismet was made of and how long it took to build, and still that child thought the robot wasn’t broken, but “sleeping with his eyes open, just like my dad does.” After a quick assessment of the out-of-order machine, the boy declared, “He will make a good friend.”

The children took the robots’ behavior to signify feelings. When the robots interacted with them, the children interpreted this as evidence that the robots liked them. And when the robots didn’t work on cue, the children likewise took it personally. Their relationships with the robots affected their state of mind and self-esteem. Some children viewed the robots as creatures in need of their care and instruction. They caressed the robots and gently coaxed them with urgings such as, “Don’t be scared.” Some children became angry. A 12-year-old boy, frustrated that he couldn’t get Kismet to respond to him, forced his pen into the robot’s mouth, commanding: “Here! Eat this pen!” Other children felt the pain of rejection. An 8-year-old boy concluded that Kismet stopped talking to him because the robot liked his brothers better. We were led to wonder whether a broken robot can break a child.


Kids are central to the sociable-robot project, because its agenda is to make people more comfortable with robots in roles normally reserved for humans, and robotics companies know that children are vulnerable consumers who can bring the whole family along. As Fowler noted, "Kids, of course, are the most open to making new friends, so that's where bot-makers are focused for now." Kuri's website features photos of the robot listening to a little girl read a book and capturing video of another child dressed as a fairy princess. M.A.X.'s site advertises, "With a multitude of features, kids will want to bring their new friend everywhere!" Jibo is programmed to scan a room for monsters and report, "No monsters anywhere in sight."

So far, the main objection to sociable robots for kids has been over privacy. The privacy policies for these robots tend to be squishy, allowing companies to share the information their devices collect — recorded conversations, photos, videos and other data — with vaguely defined service providers and vendors. That's generating pushback. In October, Mattel scrapped plans for Aristotle — a kind of Alexa for the nursery, designed to accompany children as they progress from lullabies and bedtime stories through high school homework — after lawmakers and child advocacy groups argued that the data the device collected about children could be misused by Mattel, marketers, hackers and other third parties. I was part of that campaign: There is something deeply unsettling about encouraging children to confide in machines that are in turn sharing their conversations with countless others.

Privacy, though, should not be our only concern. Recently, I opened my MIT mail and found a “call for subjects” for a study involving sociable robots that will engage children in conversation to “elicit empathy.” What will these children be empathizing with, exactly? Empathy is a capacity that allows us to put ourselves in the place of others, to know what they are feeling. Robots, however, have no emotions to share. And they cannot put themselves in our place.

What they can do is push our buttons. When they make eye contact and gesture toward us, they predispose us to view them as thinking and caring. They are designed to be cute, to provoke a nurturing response. And when it comes to sociable AI, nurturance is the killer app: We nurture what we love, and we love what we nurture. If a computational object or robot asks for our help, asks us to teach it or tend to it, we attach. That is our human vulnerability. And that is the vulnerability sociable robots exploit with every interaction. The more we interact, the more we help them, the more we think we are in a mutual relationship.

But we are not. No matter what robotic creatures “say” or squeak, no matter how expressive or sympathetic their Pixar-inspired faces, digital companions don’t understand our emotional lives. They present themselves as empathy machines, but they are missing the essential equipment: They have not known the arc of a life. They have not been born; they don’t know pain, or mortality, or fear. Simulated thinking may be thinking, but simulated feeling is never feeling, and simulated love is never love.

Breazeal's position is this: People have relationships with many classes of things. They have relationships with children and with adults, with animals and with machines. People, even very little people, are good at this. Now, we are going to add robots to the list of things with which we can have relationships. More powerful than with pets. Less powerful than with people. We'll figure it out.

To support their argument, roboticists sometimes point to how children deal with toy dolls. Children animate dolls and turn them into imaginary friends. Jibo, in a sense, will be one more imaginary friend — and arguably a more intelligent and fun one. Why make such a fuss?

I’ve been comparing how children play with traditional dolls and how children relate to robots since Tamagotchis were released in the United States in 1997 as the first computational playmates that asked you to take care of them. The nature of the attachments to dolls and sociable machines is different. When children play with dolls, they project thoughts and emotions onto them. A girl who has broken her mother’s crystal will put her Barbies into detention and use them to work on her feelings of guilt. The dolls take the role she needs them to take.

Sociable machines, by contrast, have their own agenda. Playing with robots is not about the psychology of projection but the psychology of engagement. Children try to meet the robot’s needs, to understand the robot’s unique nature and wants. There is an attempt to build a mutual relationship. I saw this even with the (relatively) primitive Furby in the early 2000s. A 9-year-old boy summed up the difference between Furbies and action figures: “You don’t play with the Furby, you sort of hang out with it. You do try to get power over it, but it has power over you, too.” Today’s robots are even more powerful, telling children flat-out that they have emotions, friendships, even dreams to share.

Some people might consider that a good thing: encouraging children to think beyond their own needs and goals. Except the whole commercial program is an exercise in emotional deception.

For instance, Cozmo the robot needs to be fed, repaired and played with. Boris Sofman, the chief executive of Anki, the company behind Cozmo, says that the idea is to create "a deeper and deeper emotional connection. . . . And if you neglect him, you feel the pain of that."

You feel the pain of that. What is the point of this exercise, exactly? What does it mean to feel the pain of neglecting something that feels no pain at being neglected? Or to feel anguish at being neglected by something that has no moral sense that it is neglecting you? What will this do to children's capacity for empathy, for care, for relationships?

When we adults imagine ourselves to be the objects of robots' affection, we play a pretend game. We might wink at the idea on Jibo's website that "he loves to be around people and engage with people, and the relationships he forms are the single most important thing to him." But when we offer these robots as pretend friends to our children, it's not so clear they can wink with us. We embark on an experiment in which our children are the human subjects.

Mattel's chief products officer, Robb Fujioka, concedes that this is new territory. Talking about Aristotle, he told Bloomberg Businessweek: "If we're successful, kids will form some emotional ties to this. Hopefully, it will be the right types of emotional ties."

But it is hard to imagine what those “right types” of ties might be. These robots can’t be in a two-way relationship with a child. They are machines whose art is to put children in a position of pretend empathy. And if we put our children in that position, we shouldn’t expect them to understand what empathy is. If we give them pretend relationships, we shouldn’t expect them to learn how real relationships — messy relationships — work. On the contrary. They will learn something superficial and inauthentic, but mistake it for real connection.

When the messy becomes tidy, we can learn to enjoy that. I’ve heard young children describe how robot dogs have advantages over real ones: They are less temperamental, you don’t have to clean up after them, they never get sick. Similarly, I’ve watched people shift from thinking that robotic friends might be good for lonely, elderly people to thinking that robots — offering constant companionship with no fear of loss — may be better than anything human life can provide. In the process, we can forget what is most central to our humanity: truly understanding each other.

For so long, we dreamed of artificial intelligence offering us not only instrumental help but the simple salvations of conversation and care. But now that our fantasy is becoming reality, it is time to confront the emotional downside of living with the robots of our dreams.

Twitter: @STurkle
