In love with a bot

When robots look like people or pets, says Robert Ito, it’s hard not to develop feelings for them.

"The robot is smiling at me, his red rubbery lips curved in a cheery grin. I’m seated in front of a panel with 10 numbered buttons, and the robot, a 3-foot-tall, legless automaton with an impish face, is telling me which buttons to push and which hand to push them with: “Touch seven with your right hand; touch three with your left.”

The idea is to go as fast as I can. When I make a mistake, he corrects me; when I speed up, he tells me how much better I’m doing. Despite the simplicity of our interactions, I’m starting to like the little guy. Maybe it’s his round silvery eyes and moon-shaped face; maybe it’s his soothing voice—not quite human, yet warm all the same. Even though I know he’s just a jumble of wires and circuitry, I want to do better on these tests, to please him.

The robot’s name is Bandit. We’re together in a tiny room at Rancho Los Amigos National Rehabilitation Center in Downey, Calif., where Bandit regularly puts stroke victims through their paces. They’re very fond of him, says University of Southern California researcher Eric Wade, who has worked with Bandit and his predecessors for five years. The patients chitchat with Bandit, chide him, smile when he congratulates them. “People will try to hug the robots,” says Wade. “We go out to nursing homes, and people ask, ‘When’s the robot coming back?’”

Bandit is one of a growing number of social robots designed to help humans in both hospitals and homes. There are robots that comfort lonely shut-ins, assist patients suffering from dementia, and help autistic kids learn how to interact with their human peers. They’re popular, and engineered to be so. If we didn’t like them, we wouldn’t want them listening to our problems or pestering us to take our meds. So it’s no surprise that people become attached to these robots. What is surprising is just how attached some have become. Researchers have documented people kissing their mechanized companions, confiding in them, giving them gifts—and being heartbroken when the robot breaks, or the study ends and it’s time to say goodbye.

And this is just the beginning. What happens as robots become ever more responsive, more human-like? Some researchers worry that people—especially groups like autistic kids or elderly shut-ins who already are less apt to interact with others—may come to prefer their mechanical friends over their human ones.

Are we really ready for this relationship?

There are over 100 different models of social robots worldwide. The family includes machines that can act as nursemaids and housekeepers, provide companionship, talk patients through physical rehabilitation, and act as surrogate pets. The most popular, Sony’s Aibo (Artificial Intelligence roBOt) robot dog, sold more than 140,000 units before it was discontinued. The Japan Robot Association, an industry trade group, predicts that today’s $5 billion-a-year market for social robots will top $50 billion a year by 2025.

What makes these machines’ popularity all the more remarkable is that they are a long way from the charming pseudo-humans of science fiction, your chatty C-3POs or cuddly WALL-Es. Many of these helpmates are little more than animatronic Pillow Pets.

The Japanese-made Paro, for instance, looks like a plush-toy version of a baby harp seal. It coos, moves its head and tail, bats its long lashes—and that’s about it. Even so, people adore it. More than a thousand Paros have been sold since its creation in 2003, making it one of the most popular therapeutic robots ever produced. In one study, a few people in two nursing homes seemed to believe that the Paro was a real animal; others spoke to it and were convinced that the Paro, which can only squeak and purr, was speaking back to them.

Or consider the Roomba, a robot vacuum cleaner that has sold more than 6 million units. In a 2007 study, researchers from Georgia Tech’s College of Computing looked at the ways in which Roomba owners bonded with their gadgets. Though the machines have neither faces nor limbs, and do little more than scuttle around and pick up lint, users were observed speaking to them, describing them as family members, even expressing grief when they needed to be “hospitalized.”

“I love the silly thing,” says Jill Cooper, co-founder of the frugal-living website LivingOnADime.com. Cooper, like many Roomba owners, gave her robot a name (Bob), speaks to him, and shows him off to visitors. “I hate to get too deep here,” she says, “but it’s like trying to explain what it feels like to be in love to somebody who’s never been in love before.”

“I’ve had to say goodbye to a lot of robots,” laments Kjerstin Williams, a senior robotics engineer at the research-and-development firm Applied Minds in Glendale, Calif. “If you have animals as pets, you go through the same process: You grieve and move on, and you try to re-engage with the next animal, or the next set of robots. It’s just that socially, it’s perfectly acceptable to grieve over a dog and maybe never get another one. If you’re a roboticist, you can’t do that.”

And it’s not just social robots spawning teary farewells. When a U.S. Marines explosives technician in Iraq brought the blasted remains of Scooby-Doo, his bomb-disabling robot, to the repair shop, Ted Bogosh, the master sergeant in charge of the shop, told him the machine was beyond repair. Bogosh offered the Marine a new robot, but the mournful man insisted he didn’t want a new robot—he wanted Scooby-Doo back. “Sometimes they get a little emotional,” Bogosh told The Washington Post.

In another instance reported by the Post, a U.S. Army colonel halted an experiment at the Yuma Proving Ground in Arizona in which a 5-foot-long, insect-like robot was getting its many limbs blown off one at a time. The colonel, according to Mark Tilden, the robotics physicist at the site, deemed the spectacle “inhumane.”

If veteran military officers can get choked up over a mechanized centipede, how hard might, say, a stroke patient fall for an artificial roommate? “Imagine a household robot that looks like a person,” says Matthias Scheutz, a computer science professor at Tufts University. “It’s nice, because it’s programmed to be nice. You’re going to be looking for friendship in that robot, because the robot is just like a friend. That’s what I find really problematic.”

Robots are already used extensively in Japan to help take care of older people, which concerns Sherry Turkle, director of the MIT Initiative on Technology and Self.

“The elderly, at the end of their lives, deserve to work out the meaning of their lives with someone who understands what it means to be born, to have parents, to consider the question of children, to fear death,” says Turkle. “That someone has to be a person. That doesn’t mean that robots can’t help with household chores. But as companions, I think it is the wrong choice.”

Then again, assistive robots for the elderly are a hot topic precisely because, as populations age, there are fewer human caregivers to go around. “Our work never aims to replace human care,” says Maja Mataric, director of USC’s Center for Robotics and Embedded Systems. “There is a vast gap in human care for all ages and various special needs. The notion that people should do the caring is not realistic. There simply aren’t enough people. We must find other ways to care for those in need.”

And the robots do seem to help. A 2009 review of 43 studies published in the journal Gerontechnology found that social robots increase positive mood and ease stress in the elderly. Some studies also reported decreases in loneliness and a strengthening of ties between the subjects and their family members.

But Turkle wonders if such human-robot relationships are inherently deceptive, because they encourage people to feel things for machines that can’t feel anything. Robots are programmed to say “I love you” when they can’t love; therapeutic robot pets, like Aibos and Paros, feign pleasure they don’t feel. Are programmers deluding people with their lovable but unloving creations?

“People can’t help falling for these robots,” says Scheutz. “So if we can avoid it, let’s not design them with faces and humanoid forms. There’s no reason that everything has to have two legs and look like a person.”

Unfeeling or not, a robot and its charms can be hard to resist. In the weeks following my meeting with Bandit, I find myself Googling his name and USC just to see if there’s been any news about him. I don’t think I miss him, really. I just want to know what he’s been up to.

Williams, the roboticist at Applied Minds, understands what I’m going through. As a graduate student at Caltech, Williams became attached to an Aibo, one of many that she would take around to local schools to get kids interested in robotics. She took this particular Aibo home, named him Rhodium (her husband is a chemist), played with him, learned his likes (a pink ball) and dislikes (having the antenna on his ear pushed the wrong way). But after graduation, she had to return Rhodium to the university.

“I do wonder where he went,” says Williams. “And I hope he still has his pink ball, because he’d be awfully sad if he couldn’t find it.” Sorry to say, the little robot dog undoubtedly misses his pink ball as much as he misses Williams—which is not at all.

- As seen in The Week