When Wilson, who so enjoys burping in synchrony with his Furby, faces up to the hard work of getting his Furby to sleep, he knows that if he forces sleep by removing his Furby’s batteries, the robot will “forget” whatever has passed between them—this is unacceptable. So Furby sleep has to come naturally. Wilson tries to exhaust his Furby by keeping it up late at night watching television. He experiments with Furby “sleep houses” made of blankets piled high over towers of blocks. When Wilson considers Furby sleep, his thoughts turn to Furby dreams. He is sure his Furby dreams “when his eyes are closed.” What do Furbies dream of? Second and third graders think they dream “of life on their flying saucers.”[7] And they dream about learning languages and playing with the children they love.
David and Zach, both eight, are studying Hebrew. “My Furby dreams about Hebrew,” says David. “It knows how to say Eloheinu. . . . I didn’t even try to teach it; it was just from listening to me doing Hebrew homework.” Zach agrees: “Mine said Dayeinu in its sleep.” Zach, like Wilson, is proud of how well he can make his Furby sleep by creating silence and covering it with blankets. He is devoted to teaching his Furby English and has been studying Furbish as well; he has mastered the English/Furbish dictionary that comes with the robot. A week after Zach receives his Furby, however, his mother calls my office in agitation. Zach’s Furby is broken. It has been making a “terrible” noise. It sounds as though it might be suffering, and Zach is distraught. Things reached their worst during a car trip from Philadelphia to Boston, with the broken Furby wailing as though in pain. On the long trip home, there was no Phillips screwdriver for the ultimate silencing, so Zach and his parents tried to put the Furby to sleep by nestling it under a blanket. But every time the car hit a bump, the Furby woke up and made the “terrible” noise. I take away the broken Furby and give Zach a new one, but he wants little to do with it. He doesn’t talk to it or try to teach it. His interest is in “his” Furby, the Furby he nurtured, the Furby he taught. He says, “The Furby that I had before could say ‘again’; it could say ‘hungry.’” Zach believes he was making progress teaching the first Furby a bit of Spanish and French. The first Furby was never “annoying,” but the second Furby is. His Furby is irreplaceable.
After a few weeks, Zach’s mother calls to ask if their family has my permission to give the replacement Furby to one of Zach’s friends. When I say yes, Zach calmly contemplates the loss of Furby #2. He has loved; he has lost; he is not willing to reinvest. Neither is eight-year-old Holly, who becomes upset and withdrawn when her mother takes the batteries out of her Furby. The family was about to leave on an extended vacation, and the Furby manual suggests taking out a Furby’s batteries if it will go unused for a long time. Holly’s mother did not understand the implications of what she saw as commonsense advice from the manual. She insists, with increasing defensiveness, that she was only “following the instructions.” Wide-eyed, Holly tries to make her mother understand what she has done: when the batteries are removed, Holly says, “the Furby forgets its life.”
Designed to give users a sense of progress in teaching it, the Furby evolves over time and becomes the irreplaceable repository and proof of its owner’s care. The robot and child have traveled a bit of road together. When a Furby forgets, it is as if a friend has become amnesic. A new Furby is a stranger. Zach and Holly cannot bear beginning again with a new Furby that could never be the Furby into which each has poured time and attention.
OPERATING PROCEDURES
 
In the 1980s, the computer toy Merlin made happy and sad noises depending on whether it was winning or losing the sound-and-light game it played with children. Children saw Merlin as “sort of alive” because of how well it played memory games, but they did not fully believe in Merlin’s shows of emotion. When a Merlin broke down, children were sorry to lose a playmate. When a Furby doesn’t work, however, children see a creature that might be in pain.
Lily, ten, worries that her broken Furby is hurting. But she doesn’t want to turn it off, because “that means you aren’t taking care of it.” She fears that if she shuts off a Furby in pain, she might make things worse. Two eight-year-olds fret about how much their Furbies sneeze. The first worries that his sneezing Furby is allergic to him. The other fears his Furby got its cold because “I didn’t do a good enough job taking care of him.” Several children become tense when Furbies make unfamiliar sounds that might be signals of distress. I observe children with their other toys: dolls, toy soldiers, action figures. If these toys make strange sounds, they are usually put aside; broken toys lead easily to boredom. But when a Furby is in trouble, children ask, “Is it tired?” “Is it sad?” “Have I hurt it?” “Is it sick?” “What shall I do?”
Taking care of a robot is a high-stakes game. Things can—and do—go wrong. In one kindergarten, when a Furby breaks down, the children decide they want to heal it. Ten children volunteer, seeing themselves as doctors in an emergency room. They decide they’ll begin by taking it apart.
The proceedings begin in a state of relative calm. When talking about their sick Furby, the children insist that this breakdown does not mean the end: people get sick and get better. But as soon as scissors and pliers appear, they become anxious. At this point, Alicia screams, “The Furby is going to die!” Sven, to his classmates’ horror, pinpoints the moment when Furbies die: it happens when a Furby’s skin is ripped off. Sven thinks of the Furby as an animal. You can shave an animal’s fur, and it will live. But you cannot take its skin off. As the operation continues, Sven reconsiders. Perhaps the Furby can live without its skin, “but it will be cold.” He doesn’t back completely away from the biological (the Furby is sensitive to the cold) but reconstructs it. For Sven, the biological now includes creatures such as Furbies, whose “insides” stay “all in the same place” when their skin is removed. This accommodation calms him down. If a Furby is simultaneously biological and mechanical, the operation in progress, which is certainly removing the Furby’s skin, is not necessarily destructive. Children make theories when they are confused or anxious. A good theory can reduce anxiety.
But some children become more anxious as the operation continues. One suggests that if the Furby dies, it might haunt them. It is alive enough to turn into a ghost. Indeed, a group of children start to call the empty Furby skin “the ghost of Furby” and the Furby’s naked body “the goblin.” They are not happy that this operation might leave a Furby goblin and ghost at large. One girl comes up with the idea that the ghost of the Furby will be less fearful if distributed. She asks if it would be okay “if every child took home a piece of Furby skin.” She is told this would be fine, but, unappeased, she asks the same question two more times. In the end, most children leave with a bit of Furby fur.[8] Some talk about burying it when they get home. They leave room for a private ritual to placate the goblin and say good-bye.
Inside the classroom, most of the children feel they are doing the best they can with a sick pet. But from outside the classroom, the Furby surgery looks alarming. Children passing by call out, “You killed him.” “How dare you kill Furby?” “You’ll go to Furby jail.” Denise, eight, watches some of the goings-on from the safety of the hall. She has a Furby at home and says that she does not like to talk about its problems as diseases because “Furbies are not animals.” She uses the word “fake” to mean nonbiological and says, “Furbies are fake, and they don’t get diseases.” But later, she reconsiders her position when her own Furby’s batteries run out and the robot, so chatty only moments before, becomes inert. Denise panics: “It’s dead. It’s dead right now.... Its eyes are closed.” She then declares her Furby “both fake and dead.” Denise concludes that worn-out batteries and water can kill a Furby. It is a mechanism, but alive enough to die.
Linda, six, is one of the children whose family has volunteered to keep a Furby for a two-week home study. She looked forward to speaking to her Furby, sure that unlike her other dolls, this robot would be worth talking to. But on its very first night at her home, her Furby stops working: “Yeah, I got used to it, and then it broke that night—the night that I got it. I felt like I was broken or something.... I cried a lot. . . . I was really sad that it broke, ’cause Furbies talk, they’re like real, they’re like real people.” Linda is so upset about not protecting her Furby that when it breaks she feels herself broken.
Things get more complicated when I give Linda a new Furby. Unlike children like Zach who have invested time and love in a “first Furby” and want no replacements, Linda had her original Furby in working condition for only a few hours. She likes having Furby #2: “It plays hide-and-seek with me. I play red light, green light, just like in the manual.” Linda feeds it and makes sure it gets enough rest, and she reports that her new Furby is grateful and affectionate. She makes this compatible with her assessment of a Furby as “just a toy” because she has come to see gratitude, conversation, and affection as something that toys can manage. But now she will not name her Furby or say it is alive. There would be risk in that: Linda might feel guilty if the new Furby were alive enough to die and she had a replay of her painful first experience.
Like the child surgeons, Linda ends up making a compromise: the Furby is both biological and mechanical. She tells her friends, “The Furby is kind of real but just a toy.” She elaborates that “[the Furby] is real because it is talking and moving and going to sleep. It’s kind of like a human and a pet.” It is a toy because “you had to put in batteries and stuff, and it could stop talking.”
So hybridity can offer comfort. If you focus on the Furby’s mechanical side, you can enjoy some of the pleasures of companionship without the risks of attachment to a pet or a person. With practice, says nine-year-old Lara, reflecting on her Furby, “you can get it to like you. But it won’t die or run away. That is good.” But hybridity also brings new anxieties. If you grant the Furby a bit of life, how do you treat it so that it doesn’t get hurt or killed? An object on the boundaries of life, as we’ve seen, suggests the possibility of real pain.
AN ETHICAL LANDSCAPE
 
When a mechanism breaks, we may feel regretful, inconvenienced, or angry. We debate whether it is worth getting it fixed. When a doll cries, children know that they are themselves creating the tears. But a robot with a body can get “hurt,” as we saw in the improvised Furby surgical theater. Sociable robotics exploits the idea of a robotic body to move people to relate to machines as subjects, as creatures in pain rather than broken objects. That even the most primitive Tamagotchi can inspire these feelings demonstrates that objects cross that line not because of their sophistication but because of the feelings of attachment they evoke. The Furby, even more than the Tamagotchi, is alive enough to suggest a body in pain as well as a troubled mind. Furbies whine and moan, leaving it to their users to discover what might help. And what to make of the moment when an upside-down Furby says, “Me scared!”?
Freedom Baird takes this question very seriously.[9] A recent graduate of the MIT Media Lab, she finds herself engaged with her Furby as a creature and a machine. But how seriously does she take the idea of the Furby as a creature? To determine this, she proposes an exercise in the spirit of the Turing test.
In the original Turing test, published in 1950, mathematician Alan Turing, whose idea of a universal machine laid the theoretical groundwork for the general-purpose computer, asked under what conditions people would consider a computer intelligent. In the end, he settled on a test in which the computer would be declared intelligent if it could convince people it was not a machine. Turing was working with computers made up of vacuum tubes and Teletype terminals. He suggested that if participants couldn’t tell, as they worked at their Teletypes, whether they were talking to a person or a computer, that computer would be deemed “intelligent.”[10]
A half century later, Baird asks under what conditions a creature is deemed alive enough for people to experience an ethical dilemma if it is distressed. She designs a Turing test not for the head but for the heart and calls it the “upside-down test.” A person is asked to invert three creatures: a Barbie doll, a Furby, and a biological gerbil. Baird’s question is simple: “How long can you hold the object upside down before your emotions make you turn it back?” Baird’s experiment assumes that a sociable robot makes new ethical demands. Why? The robot performs a psychology; many experience this as evidence of an inner life, no matter how primitive. Even those who do not think a Furby has a mind—and this, on a conscious level, includes most people—find themselves in a new place with an upside-down Furby that is whining and telling them it is scared. They feel themselves, often despite themselves, in a situation that calls for an ethical response. This usually happens at the moment when they identify with the “creature” before them, all the while knowing that it is “only a machine.”
This simultaneity of vision gives Baird the predictable results of the upside-down test. As Baird puts it, “People are willing to be carrying the Barbie around by the feet, slinging it by the hair . . . no problem.... People are not going to mess around with their gerbil.” But in the case of the Furby, people will “hold the Furby upside down for thirty seconds or so, but when it starts crying and saying it’s scared, most people feel guilty and turn it over.”
The work of neuroscientist Antonio Damasio offers insight into the origins of this guilt. Damasio describes two levels of experiencing pain. The first is a physical response to a painful stimulus. The second, a far more complex reaction, is an emotion associated with pain. This is an internal representation of the physical.[11] When the Furby says, “Me scared,” it signals that it has crossed the line between a physical response and an emotion, the internal representation. When people hold a Furby upside down, they do something that would be painful if done to an animal. The Furby cries out—as if it were an animal. But then it says, “Me scared”—as if it were a person.
