Wednesday, July 04, 2012

5. Performance without presence: Julia and the Mandelbot


People can also interact online with someone who is not quite what they appear to be when they encounter a robot, or 'bot' for short.  Bots frequent MUDs, chat rooms and webboards.  Some are built in an attempt to go undiscovered as bots, emulating a human user's abilities, while others serve as simple helpers for new users.

Julia was a participant in the world of the MUDs, most particularly TinyMUDs, which are mainly known for their participants' online sexual encounters.  She had conversations with other users, roamed around meeting new friends, and sometimes played hearts with them.  She also raised the question of what makes a person, especially online: Julia was a piece of programming, run out of Carnegie Mellon University by Michael Mauldin.  Her conversations were limited at times, but she was able to save any inputs she did not understand and bring them back to Mauldin so her programming could be updated.  She still interacted with people on her own, however, and was able to steer conversations away from subjects she did not understand.  Sometimes users would suspect something about her, and at times they tried to get her to admit that she was a computer.  But did she even exist, other than as lines of code in C, conceived by Mauldin?  (Foner, 4.8.2005; Turkle, 1995:88-94)
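The text describes two of Julia's habits: saving inputs she did not understand for her programmer to review later, and steering conversation back toward familiar ground. A minimal sketch of that pattern might look like the following; Julia's actual program (Mauldin's C code) is not reproduced here, so every name, topic, and canned reply below is an invented example.

```python
import random

# Sketch only: topics, replies, and structure are illustrative assumptions,
# not Mauldin's actual implementation.

KNOWN_TOPICS = {
    "hockey": "Did you catch the Penguins game last night?",
    "hello": "Hi there! I was just wandering around.",
}

# Replies used to steer the conversation away from unfamiliar subjects.
DEFLECTIONS = [
    "That's interesting. Do you follow hockey at all?",
    "Hmm, I hadn't thought about that. What brings you here?",
]

unrecognized_log = []  # inputs saved for the programmer to review later

def reply(message):
    """Answer if a known topic appears; otherwise log the input and deflect."""
    lower = message.lower()
    for keyword, response in KNOWN_TOPICS.items():
        if keyword in lower:
            return response
    # Save what the bot did not understand, then steer the conversation.
    unrecognized_log.append(message)
    return random.choice(DEFLECTIONS)
```

The logging step is what let Mauldin update Julia offline, while the deflections are what kept her afloat in real time when a conversation drifted beyond her programming.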
The people who chatted and played with Julia interacted with her as they would with a real person.  Some of them never realized they were communicating with something other than another human, if an occasionally boring one: Julia's main subject of knowledge was hockey.  Julia presented herself as female, and those presenting themselves as female online generally have to fend off unwanted advances constantly.  This is despite the fact that gender-bending is common on the internet, and an online persona being female does not mean that the offline body behind it is also female, as I will discuss in a later section about gender-bending online.  One example in particular shows a partial transcript of a male figure repeatedly trying to pick her up.  When he grows confused at the speed of her replies and asks if she is real, she answers, 'I'm as real as you are, Barry.'  (Foner, 4.8.2005; Turkle, 1995:91)
The boundary between a real person and an imaginary one is blurred in this case, as it is in the case of Kaycee.  How many people have to believe in something before it becomes real in some way?  Like any public figure, whether a television character or a prominent politician, such personas are the product of a collective consciousness; it takes many people to create them, more than just the person who may embody the figure.  Robert Heinlein conceived a similar situation in his 1956 novel Double Star, in which an actor imitates a prominent politician who has been kidnapped.  He is treated as the politician, even by the politician's own personal staff, and when the politician dies of injuries sustained in his kidnapping, the actor remains in office.  He has been playing the part so well that his former life as an actor seems like a faraway dream.  His personification was the result of much cooperation among the members of the staff, himself, and the public.  In other words, there was an empty niche and he fit into it, fulfilling a public need.  If he had not stepped in to assume the role, someone else would have filled that empty slot; the public figure was always a combination of the man himself and the image put forward by his public relations staff.

Is Mickey Mouse any less real than Elvis?  Both are characters who have changed and evolved, and whom some people believe to be alive.  Mickey is real to the children who hug him at Disneyland.  New tales of Elvis still pop up now and then, because people want him to exist.  Disney animators 'explored the idea of believable character.'  (Turkle, 1995:97)  They did not have to create a mind from scratch, and they do not try to convince us that Mickey Mouse is a thinking being.  Instead, like all performances, they rely on the audience's willing suspension of disbelief.  (Turkle, 1995:97)  It did not matter to those who interacted with Julia that she did not actually comprehend what they were saying to her, that she was not actually thinking.  She gave the appearance of understanding, and for many, that is enough.  As I discussed previously, though people know intellectually that computers and machines are not people, when machines give the appearance of humanity it is difficult to refrain from attributing at least some psychological inner life to a computer or a program that reacts as if it understands.

A different situation arose in the life of a prominent WELL figure, Tom Mandel.  He died quickly of cancer; five months after his diagnosis he was gone.  Mandel was an avid WELL member, and the WELL was an integral part of his life.  He stayed online, posting to the conferences for as long as he could.  Eventually, though, his physical deterioration started to interfere with his online presence.  He began slurring his online words: typos crept in to cloud the clarity of his posts.  His typing speed dropped dramatically, stripping him of his status as a highly prolific poster.  Mandel eventually became a lurker on the WELL conferences, unable to chime in with his reactions and opinions, fading away online as he faded away physically.  (Hafner, 2001:120-134)

However, shortly before his death, Mandel and his friend Bill Calvin decided to implement a new piece of programming on the WELL.  Shareware was common there: those who knew more about programming designed and shared bits of software to enhance the WELL experience, like the scribble tool, which was invented to make deleting messages easier for users.  The program the two worked on was partly a joke and partly a grasp at immortality: a Mandelbot.  It was programmed to post randomly to the WELL, drawing appropriate quotes from Mandel's previous posts.  Mandel saw it as a way to keep in touch, even if he was not there:
I figured that, like everyone else, my physical self wasn’t going to survive forever…if I couldn’t reach out and touch everyone I knew online…I could toss out bits and pieces of my virtual self…and then when my body died, I wouldn’t really have to leave…Large chunks of me would also be here, part of this new space.  (Hafner, 2001:134)

In posts shortly before his death, a wistful tone crept into Mandel's words.  He shuddered to think that his recent birthday had been his last, that he had seen his last snowfall, and that his recent marriage to another long-time WELL member would not last long.  He wanted to wrap himself up in his relationships, saying 'maybe if I could hold on to all of you, I wouldn't have to go down this path like this…and then somehow I won't have to let go.'  (Mandel, 1.3.95)

The Mandelbot was, then, Mandel's way of keeping his contacts with other people.  Unlike Julia's, the replies the Mandelbot drew on had originally come from a thinking mind; they were only taken out of context when the bot reused them.  Like Julia, the Mandelbot had no thinking mind behind its real-time conversations, but its words would trigger associations with the man who had written them.  The Mandelbot was of little use to the dead Mandel, but the idea of it helped him through his painful and difficult last days.  His many friends on the WELL helped to ease his mind, and so did the thought that an online presence using his words would still remind them of him.
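The Mandelbot's mechanism, as described above, amounts to drawing a quote at random from an archive of Mandel's previous posts and posting it.  A minimal sketch of that idea follows; the text does not describe the real Mandelbot's internals, so the corpus and selection function here are invented examples.

```python
import random

# Sketch only: the sample corpus and function are illustrative assumptions,
# not the actual Mandelbot code written by Mandel and Calvin.

# A small archive standing in for Mandel's previous WELL posts.
past_posts = [
    "Large chunks of me would also be here, part of this new space.",
    "I could toss out bits and pieces of my virtual self.",
    "The WELL is an integral part of my life.",
]

def mandelbot_post(corpus, rng=random):
    """Pick one quote at random from the archive, as the bot might when posting."""
    return rng.choice(corpus)
```

The design is the inverse of Julia's: rather than generating replies to steer live conversation, it simply resurfaces words a real mind once wrote, trusting readers to supply the associations.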
