February 7, 2004

Moral Treatment of Virtual Characters?

by Andrew Stern, 10:25 am

The Christian Science Monitor has a great new article posing the question, “How should people treat creatures that seem ever more emotional with each step forward in robotic technology, but who really have no feelings?” The article briefly touches on many facets of this question. One quote that sticks out is from People for the Ethical Treatment of Animals (PETA):

The turn toward having robotic animals in place of real animals is a step in the right direction… It shows a person’s recognition that they aren’t up to the commitment of caring for a real animal. Practically speaking, from PETA’s perspective, it really doesn’t matter what you do to a tin object.

On GTxA we recently touched on the issue of abusing virtual characters and on the blurring and fusing of fiction and reality with them.

Virtual characters that interact in a lifelike and believable enough way to generate feelings of sympathy and empathy in players are certainly something we strove for when building Petz (1995-1999). I thought I’d post a few customer letters we received back then, excerpted from a chapter I wrote for Emotions in Humans and Artifacts (pardon the product hype in some of these).

I am a teacher and use the catz program on my classroom PC to teach children both computer skills and caring for an animal. One of the more disturbed children in my class repeatedly squirted the catz and she ran away. Now the other children are angry at this child. I promised to try and get the catz back. It has been a wonderful lesson for the children. (And no live animal was involved.) But if there is any way to get poor Lucky to come homze to our clazz, we would very much appreciate knowing how to do it. Thanks for your help, Ms. Shinnick’s 4th grade, Boston, MA

I just reciently aquired all your Petz programs and I think they are great! I really love the way the animals react. I raised show dogs and have had numerous pets of all kinds in my life and making something like this is great. I am a school bus driver and have introduced unfortunate kids to your program. Children who not only can they not afford a computer but they can’t afford to keep a pet either. This has taught them a tremendous amount of responsibilty. I am trying to get the school to incorporate your programs so as to give all children a chance to see what it is like to take care of a pet. It might help to put a little more compassion in the world. Please keep me updated on your newest releases. Thanks for being such a great company. Nancy M. Gingrich

I had a dog that was a chawawa and his name was Ramboo. Well he got old and was very sick and suffering so my parents put him to sleep. Ever since then I have begged my parents for a new dog. I have wanted one soo bad. So I heard about this dogz on the computer. I bought it and LOVE it!!! I have adopted 9 dogs. Sounds a bit to much to you ehhh? Well I have alot of free time on my hands. So far everyday I take each dog out one by one by them selves and play with them, feed them, and brush them, and spray them with the flee stuff. I love them all. They are all so differnant with differant personalitys. After I take them out indaviually then I take 2 out at a time and let them play with me with each other. Two of the dogs my great Dane and chawawa dont like to play with any of the other dogs but each other. This is a incrediable program. I had my parents thinking I was crazy the other night. I was sitting here playing with my scottie Ren and mutt stimpy and they where playing so well together I dont know why but I said good dog out loud to my computer. I think my parents wondered a little bit and then asked me what the heck I was doing. But thankz PF.Magic. Even though I cant have a real dog it is really nice to have some on my screen to play with. The only problem now is no one can get me away from this computer, and I think my on-line friendz are getting a little mad cause im not chatting just playing fetch and have a great time with my new dogz. Thanks again PF. magic. I love this program and will recomend it to everyone I know!!!!!!!

Dear PF. Magic, I am an incredible fan of your latest release,Petz 3,I have both programs and in Janurary 1999,my cherised Dogz Tupaw was born. He is the most wonderful dogz and I thank you from the bottom of my heart, because in Janurary through to the end of April I had Anorexia and i was very sick. I ate and recoverd because i cared so much about Tupaw and i wanted to see him grow up. I would have starved without you bringing Petz 3 out. Please Reply to this,it would mean alot to me. Oh,and please visit my webpage,the url is http://www.homestead.com/wtk/pets.html. Thankyou for releasing petz 3,Give your boss my best wishes, Sincerily, Your Number One Fan, Faynine

Dear PF.Magic, Hello! My name is Caitlin, and I’m 10 years old. I have Dogz 1 and Catz 1, as well as Oddballz, and I enjoy them all very much. Just this morning was I playing with my Jester breed Catz, Lilly. But I know how much better Petz II is. For a while, I thought I had a solution to my Petz II problem. I thought that if only I could get Soft Windows 95 for $200, that would work. Well, I took $100 out of my bank account (by the way, that’s about half my bank account) and made the rest. I cat-sit, I sold my bike, and I got some money from my parents. Anyway, I really, really love animals (I’m a member of the ASPCA, Dog Lovers of America, and Cat Lovers of America) but I can’t have one! That’s why I love Petz so much! It’s like having a dog or cat (or alien for that matter) only not. It’s wonderful! I have a Scrappy named Scrappy (Dogz), Chip named Chip (Dogz), Bootz named Boots (Dogz), Cocker Spaniel named Oreo (Dogz), Jester named Lilly (Catz), and Jester named Callie (Catz). And then every single Oddballz breed made. =) I don’t mean to bore you as I’m sure this letter is getting very boring. I would love SO MUCH to have Petz II. I really would. (At this point in the letter I’m really crying) I adopted 5 Catz II catz at my friend’s house, but I go over to her house so little I’m sure they’ll run away. I’d hate for them to run away. Is there anything I can do? I love my petz, and I’m sure they’d love Petz II. Thank you for reading this. Please reply soon. ~*~ Caitlin and her many petz ~*~

My husband went downtown (to Manchester) and found Catz for sale, and having heard so much about it he bought it on the spot. He put it on his very small laptop and came back from one of his business trips saying, “How many Dutchmen can watch Catz at once on a little laptop on a Dutch train?” The answer was TEN. I asked if any of them said, “Awww,” the way we all did, but he said they all walked off saying it was silly. I bet they ran out to buy it anyway, though! Yours, Mrs. H. Meyer

Dear Sirs, Just wanted to thank-you for the pleasure my petz have brought me. I am paralyzed from the neck down yet your program has allowed me too again enjoy the pleasure of raising my own dogz. I have adopted 5 so far. I love them equally as if they were real. Thanks again

(Off the topic of moral treatment, but also interesting about Petz: the still-active community of Petz and Babyz zealots, including youngsters who hacked our rendering system and created their own physical variations of the characters (“hexed” breeds, such as these posted yesterday), along with instructions for others on how to do so: 1 2 3 4 5 6)
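(For the technically curious: “hexing” a breed boils down to byte-patching the data files that define a breed’s appearance. Below is a made-up sketch of the idea in Python; the file names, offsets, and values are all invented for illustration, since fans located the real ones by trial and error in a hex editor.)

    # Made-up illustration of "hexing": copy a breed file, overwriting
    # bytes at particular offsets to change the character's appearance.
    # The offsets/values below are hypothetical, not the real file format.
    PATCHES = {
        0x01A2: 0x40,  # hypothetical offset: the size of some body ball
        0x01A6: 0x07,  # hypothetical offset: that ball's color index
    }

    def hex_breed(src, dst):
        data = bytearray(open(src, "rb").read())
        for offset, value in PATCHES.items():
            data[offset] = value  # overwrite one byte in place
        open(dst, "wb").write(bytes(data))

    hex_breed("Dalmatian.breed", "MyHexedBreed.breed")  # names invented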

19 Responses to “Moral Treatment of Virtual Characters?”


  1. nickm Says:

    Briefly, I wanted to mention that I see two issues frequently confused here.

    First, there is the matter of whether someone believes a virtual animal or human to be real. Someone might confuse a doll or stuffed animal (or a computer-simulated creature, for that matter) for a real one, but we would probably agree that this confusion is something that should be overcome — perhaps by providing more guided experience with real and virtual creatures, rather than by pretending that make-believe and simulation don’t exist at all and “protecting” children and other people from such things.

    Aside from this, your experience with virtual creatures can still habituate you to behave in certain ways toward real creatures or influence your actions toward real creatures — for good or ill. Such experience might help you to be nicer and to appreciate life. This was the idea behind people adopting less expensive robot pets in Philip K. Dick’s Do Androids Dream of Electric Sheep? Or it might habituate you to maltreat animals or people. A guy who owns a RealDoll or other simulated woman (I’ll let you Google for these yourself rather than link to them) and who comes home every day and beats it — even if fully aware that this doll is not a real person — may have a problem. Or in some cases, participating in or witnessing simulated violence or its effects (e.g., watching or acting in a classical Greek tragedy) might have a positive effect, even though people know that what they are seeing isn’t real.

    More later…

  2. Brad Says:

    Well, I’m not sure I want to be stepping into a philosophical quagmire here, but this whole issue does seem to have its basis in a very fundamental question: Is there an essential difference between biological life and artificial intelligence or robotics? Perhaps more precisely, are we humans, or any other animal, more than the sum of our parts?

    Now I don’t know the answer to that question, but I personally feel that it is likely that there is in fact no difference. And if there is nothing especially unique about organic life beyond its remarkable complexity (and by “especially unique” I mean anything supernatural), then the line we choose to draw between it and the artificial is a completely arbitrary one. In the end, though, it may be irrelevant.

    What we empathize with, what we care so deeply about, is not the fact that the entity has a liver and lungs, or even a brain. What matters is that it interacts with us in a way that we perceive as meaningful. As much as PETA might wish it were different, we cannot prove that a dog has any of the ideas or feelings that we attribute to it. We can only prove that it acts as if it does on a physical level. But the same applies to people, for that matter. By what criteria, then, can we judge moral value? (I certainly do not intend to advocate the abolition of human or animal rights.)

    My solution is this: perhaps we should treat all things, artificial or organic, as we perceive them. After all, beyond that, all is uncertain. Thus, the degree to which we are able to determine the difference between a robotic dog and a real dog reveals the degree to which we should treat it as real. When the day comes that the robot is indistinguishable from its inspiration, I think we must treat the two with equal respect.

    So to get back on track, it is telling that few people are confused about the nature of the robots: at present it’s OK to stick a robotic cat in the microwave. For now, it’s just stupid.

    I apologize if this comes across as sophomoric daydreaming: it wouldn’t be entirely untrue. Ah well…

  3. JJ86 Says:

    I can empathize and feel sadness for the death of a real animal only because I can feel its pain. All living creatures feel pain the same way. But as “intelligent” as my computer is, I can’t empathize with it in any way, shape or form. Even if it had software to act as advanced as a real animal, I doubt I could feel the same way. Just because I can endlessly play Carmageddon while running over people and animals doesn’t mean that these digital icons represent real life. Life is more than just simulation, at least in this day and age. Maybe the future promises computers that will mirror life in all ways, but that is possibly a very long way off. When that time comes, I will try to feel real sorrow on seeing the blue screen of death.

  4. greglas Says:

    Thanks, Andrew. Fwiw, I posted a link over at TN. http://terranova.blogs.com/terra_nova/2004/02/robot_love.html

    Really interesting letters, btw.

  5. andrew Says:

    I’m going to briefly summarize the various points made here and at the parallel discussion at Terra Nova:

    Nick says: There are two separate questions here: 1) Does the person believe the virtual character is real or fake? and 2) Independent of the answer to question 1, a person’s experience with virtual characters can influence how they treat real people/animals — for good or ill.

    Brad says: Is there a difference between artificial and real anyway? Once it seems real, we should treat it as real.

    JJ86 says: You can’t empathize with an artificial thing; it feels no real pain.

    Richard says: It’s a slippery slope from hurting fake creatures to real ones. Also: It’s distasteful to torture virtual creatures, hence the opposition to it.

    Jeff says: It’s sick to want to hurt virtual characters, because it might mean you want to hurt real things.

    Randy says: Why do we keep bringing this up? Everyone knows they’re not real.

    Phin says: Doesn’t this extend to harming avatars (i.e., in first-person shooters)? Also: the reasons a player enjoys what they’re doing (e.g., why am I shooting a virtual Nazi, vs. why am I torturing a virtual cat?) are very important to the question of how objectionable it is.

    Edward says: We should be kind to AI agents because we’re going to become intimate with them fairly soon. We should teach an AI the Golden Rule, and be good to it, so that it will be kind to us in return (which is important if we are to trust the AI to do things and make decisions for us). Also: some think it’s a good thing to love those things that may or may not have feelings or moral agency, e.g., the way real animals used to be perceived, and sometimes still are.

    I’ll throw in my 2 cents…

    I’m not going to bother looking far forward into the future when AIs become orders of magnitude closer to the complexity of real living things, and hence theoretically more deserving of our true respect. Based on what I understand about the state of the art, and how slowly progress is being made, I don’t think we’ll have to deal with that issue anytime in the next decade or two, probably longer. So I think that’s my answer to Brad’s question, and to the first part of Edward’s. (I’d be happily proven wrong about this time prediction, though.)

    In the short term, i.e., the next 10+ years, I’ll venture to predict that virtual / robot characters will exhibit stronger and stronger illusions of life, but internally, in truth, will still have relatively simple brains. It will become easier and easier to suspend your disbelief in the lifelikeness of these characters, but intellectually we’ll all know they’re fake, because the moment we try to have a real conversation with them, they’ll probably break. But they’ll stimulate us in ways that make them feel pretty alive if we don’t think too hard about it, and we can immerse ourselves in that fantasy easily as long as we don’t push on them too hard.

    I think the interesting issue here is how people treat such characters relative to whatever lifelikeness they ascribe to them. That is, if I’m willingly suspending my disbelief, in the moment when the character really feels alive, it does matter how I treat it. It matters because if I act immorally at the moment the character feels alive, I will feel immoral in that moment. It’s all safe in the end of course — no actual immoral act was committed — but nonetheless we’ll feel like we behaved badly, and that feeling will suck for any well-adjusted person.

    To reply to Phin’s point: once a game immerses you to the point that you get pretty damn close to the actual feeling that you’re killing someone, then I would think it would feel immoral, for the above reason. Even in their present form, as thrilling as those games are, I tend to shy away from them, because they allow me to too easily imagine what it would be like to do this for real. And it’s not a pleasant feeling. (I’m sure if I just played them enough I’d get desensitized, but I’m trying to stay sensitized.)

    All that said, sometimes I’ll want to put myself in some painful situations, to see how I react. I’ll want to mistreat these characters from time to time, to test my own personal limits, to remind myself how it would feel if I did that in real life. This is for reasons similar to why I seek out difficult films, plays and books; they’re not always pleasurable, but they’re enriching and perspective-broadening.

    Similarly, as Edward suggests in his second point, I’ll sometimes want to act lovingly to these characters, partly as an act of kindness (or a virtual act of kindness), as a way to remind myself of the rewards of that kind of behavior in real life. But I seriously doubt it’ll be a real or lasting substitute for real friendships, just as pornography is a poor substitute for sex.

  6. JJ86 Says:

    Exactly: the “illusion of life” should be recognized as quite different from life itself. The virtual character, while painted to be “real”, offers no more life than a pet rock painted with a smiley face. When people grow seriously attached to these objects, it is a symptom of their ability to project part of their emotions onto the object.

    Throughout history people have projected their emotions onto the most benign objects to see religious miracles. The Virgin Mary has appeared on a pancake and been idolized as something more than just a tasty breakfast treat. For many people, games fulfill a spiritual or social need to such an extent that they have an unhealthy attachment. When I eat the pancake that has the image of the Virgin Mary, am I committing the unholiest of acts?

  7. andrew Says:

    > When people grow seriously attached to these objects, it is a symptom of their ability to project part of their emotions onto the object.

    >The Virgin Mary has appeared on a pancake

    In the very abstract, a fully-realized virtual character is of course a physical object, no more or less physical than a pancake. (I’ll not address the question, “but aren’t animals and people just physical too?”)

    But when the object in question is moving fluidly and talks out loud to you and looks almost as real as a live video image of a real person, and you can talk back to it and it reacts and remembers things and calls you by name, and gets emotional and seems to have its own will and drives and motivations, it’s quite a departure from the Virgin Mary pancake. Feeling that this virtual character is alive is not a “symptom”; it’s only natural. You’ll need little or no effort at all to suspend your disbelief in it — in fact, it will require effort to suspend your belief.

    (Also, some would argue that once an object’s internal processes get very complex and akin to internal lifelike processes, the object in fact becomes at least partially alive. But that argument isn’t necessary for the point I made above about moral behavior.)

  8. andrew Says:

    Somewhat related to this topic — Buzzcut (Dave Thomas) posted an interesting essay suggesting that interacting with a PC is more intimate than interacting with a console game system + TV. I think he’s onto something there.

  9. Michael Says:

    It’s interesting to consider what it would take to build something that it feels like something to be. I know that it feels like something to be me. I believe that it feels like something to be my cat. It doesn’t feel like anything to be any of the AI systems I’ve built (or to be one of the Petz). The sense of inner experience, or qualia, is at the crux of many strong/weak AI arguments (e.g., related to Searle’s syntax != semantics Chinese Room arguments; Dennett explicitly disavows qualia because they can wreak havoc with functionalist frameworks; there’s a whole philosophical cottage industry about the theoretical possibility or impossibility of Zombies, systems that are completely indistinguishable from me in terms of external behavior, but that it doesn’t feel like anything to be).

    I don’t know how to go about building an AI system that it feels like something to be, but I would love to someday build a system (perhaps it’s a character, perhaps not) that at least makes you feel uncertain about whether it has experiences. Until that time, the issue of moral concern for these systems seems primarily a media representation question. As a designer, I may want to build procedural representations that can cause interactors to feel bad when the representation appears hurt (a more complex case of already existing attitudes towards representations – some people get worked up when representations of national flags are burnt).

    Terrel Miedaner’s short story The Soul of the Mark III Beast, reprinted with commentary in Hofstadter and Dennett’s The Mind’s I, is an interesting thought experiment about artificial systems whose behavior is complex enough to leave us uncertain about whether they have inner experience.

  10. nick Says:

    Doesn’t it feel like something to be Parry?

  11. Michael Says:

    Not sure if you’re joking or not, Nick, but no, I don’t believe Parry has any internal experience. Numbers in an “emotion model” move up and down and crudely affect Parry’s responses. But Parry’s behavioral repertoire is far too simple to lead me to ascribe any internal life to Parry (and, unsurprisingly, I feel zero moral obligation towards Parry). When Parry tells me it’s afraid, I don’t believe it. If, in those states that Parry reported as fear (fear of the mafia, for instance – if I remember right, that was one of its paranoid delusions), its behavior were affected in long-term, complex, manifold ways consistent with what I understand fear to mean for humans and higher animals, then I would start feeling uncertain as to the status of its inner experience (the zombie debate being far from settled), and perhaps would start feeling some degree of moral responsibility towards Parry.
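    To make “numbers moving up and down” concrete, here is the kind of mechanism I mean. This is a made-up Python sketch, not Colby’s actual code; the trigger word and the canned replies are my inventions:

        # Toy scalar "fear" model: stimuli nudge a number up or down, and
        # the number crudely gates which canned response comes out.
        # Entirely hypothetical; Parry's real implementation differed.
        fear = 0.0  # 0.0 = calm, 1.0 = terrified

        def respond(utterance):
            global fear
            if "mafia" in utterance.lower():
                fear = min(1.0, fear + 0.3)   # paranoid trigger raises fear
            else:
                fear = max(0.0, fear - 0.05)  # otherwise fear slowly decays
            if fear > 0.7:
                return "I don't want to talk about it."
            if fear > 0.3:
                return "Why do you want to know that?"
            return "I went to the races last week."

    Nothing in that number constitutes fear; it merely selects among strings. That is the sense in which I don’t believe Parry when it tells me it’s afraid.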

  12. nick Says:

    I wasn’t joking, I was wondering whether these sorts of manifestations of emotion were enough to make you conceive of a bot’s internal experience (whether or not there is one). But the short-term simplicity of emotional response seems to be what leaves Parry lacking for you.

    It also may have something to do with Parry’s disembodied and context-free existence. I’d bet that people have felt more of an emotional connection to Floyd, from Planetfall, than to Parry, although Floyd is much less sophisticated in his interactive representation of emotional states. But he’s a somewhat interesting part of a fictional world — that, as well as the more or less “long-term” relationship he can have with the interactor, may make a difference.

  13. andrew Says:

    Via Ludology.org, a CNN article about robot therapy in Japan.

    And here’s a Guardian article on emotional computing.

  14. andrew Says:

    Here’s yet another article on robot therapy in Japan.

  15. miscellany is the largest category Says:
    News Round-Up
    IGDA – Ivory Tower column, Dungeons, Dragons, and Ivory Towers, where Chaim Gingold focuses on issues of collaboration between game industry developers and academics/researchers. Michael, of GTA, officially announced the creation of the Experimenta…

  16. Grand Text Auto » Frolicking With the Robots Says:
    […] Frolicking With the Robots
    by andrew @ 4:36 pm

    While we sometimes like to abuse robots, sometimes we like to frolic with them — see […]

  17. andrew Says:

    Here’s a post at collision detection about a new line of talking dolls for elderly folks in Japan.

  18. Gaz Says:

    Re: Abuse of Virtual Pets

    Can a virtual pet be abused? When I stop to think about it, I find that this is the question that intrigues me.
    From the point of view of a person who just loves Petz and enjoys playing with these little computer creatures, I think that it is really almost impossible to actually abuse them. First of all, what is abuse? I don’t actually feed my petz much. Is this abuse? All my petz (I have lots of them, probably over 50) are healthy and happy, and many of them have achieved the Happy Petz Certificate (I am using Petz 5 at the moment, although I prefer Petz 3). If I treated a real pet this way it would definitely be abuse, and my pet would probably not survive!

    In the early days of my Petz ownership, Petz 2 I think, my nieces and nephews used to come over and play with my Petz, and the boys were in the habit of throwing my Petz Scardy Cat around and thought it was very funny. The poor thing was continually shaking and hanging from the ceiling in fear. I found this distressing and informed them that if they did not stop they could not play with my Petz anymore. The abuse stopped immediately, as they enjoyed playing with them very much.

    OK, so in this case the treatment of the Petz produced a negative effect on the Petz itself and also on me, as it upset me to see the critter in such an agitated state. Was this abuse of the Petz, or of me? Anyway, the question still remains: what is abuse of a virtual Petz? And is it actually abusing the Petz, or is it effectively abusing the user?

  19. Grand Text Auto » The Ass Wants to be Free Says:

    […] article created a big spike in downloads this week… Grace and Trip are surely being thoroughly abused as we speak. […]
