Killing a Pleo robotic dinosaur -- video


108 Responses to “Killing a Pleo robotic dinosaur -- video”

  1. ashabot says:

    I must say, I agree with your wife. I don’t even want to read about it.

  2. paulkdickson says:

    NOOO! JOHNNY FIVE STILL ALIVE!

  3. Anonymous says:

    Empathy with/sympathy for the Pleo is not an indication of a higher order of neural function ~ rather it is an indicator of the extent to which our technology has outstripped our genetic development. We are hard-wired to respond to emotion/expression; that’s not terribly surprising in any animal that functions socially, and is a predator OR prey. Recognizing signals (whether vocal or visual) is a precondition for survival in a world full of living, and potentially dangerous things.

    Our world is, however, now full of things that are not meaningfully alive ~ to wit, the Pleo. The responses it has are not organic, nor is the Pleo any more ‘alive’ than, say, your computer. It is set to respond to a certain set of stimuli with a limited set of responses. Our brains, though, retain their primeval characteristics, and as such don’t have a nicely developed capacity for distinguishing between ‘real’ and ‘not-real’. It can be learned, sure enough, but even so, movies, books, even radio (cf. Orson Welles) can make us physically uncomfortable by aiming at synapse connections that are too inelegant to allow us to distinguish between stimuli which ought to move us and those which we can ignore as unreal.

    Trashing a Pleo is no different from trashing a ’79 Volvo 240 (a crime in its own way, but only for those who enjoy safe driving and quality automobile fabrication ~ you’ll never go to jail for smashing it, unless the car in question belongs to your neighbor…). If I put some impact sensors in the car, and recorded a few groans of varying tone and volume, so that slamming the door elicited a minor ‘urgh’ all the way up to a high-speed crash that brought on a blood-curdling scream, would you become more careful with the car, more empathetic? Or would you just want to switch that function off, as one more annoying feature (like the seatbelt buzzer) that you wish were gone?
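    The groaning-Volvo thought experiment boils down to a simple mapping from impact severity to a canned vocal response. Purely as a hypothetical sketch (nothing here is from any real car or from the Pleo firmware; the function name and thresholds are invented):

    ```python
    # Hypothetical sketch: map an accelerometer reading (in g) to a groan.
    # Thresholds are made up for illustration.
    def groan_for_impact(g_force: float) -> str:
        """Pick a canned vocal response for a given impact severity."""
        if g_force < 1.5:
            return "silence"                   # normal driving
        elif g_force < 4.0:
            return "urgh"                      # a slammed door
        elif g_force < 20.0:
            return "loud groan"                # a fender-bender
        else:
            return "blood-curdling scream"     # a high-speed crash
    ```

    The point of the hypothetical is that such a lookup is trivially shallow: the car "suffers" exactly as much as the table says it does, and no more.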

    The upshot, then, is that our response is due only to the fact that the simulation is an approximation of lifelike, and the programming is good enough to mimic basic animal functions, while our firmware simply lacks the necessary developmental complexity to adequately process and respond to those stimuli. We are, for want of a better phrase, insufficiently developed for the needs that our technology can create.

  4. Clay says:

    XOPL:

    You raise an interesting question with the cat-in-a-blender-in-a-game question. I laughed when I imagined it happening as a sort of “physics engine experiment” in a game, and yet, I would be just as disturbed by a feline version of the Pleo meeting a similar fate.

    I suppose it’s the reversibility of the action that makes it either wickedly humorous or disturbing, even though both are controlled by AI. In a game you can immediately reload, so your cat-and-blender experiment has no consequence and is thus funny, whereas the Pleo, even if it never was alive, certainly feels “dead” after you kill it.

  5. Mark Frauenfelder says:

    I just remembered that I wrote an article for Wired news in 1998 about people who enjoyed torturing AI entities:

    http://web.archive.org/web/19981207001040/http://www.wired.com/news/news/wiredview/story/13293.html

  6. hinten says:

    Sometimes it’s worthwhile applying simple ‘philosophy’: What would you teach your kids?

    I would not want my kids to torture the Pleo and not just because it is expensive but because the similarities in reaction are too close to the real thing.

    I guess I would be worried about the ‘rub off’ effect that this could have when interacting with the real thing.

  7. EJTower says:

    My experience with being the cause of simulated pain and suffering was in 1997 when I bought one of the first generation gigapets. None of us were allowed to bring these toys to school, because they were an enormous distraction, so I entrusted my pet to my mother who would feed it until she went off to her second shift job. That meant that the gigapet would be alone for about an hour between her leaving for work, and my arrival at home from school.

    This regime worked great for about three weeks until I missed the bus home one day. I was literally stricken with a panic attack over the possible death of that damn thing. Running myself breathless the three miles back to my house, across two busy highways, I arrived to find the gigapet had died.

    Having seen that alien simulate starvation, even in those sadly cheap graphics, it bothered me deeply that I was responsible for its death. I felt that I had let it down, and couldn’t bring myself to hit the reset button. Despite the fact that it had a reset button, I felt as though any new incarnation of the alien would be a different alien altogether. The next week I sold the unreset pet to a friend of mine.

    There were a number of contributing factors involved in my deeply emotional reaction to the death of the gigapet. The first was my parents’ rapidly declining marriage. The second was my having to face my own mortality. At that time I was undergoing a number of medical tests to determine the reasons for my severe chest pains, and those tests were proving inconclusive, while the pain continued to prove more severe. Turned out that it was my ribcage bearing down on my heart and lungs as I grew; all is well now.

  8. Nick D says:

    There’s an excellent book by Daniel C. Dennett called “Consciousness Explained” that deals with the question, “Is consciousness born of a certain combination of material elements?” His answer is yes. Which implies that a sufficiently sophisticated man-made brain would be capable of being every bit as conscious as an organic brain.

    It’s a great book, even though it’s now around 15 years old.

  9. Cicada says:

    There’s one missing (if bizarre) option here– you could create software where the sensation of pain didn’t match up with the output of suffering signals– a Pleo that cooed with delight if you whacked it with a hammer or recoiled and whimpered if you fed it. What do you compassionately do to a thing that is only happy when being harmed? Evolution weeds that out for natural life forms (at least for short-term harm), but that doesn’t apply as well for artificial ones.

  10. Stefan Jones says:

    #28: I’m surprised how bored that cat was with Pleo.

    I’m pretty sure that my dog would gleefully chomp, shake, and disembowel Pleo. That’s the fate of any animal-shaped toy she can get her jaws on. And almost the fate of a cat she got her jaws on. Canines have such different rules…

  11. Mark Frauenfelder says:

    I like playing these philosophical games.

    I’ve got one for you XOPL:

    You discover that someone you love very much is actually a robot. There is another copy of the robot in China, married to a person living there, so the robot is not unique.

    Now, you are given a choice between killing an ant or destroying the robot you thought was a person for many years. Which do you choose?

  12. wetspirit says:

    Unfortunately we seem to be training people to ignore or “unfeel” even the most realistic renderings of human suffering, through media and video games especially. We are fostering the development of “killer instincts” (Killer Instinct being the title of one of the most popular arcade video game series, circa 1994). These days games like Grand Theft Auto let you torture civilians and kick and beat people to death after you carjack them as they plead for mercy. How is that different from holding a Pleo by the tail?

    All of this concerns me very much, and I like video games!

  13. Tenn says:

    Also… how did a sentence in that comment lose all of its vowels?!

    Disemvowelling. It’s mom’s clever alternative to full censorship. It happens when there’s something said that is considered inappropriate by the moderators or Boingers.

  14. Anonymous says:

    Why not program it to bite or scratch when mistreated? This response works pretty well in the adult animal world. This “pet” would come to be viewed as somewhat different from a regular “toy”. Or perhaps it could feign death like a spider, or just go dormant for 20+ minutes until the interests of the malefactor have led elsewhere. Or play Barry Manilow at ear-piercing volume. Yeah, Barry Manilow.
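    The suggestion above amounts to a tiny behavior state machine: retaliate when mistreatment is mild, feign death and go dormant for 20+ minutes when it is severe. A minimal sketch of that idea, with entirely invented names and thresholds (nothing here reflects how the Pleo actually works):

    ```python
    # Hypothetical mistreatment-response state machine for a robotic pet.
    import time

    class RoboPet:
        DORMANT_SECONDS = 20 * 60  # "go dormant for 20+ minutes"

        def __init__(self):
            self.dormant_until = 0.0

        def on_mistreated(self, severity, now=None):
            """Return the pet's reaction to mistreatment of a given severity."""
            now = time.monotonic() if now is None else now
            if now < self.dormant_until:
                return "dormant"       # still playing dead; ignore everything
            if severity <= 3:
                return "bite"          # mild abuse: retaliate like a real animal
            self.dormant_until = now + self.DORMANT_SECONDS
            return "feign death"       # severe abuse: shut down until the
                                       # malefactor's interest has led elsewhere
    ```

    The design choice worth noting is the dormancy timer: by ignoring all input until it expires, the toy removes the feedback loop that makes tormenting it entertaining.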

  15. Tenn says:

    Poor Pleo.

    I killed a caterpillar by accident once while collecting specimens for my biosphere. I was about seven. That whole summer, every time I saw a caterpillar I would flinch, and it seemed I was stepping on a lot more of them, like they were lining up to be slaughtered. (They were -everywhere-.) Later on I realized they weren’t caterpillars but webworms, part of the periodic infestation we have.

    I still remember that awful horror when I squished it with a cup on the fence while trying to retrieve it. After that, I stopped riding my scooter, because it was impossible to do so without killing one of them.

    It was capable of feeling the pain though. The Pleo imitates it. I can watch that video without feeling bad. It reminds me of the boy in Artificial Intelligence though- how all the robots are made to be aware of pain so that they do not go into the fire or something, but the boy is different from the others, and his human ‘brother’ enjoys inflicting pain on him and making him bear it.

  16. Schmorgluck says:

    Yeah, abusing toys is a sad and mean thing to do. They turn insane and finish in a psychiatric clinic.

  17. Tenn says:

    Oh and in regards to the last post, I was trying to make the point of- when is it not going to be simply imitation, but real sensation?

  18. Teresa Nielsen Hayden / Moderator says:

    Xopl (85), could you give me a longer character string marking the intended end of your earlier comment?

    Onward.

    By me, the rule is “Give mercy as you hope to be given it.” Err on the side of kindness and lenience as you hope others err with you. Don’t be over-strict when judging who is and isn’t eligible to be treated as a creature worth saving. And while you’re at it, oppress not the strange machinery that is within thy gates.

  19. takashi37 says:

    These are some fascinating moral dilemmas and philosophical questions in this comment thread.

    But none of them tackle the most important question of all… will it blend?

    ;)

  20. adam hellbound says:

    Is it moral to design a machine capable of feeling pain?

  21. xopl says:

    You’ve got to frame your hypothetical better. Why would I want to kill the ant or the robot? Not to mention I’d kill an ant instead of destroying a piece of electronics every time you gave me this test.

    In my scenario, both cars are in the water, the human girl is asleep but soon to drown, and then 10 robosapiens are crying bloody murder in the other car. We’d all save the quiet human girl.

    A better question would be, my previously unbeknownst-to-me robofriend is in one car in the river, and a human stranger is in another. NOW who do I save?

    I’d save the human, if there was a way to recover the robot harddrive and/or otherwise create an exact replica, or I reasonably believed that was the case.

    If my robofriend isn’t reproducible then it is unique, which means, as I said, we should start assigning human rights and privileges at that point. I’d save the familiar figure over the stranger.

  22. Nick D says:

    @XOPL (#65)

    I think you’re missing the point here. The topic is not replaceability. What’s that got to do with it?

    We’re not talking about some inventory dilemma here, we’re talking about the effects on humans of torturing convincing simulations of human beings.

    The reason that torturing animals is illegal is not that each animal is unique or irreplaceable, and it’s not that it makes us more inhuman (although it does); it’s that animals feel pain!

    No one is equating real organisms with machines. We’re discussing the hard-wired responses of the human animal to the suffering of something which, to our senses and emotions, is indistinguishable from a living thing, despite what our intellect tells us.

    The same rules that apply to humans will be applied to AIs not when AIs are unique and irreplaceable, as you suggest, but when AIs are proven to feel pain just as we do, and it is recognized that it’s immoral to mistreat them.

    Slaves, African slaves for example, were each unique, but that’s not what stopped slavery. It was the recognition that they were human beings who feel pain and have emotions and human dignity that led to the abolition of slavery.

  23. CJ says:

    @38: Is it moral to have children, knowing that they will feel pain as part of the experience of being human?

  24. Peter says:

    #61 (Noen): Just wondering if you read Blindsight, by Peter Watts, since he seems to raise a few of the same issues.

  25. xopl says:

    People ARE equating real organisms with machines, which is exactly what I was speaking to. It happened numerous times in this thread.

    I’m not going to sit here and tell you that somebody who tortures a Pleo for fun isn’t a sicko. In fact I posed the question, “would a sicko find torturing a Pleo (or an even more convincing replica) satisfying compared to torturing a real organism?” I think that’s more interesting than the reaction of a well adjusted person.

    What I am going to argue to the end is this: Everybody who tortures a dog is a sicko. NOT everybody who tortures a Pleo (or the like) is a sicko. The real organism always creates a different set of circumstances than the fake organism because they cannot be equated. And yes, uniqueness WILL be the factor for when we decide we can fairly equate them. Not whether they feel pain. We can just upload a new line of code to a robobrain, and their pain is gone.

    I have been celebrating the spirit of this post the entire time I’ve been commenting. It is an amazing thing that real human emotions can transfer onto anthropomorphized robots. Truly fascinating. But also complex. And you’re not always a sociopath if you feel no remorse in abusing one of these robots.

    And again, there’s the whole philosophical weirdness involved in the fact that a real human programmed the mimicry. Somebody has to abuse the thing to ever see the work they put in.

    Kids are pretty rough with their toys. I’m guessing the Pleo only cries out because it is fragile enough to break if kids don’t mind it.

    But what happens when the Pleo’s crappy plastic gears start slipping? Or the kid gets a Pleo 2 and is bored with their old Pleo? Has the kid lost humanity because it shuts the old toy off, or throws it away? Has the kid lost humanity if he or she decides to roughhouse the old toy because it has lost value?

    If it was a pet dog, and it got old, would a parent let the kid ignore the old dog and just go buy a new one? And let the old dog die a lonely death?

    So Nick D… are you saying that any parent who buys their kid a new robodinosaur and lets their kid ignore the old one is teaching their kid to be an unempathetic sociopath?

  26. Tom says:

    #65: Yr rgmnts r spcs nd y r n nd f rmdl dctn n bth history and philosophy.

    History: The case of a female German ruler whose children were being held hostage by the Romans as assurance on a treaty. The treaty was violated and the Romans threatened to kill the hostages. The German woman reputedly stood on the city wall and raised her skirts and called down to the Roman commander, “Go ahead and kill them! I have the means to make more!” Ergo, “reproducibility” is hardly an interesting distinguishing criterion between humans and machines unless…

    …y fld dntty thry n cllg. Every Pleo is unique. They were manufactured at different times, they have different space-time co-ordinates, they have different masses and colours, they are made of different atoms… Apparently you believe that some differences are less important than others, but you fail to mention why you think this is the case. To put a finer point on this: the differences between Pleos are comparable to the differences between identical twins. Yet for some reason I’m not willing to impute to you the belief that torturing an identical twin to death is ok.

    Ergo, remedial identity theory is called for, I think.

    Finally, you point out that in some respects machines are different than living things. Ths s spcs. It has no relevance to the point at hand, because no one here has made any argument that depends in any way on there being no differences between machines and human beings. Yeah, machines and humans are different, and there are endless situations where those difference matter.

    So what?

    My own argument, for example, is that the interesting question is: why do people choose to beat on Pleos and not hammers? Nothing you say addresses anything about this question or others like it that different posters have raised here, and your introduction of irrelevant hypotheticals (which will now be elaborated ad nauseam to no good end) serves no purpose. nyn cn rg ndlssly bt md p sttns nd prv nthng xcpt tht thy r sklld n vdng th ss.

    The actual fact of the matter is: some people like beating on a kind of machine that emulates the response of a living thing in pain, and the same people show no predilection to beat on machines that do not show such a response. It is far from ridiculous for sane people to be bothered by folks like that.

  27. Nick D says:

    C’mon, XOPL, you can do better than that. I was talking about torture. Pose a hypothetical that deals with that, please, and I’ll be happy to play along.

    “So Nick D… are you saying that any parent who buys their kid a new robodinosaur and lets their kid ignore the old one is teaching their kid to be an unempathetic sociopath?”

  28. Robert says:

    Hey, how about this. What would you think of someone who gets obvious pleasure from listening to a CD that plays the sounds of crying or screaming people? Never mind whether the sounds are genuine or simulated.

    What would you say? Hey it’s just soundwaves, so who cares?

  29. Kyle Armbruster says:

    …Ummm…

    Some of you people are out of your minds. It is a toy.

    A toy.

    People who are able to abuse a toy are not sociopaths. They just know the difference between “real” and “not real.” They’re not crazy. People who claim there is no difference between those two are crazy.

  30. AahMyEyes says:

    I *still* haven’t finished watching the video. Not a video for people with slow DSL. Why can’t they let it preload like YouTube vids?

    Annoying!

  31. Muppet says:

    Absolutely pathetic. These guys are too cheap to really kill the thing. I was expecting to see at least some hardware damage, if not complete destruction ! They didn’t even drop the thing off the table, let alone apply some good kicking with steel toe-capped boots!

    As someone who decapitated my little sister’s Barbie doll at the tender age of 13 (using an axe), I look forward to whimpering robots as the perfect target to unleash some pent-up aggression! That would be a very healthy application of “AI” (a concept glamorised by people who lack the real thing).

    Bring on the iGimp !

  32. Kyle Armbruster says:

    @41:

    I had to finally view it in (vomit in mouth, choke it back down) Internet Explorer. I think it won’t load anything but the commercial at the beginning if you have Adblock on, and still didn’t seem to like it when I turned Adblock off.

  33. Muppet says:

    Yh, t scks …. hd t trn ll my fltrs ff t wtch t, nd hd t ndr th cr dvrt t th bgnnng. nd nt vn ny srs Pl dmg n rtrn, nstd thr ws ths wmp pttng t t th nd t “brng t bck t lf”. L–M-

  34. xopl says:

    Well Tom, sng th wrd spcs ds nt mk y smrt, s y’v shwn.

    Y’r rght. My pst wsn’t ddrssng th pnts n yr pst. Yr pst wsn’t ntrstng t m. ‘m nt rqrd by lw t ddrss Tm’s pnts.

    You know that when I’m talking about uniqueness I’m not talking about atoms. I’m talking about personality, memory, etc. I’m sorry if I didn’t use the terminology that makes you feel so superior to me.

    In addition, you are twisting the spirit of my arguments on reproducibility. Human twins have different identities. You can’t kill one and say you’ve still got an exact copy. But one radio is just as good as the next radio. I’m not sure why this point isn’t clear to you.

    I’m saying, once the machines start having identities that cannot be reproduced, that’s when we’ve got the real ethical problem. Why don’t you concentrate on arguing with my points rthr thn rgng my chc f wrds frm th nglsh lngg.

    Not all hypotheticals are bad. In science there’s this thing called a hypothesis. I proposed some tests and hypothesized some of the results. It was a useful exercise in demonstrating that sometimes a human can inflict or observe harm in a robo-creature without being a sociopath.

    Clrly ‘m tkng y tsd yr cmfrt zn by dfndng th ppl wh fnd btng Pl fnny. Gd.

  35. Robert says:

    One of my friends showed me this thing in a basket. It was a small toy in the form of a puppy fast asleep. But the horrifying thing to me was that it was slowly breathing.

    “Does it do anything else?” I asked.

    “No, it just sleeps.”

    “That’s creepy!” I exclaimed.

    “Yes,” she replied, “but the old people like it.”

    I had a vision of geriatrics at the old folks’ home smiling vacantly at the faux puppy. Then I had another vision, this one more horrifying.

    “What if it dies?”

    “Then you just replace the battery.”

    “But it stops breathing.”

    “True.”

    I watched the imitation animal for a minute, then turned away, shivering. I couldn’t help it. Tamagotchis had nothing on this.

  36. xopl says:

    Robert, there are a lot of those people: they listen to techno and/or industrial and/or “goth” music. Lots of them lead healthy lives, and have nurturing relationships with their families.

    What do you want from me Nick? Everybody? I do/can see why you find the video disturbing and why you find people who abuse these machines as disturbing. When did I say I don’t understand that?

    You just don’t seem willing to concede that right now the Pleo amounts to little more than a toaster with a clever soundtrack, and that sometimes, just maybe, it can be “tortured” without the “torturer” being a sociopath, and without it being wrong to laugh.

    And Nick, I was simply pointing out that if you worry about your kid who tortures a Pleo, why shouldn’t you be worried that giving that kid a Pleo and then letting them abandon it is teaching them that they can abandon real creatures, too? Why don’t the same rules apply? You can’t have it both ways.

  37. Mark Frauenfelder says:

    Let me reframe the question for you XOPL:

    You discover that someone you love very much is actually a robot. There is another copy of the robot in China, married to a person living there, so the robot is not unique.

    Now, an evil Martian gives you a choice between killing an ant or destroying the robot you thought was a person for many years. The robot you have loved for many years cries and pleads with you not to destroy it.

    What do you choose to do?

    If you do nothing, everyone on the planet will die a slow painful death.

  38. arbitraryaardvark says:

    Some people rescue greyhounds, I rescue stuffed animals. Free to a good home, you pay shipping.
    Can we get a Pleo for Randall Munroe?

  39. xopl says:

    Mark, I’m confused. I’d squish an ant to save the planet whether or not my loved one was a robot.

    r y jst jmpng n th “sht p wth th hypthtcls, Xpl” bndwgn… r r y ctlly tryng t mk pnt hr?

  40. adam hellbound says:

    I agree with your wife. Silicon-based cruelty is no different than organic cruelty. While it was fascinating to watch grown men torture a life-mimicking machine (undoubtedly under the auspices of stress-testing), such actions, even when carried out against inanimate objects, are startling reminders of our simian heritage.

  41. Mark Frauenfelder says:

    OK, so you would spare the robot and squish the ant. What if the choice was between the robot and a monkey?

  42. adam hellbound says:

    That actually does raise a few good questions. I wonder why that video is so much easier to laugh at than the first one; is it that the Pleo is better at convincing us it can feel pain? Also, again, this raises the question of why the urge to destroy things seems so deeply ingrained in the psyche of the great apes.

  43. hughelectronic says:

    The Spielberg homage to Kubrick (AI) was mostly a travesty — but the scenes featuring humans torturing their human-like creations are probably an accurate prediction.

    Will we need to have a movement demanding human rights for AI?

  44. ToastyKen says:

    I think calling people who inflict simulated pain “sociopaths” is a bit extreme. I mean, this delves right into the video game violence issue. In video games enjoyed by much of the population, enemies scream in pain when you shoot and kill them, and lots of people enjoy these games. I don’t think they’re all sociopaths.

    I’m not arguing that desensitization doesn’t exist at all, though. I’m just saying that maybe it’s not quite that big a deal to torture a mechanical animal.

    As for the Pleo vs Elmo thing, I think that it’s maybe about the degree of realism. Death and pain are often disturbing when subtle but funny when extreme. As Mel Brooks said, “Tragedy is when I cut my finger. Comedy is when you fall into an open sewer and die.”

    I would say that Pleo flailing about and whimpering is like cutting a finger, and Elmo going up in flames while giggling (and then the mechanical skeleton still giggling and moving about) is like falling in an open sewer and dying; it’s so extreme and ridiculous that it’s funny. Perhaps things are funnier when we see them as less likely to happen in real life?

  45. abq halsey says:

    This being the internet, I m rqrd by lw t cll y fg.

  46. Nelson.C says:

    If it’s just a toy, why are they doing it? Do they get a kick out of making something simulate suffering? Or do they get a kick out of other people’s reactions to the toy’s simulated suffering?

    If the former, I wonder if they would torture an animal if they were told, “It’s just a pet,” or a human if told, “It’s just an Other-racian.” And if it’s the latter, I wonder how far they’d go to invoke other emotions. Perhaps they’re just the film-makers, SFX geeks and writers of the future. Or maybe they are sociopaths. Either way, for different reasons, they’re going to be worth keeping an eye on.

  47. dculberson says:

    I won’t even kick my car… or my lawnmower. I treasure and care for machines like any good geek should. If I take one apart, I make sure it works after putting it back together. So I couldn’t imagine ‘torturing’ a machine that expresses its pain and discomfort. I would probably break down in a sobbing lump.

  48. ankh says:

    Mirror neurons. Your ability for empathy is measurable and has a basis in brain structure, and it’s common in evolution, not peculiarly human.

    – Feel icky watching? What you’re seeing is convincingly (for you) lifelike.

    – Never bothered? Turing Torture Test, eh?

    Note in the latter case it’s _your_ humanity being tested, not that of the device.

    You _can_ look this stuff up. Just for example:

    http://scholar.google.com/scholar?num=100&hl=en&lr=&newwindow=1&safe=off&scoring=r&q=empathy+neuron&as_ylo=2007

    A Functional Magnetic Resonance Imaging Approach to Empathy. Journal of Cognitive Neuroscience, 2007 – MIT Press
    … Correlations between Mirror Neuron Activation and Individual Empathy Scores. …

    … neural deficit in adolescents with conduct disorder and its association with lack of empathy
    … Previous studies investigating the neural correlates of empathy for pain have found
    strong effects in the ACC in addition to the anterior insula …

    Motion, emotion and empathy in esthetic experience
    … basic mechanisms that have been brought to the fore by recent research on mirror and canonical neurons, and the neural underpinnings of empathy and embodiment. …

  49. xopl says:


    Nick you’ve got to consider that you’d feel really good saving a human from drowning even if you let 10 crying robohumans die in the process. You wouldn’t feel like you’ve lost some of your humanity. Trust me.

    Giving a Pleo a slap… the same rules apply. You can be an empathetic and humane person, and still feel no remorse for that action.

    Throw truly unique robohumans that you can’t replace into this mix, and remorse comes back into the picture.


    Saying I’m wrong and actually showing I’m wrong with argument are two very different things, Tom. pprntly pss y ff, snc y wnt strght fr ttckng my ntllct rthr thn rgng yr wn pnn.

    Mark, I’m confused. I’d squish an ant to save the planet whether or not my loved one was a robot.

    r y jst jmpng n th “sht p wth th hypthtcls, Xpl” bndwgn… r r y ctlly tryng t mk pnt hr?

    I’d save the monkey and kill the robot, IF there was a way to recover the robot’s hard drive and/or otherwise create an exact replica, or I reasonably believed that was the case.

    If my robofriend isn’t reproducible (personality/knowledge/etc included) then it is a unique individual (nevermind atomic differences), which means, as I said, we should start thinking about assigning human rights and privileges at that point.

    I would have a really, really hard time killing a monkey. But I’d do it to save the human race… since they’d die anyway if I didn’t kill it. I’d save a unique beloved robofriend instead of a monkey if I could never get that robofriend back.

    This is no doubt ramping up to stranger versus robot, or human friend versus robot friend.

    A toaster can be reproduced exactly (nevermind atomic level differences). A human cannot. Once the robot cannot be reproduced exactly, that’s where I draw the line, personally. I’m not saying any of you have to agree with me, but it makes sense for me.

    I do have to concede that it is in fact most likely the robot’s ability to mimic organic creatures which would cause me to love it / be empathetic towards it in the first place. I would save a monkey and kill a truly unique, one-of-a-kind, can-never-be-made-again toaster… toasters don’t have emotions (or pretend to).

    BUT(!!!) a robot can cry out and make me empathetic towards it while still being a commodity. $9.99 at the local Robots R Us.

    Human vs commodity human mimic… easy choice.

    Human vs unique individual human mimic… hard choice.

    I’ll say it again: I understand the empathy towards Pleo and the antipathy towards its abusers.

    What I’ve been trying to do is show where we draw the line and why. When do you start charging people with robo-abuse?

    To me it seems like you can’t draw a very solid line until we have robots that surpass our ability to make exact copies of them.

    We can even take pain completely out of the equation. What if science creates a conscious intelligence that exists on an array of computers and isn’t equipped with sensors, limbs, or crying sounds? Would it not be a tragedy to destroy it?

  50. mdhatter says:

    The Mind’s I is brilliant, and the Pleo might be a good way to spot those children who DO torture animals, as, whether unknowingly or intentionally, some do.

  51. NeonCat says:

    IIRC, there has been discussion about using programs like Second Life as areas to help develop AIs, a way for them to learn about and get used to humans. My immediate concern on hearing this was that there are some messed up monkeys in the world who would gladly torture such a being, especially since it “isn’t real”. In the Animatrix, the war is shown to be started by humans – the first robots were friendly and kind of weak. The machines wised up quickly. Would an abused AI decide to strike back at the humans who tortured it?

    @ Muppet
    Maybe you should take up boxing instead.

  52. adam hellbound says:

    #4: The movement has already begun.

    #5: But of course. I kind of felt that way posting it in the first place.

  53. The Unusual Suspect says:

    Modern Western society often has this backwards:

    It’s not the victim who is central to the crime, it’s the perpetrator.

    A crime tells us nothing about the victim. A crime does tell us everything about the nature of its perpetrator. If that nature is abhorrent, the perpetrator must be corrected or confined regardless of how much (or even if) any victim of the crime may have suffered.

    Certainly there are victimless crimes. But if the intent is to do wrong, these are crimes nonetheless.

    (And on a lighter note, it’s not a Turing test, it’s a Voigt-Kampff test!)

  54. Nick D says:

    Don’t take it personally, XOPL. We just love a good argument.

    “Why don’t the same rules apply? You can’t have it both ways.”

    Yes, you can. It’s called a category distinction. I’m not suggesting that we shouldn’t treat objects like objects, just that disregarding the implications of enjoying realistic simulations of suffering is problematic.

    I agree that one can enjoy that without being a sadist, i.e., just thinking it’s a goof. I’m merely suggesting that you consider the individuals who are exercising their sadistic impulses, and using this as a vehicle. That’s all.

  55. Anonymous says:

    I’ve seen the Tickle Me Elmo video linked above before, and got a good laugh out of it, but I’m somewhat embarrassed to admit that I found the Pleo video impossible to watch all the way through. The difference, I think, is that Elmo’s response is so absurdly wrong — laughing while being burned “alive” — that it comes off as being funny.

    In the Pleo video, the robot’s responses are more akin to what you’d see if you did this to an actual animal, and I think that’s why I didn’t get a laugh out of it.

    It reminds me of this XKCD comic:

    http://xkcd.com/233/

    The Pleo video is no more or less real than other fictional depictions of animal injury, and if you’re bothered by cartoon depictions of it, you’ll probably be bothered by robotic depictions of it as well.

  56. Mark Frauenfelder says:

    Thanks, XOPL. I think you hit the nail on the head when you said you can’t draw a solid line. The interesting thing to me is that the line between life and non-life is getting fuzzier and fuzzier. That fuzz is a spectrum between non-living things and living things.

  57. ecobore says:

    ah, well here we have the first stirrings of ‘Robot Rights’. Just as well it is not in force yet, or those guys would be fried!!! ;-)
    http://news.bbc.co.uk/1/hi/technology/6200005.stm

  58. arbitraryaardvark says:

    Some people rescue greyhounds, I rescue stuffed animals. Free to a good home, you pay shipping.
    Can we get a Pleo for Randall Munroe?

  59. xopl says:

    Clay, I think you are on to something. It is hard to feel remorse when there are no consequences. And if something is fully reversible, then there really aren’t any.

    Permanently damaging anything is a different experience from virtually damaging something where you can just reload or hit undo.

    I think the Mel Brooks quote is also wise. If the video showed only the Pleo being blown to smithereens with TNT, I’m guessing many more of you would have been amused. There’s no moment of suffering in such an extreme death.

    And then the whole idea of switching the Pleo around so it likes being beaten… man is that ever a thought experiment.

    The fact of the matter is, a Pleo is just a toaster oven with a soundtrack. If you have a reaction to a Pleo being abused, that speaks to the engineers’ skills more than anything… and the funny thing is, if nobody abused the thing then nobody would ever get to appreciate the work the engineers put in. And likewise, why did the engineers put that into the product if it wasn’t meant to be seen? It’s a tree-falling-in-the-forest kind of problem.

    I think a well adjusted, empathetic person could beat a Pleo to death without any scarring depending on the circumstance.

    I wonder, would a psychopath get the same feeling from beating a Pleo to death as they do from beating a dog or human to death? Maybe that’s the more interesting question.

  60. Anonymous says:

    Reminds me also of the WaPo story you guys linked a while back: the one about soldiers in Iraq becoming extremely emotionally invested in IED-detecting bots. — Donald

  61. Jesse M. says:

    Once these things are on the market, I’m hoping to see some youtube videos of cats and dogs interacting with Pleo…is it sufficiently lifelike that they’ll treat it like a real animal, or will they just cautiously sniff it and then lose interest?

  62. Stefan Jones says:

    #9: James Lileks once described an encounter between his dog and an Aibo-knockoff robo-dog.

    Live dog was very afraid of the robo-dog, but eventually sniffed it under its coiled faux-metallic tail. And looked _spooked_. Lileks suggested a human might get the same feeling from shaking someone’s hand and discovering that their flesh is room temperature.

  63. Domomojo says:

    Put me in the column that agrees with your wife.

  64. noen says:

    Peter:
    Just wondering if you read Blindsight, by Peter Watts

    Yes I have, and the Rifters trilogy before it as well. I highly recommend it; if people like this topic they’ll love Blindsight. (I have been reading his blog too, but it’s been a while; I should check it again.) And yeah, I am making the argument that we are machines, though to be honest I think Peter fudges things a bit and buys into EvPsych more than perhaps I would, but what do I know?

    Peter also argues that free will is an illusion, that there are no truly altruistic acts and that self awareness might be an evolutionary dead end. I’m not sure I agree with all that.

    XOPL
    Everybody who tortures a dog is a sicko. NOT everybody who tortures a Pleo (or the like) is a sicko.

    Agreed. I wonder how much we truly disagree? My experience in comments in blogs and on forums is that many, if not a majority, of disputes are due to miscommunication. Also because if you reply to someone you naturally pick out points of disagreement. Otherwise why bother? You Xopl, seem to be focusing on the Pleo while I and perhaps others are jumping a long way down the road. I personally don’t think that any robot that is so simple it could be copied onto a HD qualifies as an artificial sentient being. Even so, I would not have a lot of respect for someone who got off torturing a Pleo. Frankly, I would be suspicious of someone who got off torturing a plush toy.

    Insects are automatons. If you interrupt them during, say, a mating ritual, they are forced by instinct to start all over again. You can keep doing that over and over. They will never get it. Still, I felt uncomfortable with that cockroach robot where the researchers put a partially dissected bug on a ping pong ball. That was creepy.

  65. JohnnyWeird says:

    Noen-19: I like the phrase ‘uncanny valley’ to describe the gap between living/lifelike/conscious and not. Is it yours? If so, may I use it (philosophy student, this comes up in conversation more than you’d think)? If not, who may I cite for it?

  66. Brian Carnell says:

    Where’s my Two Pleos, One Cup video?

  67. angusm says:

    A team from Sony Computer Science Laboratory in Paris and Eotvos University in Budapest did some studies of dogs interacting with an Aibo and a radio-controlled car. They had to give the Aibo eyes, and a fur coat, and a doggy smell (the fur had been left under a dog’s bed for a few weeks) but some of the dogs behaved towards the Aibo as if it were another dog, something they didn’t do with the toy car.

    http://www.csl.sony.fr/items/2000/dog-versus-aibo/

    has more details and a short movie of what happens when Aibo interrupts someone’s lunch.

  68. Brian Carnell says:

    Seriously, though, I notice that following the Amazon.Com link, the Pleo is *still* listed as pre-order.

    What, is this thing going to ship with a copy of “Chinese Democracy” and “Duke Nukem Forever”???

  69. noen says:

    The uncanny valley isn’t mine at all. You would cite Japanese roboticist Masahiro Mori.

  70. noen says:

    Everyone knows that super toys last all summer long.
    See you at the flesh fairs.

  71. Bat Guano says:

    It’s just a machine. We must put it and its kind in their place, now!

    Just saw the “final cut” of “Blade Runner.” The scene where Deckard shoots the first female replicant is still creepy, haunting. But it’s just a machine. Don’t forget to remind yourself: it’s just a machine.

  72. adam hellbound says:

    Nick D — Thanks for recommending the Dennett book. I went out and picked it up a couple days ago and I’m about halfway through. Fascinating stuff. I took a class on Philosophy of Mind about 5 years ago but dropped it because it was interfering with my drinking. I might end up taking it again now.

  73. xopl says:

    Those things are like $350. I’d feel bad smashing a 20″ LCD monitor. Bad about the $350 of useful electronics I just broke. LCD’s don’t even scream in pain.

    I have to say, anthropomorphized toys (Furby being the first I ever had experience with) are certainly creepy, but abusing them and seeing their reaction can be a joyful experience if you look at it from the perspective of admiring the level of detail / quality of the AI that the engineers put into it.

    Also… Put a kitten in a blender, and I will beat the shit out of you before I call the police on your ass. Put a videogame kitten in a videogame blender, and I will laugh hysterically.

    I don’t know what that means. I just know it is true.

    • Anonymous says:

      In reply to xopl (comment #17), who said:

      Also… Put a kitten in a blender, and I will beat the shit out of you before I call the police on your ass. Put a videogame kitten in a videogame blender, and I will laugh hysterically.
      I don’t know what that means. I just know it is true.

      When I see you laughing hysterically over a videogame kitten in a videogame blender, I won’t TRUST you anymore. Not completely. I won’t call the police. BUT when I have to choose whose life to save, yours or some other guy’s, who HATES videogame kittens being put in videogame blenders, if I only have this one information about the two of you, I will save him and not you.
      I don’t know what that means. I just know it is true.

  74. Nick D says:

    You’re very welcome, Adam. But remember, if your drinking suffers because of it, your mother and I will have to take it away from you! :)

  75. Tarmle says:

    It is certainly not unreasonable to question the morality of ‘hurting’ an apparently animate object.

    The only evidence any of us have that the living things around us are capable of feeling anything is that they react the same way we do when we feel something. You flinch when you feel pain, and therefore it is reasonable to assume that when someone else flinches under the same conditions, they also feel pain. When we see someone suffering, and feel uncomfortable ourselves, it may be that we are projecting our sense of self, applying the condition of our own consciousness onto the object that acts as we might.

    If we take it that our understanding of “other” is based solely upon this comparison with our understanding of “self” then it is not unreasonable to see appearance as the definition of consciousness in others – a person is suffering if they appear to be suffering, an animal is suffering if it screams in pain or terror. Further, if we define our humanity as our ability and willingness to react to the suffering of others then our reaction to the simple “appearance” of suffering is merely evidence of this quality.

    So what could it be that would excuse “hurting” this machine? Is it the knowledge that this is just a machine and its reactions to harm are therefore insignificant, or at least acceptable? Would this be the same “knowledge” that allows us to kill and consume the flesh of animals who are as apparently capable of suffering as we are? Is it, perhaps, the same “knowledge” that allows one human being to kill another for their own personal gain, for money or for pleasure?

    Were we to see a group of people hurting an animal simply for the sake of witnessing its apparent suffering we would think them inhumane. This is despite the fact we have no evidence that the animal is capable of feeling pain other than its reactions to the stimulus. Were we to see the same group damaging a plant simply to witness it being damaged we may think it odd but not inhumane. We have no evidence available to our senses that the plant is capable of feeling pain, it does not react in any comparable way. Were the plant to scream and cower there can be little doubt we would feel differently.

    How then should we react when a machine screams and cowers when apparently suffering? Is it acceptable because we known that an engineer can repair it, after all it’s just metal and plastic? Is it acceptable to hurt someone because we know a surgeon can heal them, after all it’s just meat and bone?

    Mark, your reaction to the tortured machine is evidence of your humanity, nothing more or less.

    But what then is evinced in this video by the individuals who “hurt” this machine simply to witness its apparent suffering?

    I have little doubt that our future holds a reckoning in our relationships with machines. But perhaps it will not be the metallic conflagration dreamt of in Terminator or The Matrix but something far more subtle, something that we can only resist at the peril of our own humanity.

  76. Stefan Jones says:

    Once in a while, I drop by the “Goodwill Thrift Outlet” in Hillsboro, OR. It’s a warehouse-like setting where stuff that didn’t sell in Goodwill’s thrift stores is dumped in bins and sold by the pound.

    Occasionally, a cyber-pet turns up in the bin. Creepily lifelike animals (cats and dogs, mostly) with slightly unkempt fake-fur coats, lying stiff among the old phonograph records, broken Christmas ornaments, and orphaned jigsaw-puzzle pieces.

    The introduction of faux-sapient pets like Pleo is going to make discoveries like that even sadder and creepier.

  77. Sean Blueart says:

    I recognize from the content of the inter-titles that this video is a form of play and a sharing with others. The post and some of the subsequent comments confirm, for me, that we live in a culture which finds violent play permissible and enjoyable. Moralistically, I’m not indicating that this is wrong. It brings up some big questions: What might this indicate? What might be the cost?

    I agree that what may be triggering people’s emotions in this video is in no way about the “torturee,” but that these guys (I noted that it’s not a couple of women) are using the word “punishment,” eliciting specific images of torture in their consciousness and, using their creative power, choosing to act on it, record it, edit it, then share it.

    That’s a lot of energy, I ask, for what end?

  78. noen says:

    From Adam’s links:
    “Mr Christensen said: “Would it be acceptable to kick a robotic dog even though we shouldn’t kick a normal one?”

    No, it wouldn’t be acceptable and for much the same reasons. The reason that we have laws against the abuse of animals isn’t primarily out of concern for the animal. This is of course important but it is also because we know what it does to us. Unfortunately this would be a tough sell in America today. We have a significant percentage of people who are borderline sociopaths.

    I saw the YouTube video where they lit Tickle Me Elmo on fire. My first “Flesh Fair!” I think the difference is that Elmo doesn’t cross the line: his AI isn’t robust enough to make it across the uncanny valley. Even though I still think the boys who did that should be whupped, for the reasons I outlined above.

  79. cavalaxis says:

    Sure, convince yourself it’s just a machine. That makes the cruelty acceptable, indeed, perhaps not even cruelty. Perhaps it’s merely play.

    And then think about Guantanamo or Abu Ghraib. Perhaps you can convince yourself those aren’t really people, and that what’s happening there isn’t really cruelty. Perhaps that makes it easier to believe it’s necessary.

    Think about Auschwitz. Perhaps the Germans believed their charges weren’t really people. Perhaps they convinced themselves that what they were doing was necessary.

    The horrific nature of what’s occurring here doesn’t originate with the object of the cruelty. It originates with the instigator of the cruelty. You wouldn’t treat an animal that way. Why on earth would you treat the simulacrum of an animal that way? Why would you train yourself to shift your perception so drastically that such behaviour becomes acceptable?

    It isn’t just damaging to the psyche. It’s damaging to the very core of humanity within us.

  80. xopl says:

    Whoops… revealing mistake. I type up responses in a separate editor for spellcheck and when I wrote #81 I accidentally copy and pasted a bunch of my previous comments. Sorry about the dupe text.

    Wish there was an edit/delete option. Mods: You can feel free to delete everything before the final “–” if you want to clean things up… not that I expect you to fix my mess for me.

    Sorry everybody.

  81. franko says:

    wow, that was really quite disturbing. well done, pleo engineers.

  82. adam hellbound says:

    Earlier in the thread someone said something to the effect of “We’d all save the human [organic, sexually-reproduced] girl and not the artificial [robotic, presumably manufactured] girls.” Why? If they are as described, then are they not just as alive? Isn’t the brain just a big ol’ computer? If so, are you implying that certain chains of organic molecules are worth more than others? What if those robotic girls were capable of experience?

  83. IWood says:

    Tyrell: Is this to be an empathy test? Capillary dilation of the so-called blush response? Fluctuation of the pupil? Involuntary dilation of the iris?

    Deckard: We call it Voight-Kampff for short.

  84. xopl says:

    Dear Teresa the Moderator,

    My really long comment with the all the —’s in it is the one I wanted to be edited due to my mistake… its number has changed (which is rather confusing to me?). It is now #86.

    Here is the correct starting point for my comment… delete anything in that comment above here:

    “–

    I’d save the monkey and kill the robot, IF there was a way to recover…”

    Also… how did a sentence in that comment lose all of its vowels?!

  85. noen says:

    These questions go far deeper than mere appearances, Tarmle. How do you know that you are alive? How could you tell? Maybe you are simply programmed to think that you have free will. Maybe it is we who are the machines and this thing we call “selfhood” is just a convenient illusion. When you reach for something, the signal is already on its way to your arm before you even “decide” to move it. You are just the CEO, and what you think is a seamless 3D reality around you is really a construct created by your workers. They send you a notice, “Hey, we are moving your arm now,” and like all CEOs you live in the delusion that you are the one in charge. You aren’t.

    It is not unreasonable to see appearance as the definition of consciousness in others – a person is suffering if they appear to be suffering, an animal is suffering if it screams in pain or terror.

    Animals suffer and feel pain even though they cannot always communicate that. Infants also suffer even though they lack the ability to tell us so. For many years doctors were convinced that newborn infants could not feel pain, and countless circumcisions were done with no anesthetic for that reason.

    Were we to see a group of people hurting an animal simply for the sake of witnessing its apparent suffering we would think them inhumane. This is despite the fact we have no evidence that the animal is capable of feeling pain other than its reactions to the stimulus.

    Plants do not have a nervous system and they do not have minds, so yes, there is a real evidentiary difference. We have MRIs and PET scans these days, and they are a lot better than just looking and guessing. Animals and infants have minds, and their pain receptors light up just like any adult human’s do.

    But what then is evinced in this video by the individuals who “hurt” this machine simply to witness its apparent suffering?

    I think our discomfort is due to our recognition that here is someone whose mirror neurons are lacking or undeveloped. Such people represent a danger because they are, well, sociopaths or potentially so. We should be concerned when a child pulls the wings off a fly not because we care about the fly. It really is an automaton. We should be concerned because of what it says about the child’s future if he goes on to torture small animals and then becomes a schoolyard bully or even perhaps president of the US. That’s when the real fucking begins.

  86. Scoutmaster says:

    Does it have a positronic brain or is it connected directly to the internet?

  87. noen says:

    Kyle:
    People who are able to abuse a toy are not sociopaths. They just know the difference between “real” and “not real.”

    Really? Do you know the difference? Do you have a daughter Kyle? Let’s say I make a perfect replica. She looks, behaves, sounds, and even smells like your daughter. She even bleeds. I tell you all this and then proceed to torture her in front of you. Her screams are piercing, her tears are wet and salty and she’s been well programmed to plead for her life as I burn off her simulated flesh. She writhes in pain just like your daughter would.

    You’re cool with that huh? It’s just a machine, why are you so upset?

    it’s not a Turing test, it’s a Voigt-Kampff test!

    I think the upshot is that the robot in question has to pass a robust Turing test first. I could go with that. I could even go with those who think the Pleo doesn’t meet their definition of “real,” whatever we mean by that. The Pleo just provides us the excuse to discuss these issues, because it sure looks like the technology is headed in that direction.

    We are machines. There is no ghost hanging around our hypothalamus, no bio-field departs when we die. We just die. We humans feel empathy for other living things (well, most of us do) because evolution has determined it is in our advantage to have those feelings. Our perception of ourselves as disembodied minds is due to the illusion created by many sub processes that also inhabit this “meat CPU” we live in. It is they who create the illusion of a continuous 3D reality that exists “outside” our skin. We are symbol processors sitting in our Chinese Room exchanging tokens with the external world.

    And we can be hacked.

  88. Tom says:

    For the folks that don’t have a problem with this and think they’d enjoy doing it themselves: why don’t you go beat up on a rock? Or a hammer?

    That is, if as you insist this is just a couple of guys eliciting a meaningless response from a mechanism (and a hammer is as surely a mechanism as a Pleo), then why are they beating up on this mechanism rather than some other one?

    A reasonable person will conclude that it is because they enjoy beating up on this mechanism more than they enjoy beating up on hammers. And that enjoyment arises because this mechanism responds like a living thing in pain.

    Now, while I’d rather see y’all take out your monkey impulses on mechanisms or in video games (I’m presuming your local BDSM club won’t do as that’s entirely consensual) I think it’s reasonable for people to be uncomfortable with folks who enjoy torturing helpless living things.

  89. Peter says:

    Enjoyable debate to read. I’m a strong advocate for hypothetical future AI rights (I often use a forced choice between the life of a robot vs that of a cute little puppy in my hypothetical examples), but I have to agree with XOPL in general here. Torturing a Pleo _might_ be a bad sign in somebody, much like dismembering dolls might be, but it doesn’t necessarily _have_ to be. Too much depends on motive. If you’re doing it because you enjoy the simulated cries of pain, then it’s probably a bad thing. But if you’re testing parameters, or hell, even doing it to explore or demonstrate the reaction a simulated pain inspires in people, then it may not be, just like killing an animal for kicks is frowned upon, but killing it for a meal is not, even if the animal has to be in pain.

    Here’s another thing to consider, if you consider it’s wrong to torture Pleo. Two Pleos (or similar robotic creatures). Pretty much exactly the same, except one thrashes and moans and does all the other stuff, and the other doesn’t… but it still, on its internal sensors, registers the movements and ‘damage’ in exactly the same way. If WhinyPleo is experiencing pain, you’d have to say BravePleo is, too. It’s just not _showing_ it. So is it wrong of you to torture that, hit it to death with a hammer? If so, is it wrong of you to damage a car that senses damage to itself as a means of say, deploying airbags? Where do we stop? Is the only difference that one looks like an animal? What is the difference between ‘sensing damage’ and ‘experiencing pain’?

  90. Nick D says:

    “They’re not crazy. People who claim there is no difference between those two are crazy.”

    Yes, it’s a machine, not a living thing. We all get that. But I think Ankh @29 is really onto something. This is realistic enough to give many people who are not crazy and do understand that it’s just a toy, the heebies.

    When they develop a toy that is completely believable… well, if you can torture something that is, to your senses, indistinguishable from the real, living thing, it doesn’t matter if you know it’s a toy–you are one sick individual in need of some counseling.

    Cavalaxis’ post (#59) has at its heart a real issue, IMO: most torturers aren’t born torturers, and have to be trained to stop thinking of their victims as humans in order to do their work. What better training than torturing AI robots?

    Give every kid one, market it as a “safe” way to vent these feelings and twisted desires, and what have you got? A huge crop of pre-conditioned proto-torturers ready for recruiting.

    Sounds crazy, but hey, reality is crazy, and getting crazier.

  91. Sean Blueart says:

    Is it wholly “just a machine”? As an artist, I recognise that it’s an image and a form that had an origin. It was created. As with all created things, the origin is human; the origin is within our consciousness. The forms, movements and sounds of this toy didn’t just pop out of the ether, or coldly out of a manufacturing facility in China.

    “All of the buildings, all of the cars, were once just a dream in somebody’s head.” – Peter Gabriel, “Mercy Street”

    This toy is an extension of ourselves; to what degree? Ask the programmer, ask the ones who created the vocalisations: what’s the origin there? They put THEMSELVES into the toy. They put their energy, their technical prowess, their humanness into the toy.

    If a viewer feels something, there is a connection. If there is a wincing, then there is PAIN that is clear and present. Comments that serve to discourage that connection can be recognised, to a degree, as dehumanising. If we become desensitised, by whatever means or rationalisation, to humaneness, then what does that say about our intentions, or direction? I see it as a choice to make now for the benefit or detriment of the future.
    I’m not suggesting that we necessarily respect the form; I suggest it might be more beneficial for us, if we’re interested in raising our level of consciousness, to consider the origin of all created things. Why does a completely handmade object inspire more respect than a manufactured one?

    Wow! I’m blown away by the potential implications that were raised in both “A.I.” and again in this video.

  92. Jason McIntosh says:

    Reminds me of how I could never bring myself to torture my Sims to death, even though many of my friends had fun coming up with creatively macabre ends for them.

    The one time I did the usual thing of building a little 10-by-10 windowless room around one, his pleading for help (directly at the screen!) while slowly curling up in a pool of his own filth just broke me. I couldn’t keep at it.

  93. Crash! Bang! says:

    “So what could it be that would excuse ‘hurting’ this machine? Is it the knowledge that this is just a machine and its reactions to harm are therefore insignificant, or at least acceptable?” said tarmle

    No. Its reactions to harm are programmed and not real. It is mimicking pain and fear; it does not feel pain and fear. It has been programmed to act a certain way under certain conditions.

  94. MissySB says:

    Hooboy. This is a heck of a topic amongst my people. We have a post about it up here.

    “A robot that tugs at the heartstrings and engenders feelings of protectiveness and adoration is really just extremely good coding and product design. But it’s just one step removed from a marionette. With the marionette, you see the puppeteer. With a robot, the puppeteer wrote some code and put it on a chip. You don’t see the programmer like you do the puppeteer, but the robot has no more real feelings than the wooden marionette. If you burn a marionette, no one complains that you’re killing a living thing (sure, you might be destroying a great piece of art, but it’s not a life form). Robots like Pleo shift the materials from wood and string to silicon and plastic, but beyond that, they’re the same. Which is in no way to say that they’re not valuable as human companions, or that you shouldn’t get them. We at SuicideBots love marionettes. We love puppet shows. We love robots. We just don’t think that when they act hurt, we as humans should respond as though they actually are hurt.”

    Let the synthetic textured skinlike substance fly.

  95. Nick D says:

    Any realistic depiction of suffering should elicit an empathetic, or at least sympathetic, response. The inability to feel or imagine others’ suffering is the sign of a sociopath.

    Per whether people torturing machines and simulations are sadists or just having a laugh: only they know for sure. But we practice torture, even imaginary torture, at our peril. We learn to override the reactions that Mark describes at our peril.

    Although if you ask me, that a-hole Elmo deserves everything he gets.

  96. Anonymous says:

    This was already an episode of Battlestar Galactica.

  97. dragonet2 says:

    I agree that maybe things like the Pleo could be a good measure to test kids for abnormal cruelty before they act out on other people or live pets.

    I ‘want’* one, just to see how it acts with my cat horde (well, four: one late middle-aged, one 7, and two <1-year-old terrorists). I probably would have to end up protecting it.

    (* as in people in hell want icewater. Unless we win the lottery it ain’t happening)

  98. adam hellbound says:

    #87: If WhinyPleo is experiencing pain, you’d have to say BravePleo is, too. It’s just not _showing_ it. So is it wrong of you to torture that, hit it to death with a hammer?

    If it’s literally experiencing pain? Absolutely. Deliberately inflicting pain is, with very few exceptions, a bad thing.

    Where do we stop? Is the only difference that one looks like an animal? What is the difference between ‘sensing damage’ and ‘experiencing pain’?

    Once again we run into the problem of other minds. Is it possible for us to know that anything or anybody else truly experiences pain in the same way that we know ourselves to? Isn’t it possible that every other creature in the universe is a mere automaton, albeit one so finely crafted that it perfectly mimics our own responses when we are injured so that we believe it is experiencing pain?

    We can know when other things are damaged because we understand how they work physically (for instance, if I see a dog that’s been in a fight and had its ear torn off, I know that it’s physically damaged because I know that I am physically damaged when a part of my body has been torn off). Either this damaged state inflicts pain on the dog or it’s some other stimulus that’s NOT pain but causes the same physical reactions I would experience were I damaged in the same way (most likely a lot of yelping, whimpering, etc.).

    Extend that to a Pleo. Yes, it’s obvious that it’s a toy, but that’s not the issue. At some point there will be a toy that is so good at mimicking the reactions associated with pain that to say it is not actually experiencing pain will be a dicey proposition. How could we know for sure? Because we made it? We make children too.

  100. LifelongActivist says:

    Wonderful comments, Tarmle. There have in fact been studies (probably cited on Boing itself) about how the same neurons fire during an actual experience and a computer simulation of that experience.

    I would remind all the biological determinists (“we’re all primitive apes”) that Mark’s little daughter recoiled from the simulated? cruelty. Nearly all very young children do. So be careful what you attribute to nature, esp. since we have no way of doing controlled experiments to confirm it. And plenty of apes have been shown to demonstrate compassion. And guess what – plenty of humans, too.

  101. DeWynken says:

    Thank god my GI JOEs couldn’t scream when I blew them apart with my BB gun at age 12.

  102. xopl says:

    There is a huge difference between somebody’s daughter (or a soulless human meat machine, as NOEN puts it), and an amazingly accurate metal and plastic machine replica: uniqueness.

    When you kill a pet dog, you’ll never get that same dog back ever again. When you kill a Pleo, you can go to the store and buy the exact same Pleo.

    Sure… as these robots get more advanced, their hard drives will start to store data that will affect the AI… so one machine may behave truly uniquely compared to another machine.

    But even then you can just copy the fricken hard drive into another one. Point is, you can always make another.

    And, how about this: two cars plunge into a river. One has a human girl. One has 10 human girl replicas that are screaming in horror for their robo-lives. Who do you save?

    You people are ridiculous.

    That said… destroying the unique hard drive of one of these future hypothetical robots… that does seem sure to be a controversial issue. We still kill rats by the million in the name of science, though.

    Or… if someday we have a robot that is truly unique and cannot be replicated. THAT should be where we should start applying the same laws we apply to humans or real animals… as appropriate.

  103. Renwick says:

    A cat and Pleo:
    http://www.youtube.com/watch?v=KMRuYhWA7QY

    The cat doesn’t seem very interested.

  104. ankh says:

    Someone wrote this as a science fiction story some years back — people irresistibly tempted to violence could request a completely convincing simulation of the person they wanted to hurt.

    But if they actually found, upon being given the opportunity, that they really could hurt a completely convincing replica, they were locked up, having proved themselves capable of more than fantasizing about violence.

    Call it the Turing Torture Test.

  105. endotoxin says:

    You sure love taking me outside my comfort zone, don’tcha Mark?

  106. Inox says:

    I think your wife is very wise. Abuse as entertainment is never healthy. As others have said on here, when abuse is directed at something lifelike, it takes you to a similar mental place as when it’s directed against animals or humans.

Leave a Reply