So Is It OK To Kill A Westworld Robot, Or What?

Dolores seems pretty sure where she stands on the question.

The HBO show Westworld is largely focused on a single question: what does it mean for people to kill and abuse robots that look, act, and possibly even feel like humans?

Westworld challenges viewers by showing "robots" who are played by flesh-and-blood actors. It tells us they're not real, but of course, we all know Evan Rachel Wood and Thandie Newton are real, so we can more easily make the jump into the show's moral quagmire.

If a robot looked and talked like Dolores, Teddy, or Maeve, would anyone feel ok about shooting that robot in the face? What if the robot was trying to kill them? What if they knew the robot only looked like it was feeling pain? And as we in the real world make our voyage from "1993 Doom Monster" to "2013 Grand Theft Auto Civilian" to, presumably, "2027 Westworld Android," how will we know if we've crossed a line?

Sean Illing at Vox published a cool conversation with Yale moral psychologist Paul Bloom about robo-morality and Westworld, in response to Bloom's New York Times essay on the subject, which he co-authored with Sam Harris.

Their dialogue starts out with Bloom's clear-cut take on the idea. "If we create machines that are just like us, and feel pain and anguish and suffering and shame and all of that stuff, then it would be as wrong to hurt them as it would be to hurt each other," he says. "That's the low-hanging fruit in all of this." Then things immediately become more complicated:

Sean Illing: Well, it's easy if you assume they can actually feel pain, but it's not so easy if it's a machine that's been programmed to replicate human suffering. In that case, is it actually feeling pain or is it just mechanically signalling the experience of pain?

Paul Bloom: Yeah, that's when the hard questions arise. I think it really matters whether the robot is feeling pain or signalling the experience of pain. There's all the difference in the world between a creature that feels pain and really suffers versus something that has no more sentience than a toaster.

So it's possible that you could have Westworld-like robots that look and talk like us but literally have nothing going on inside. They're no more conscious than my iPhone. In that case, there's nothing morally wrong about mistreating [them].

On the flip side, it's possible that we could create machines that don't seem conscious, they don't have human faces and bodies, but might actually be fully conscious organisms, in which case, making them suffer would be as wrong as making a person suffer.

If you've been watching Westworld and have thought about this stuff at all, check out the full conversation in the Vox post. It probably echoes conversations you've been having in your own living room.

I thought I'd ask some of my Kotaku colleagues for their thoughts on the matter. Is it morally ok to kill a Westworld robot? Here's what they said, and bear in mind that I asked them to keep their answers short:

Gita Jackson: No. This is a lot like the question that my vegan ex used to pose to me: is it ok to eat honey? Are invertebrates ok to eat? But the show has pretty decisively shown that they have consciousness and that killing them is at least a cousin to murder. I'm not a vegan and eat honey and shellfish, but that's How I Feel.

Heather Alexandra: No. I think it was immoral from the moment they could feel pain, let alone achieving proper sentience. I don't hit bugs.

Riley MacLeod: Yes ... an answer which surprises me.

Chris Person: It's super not OK to kill a Westworld robot. Like the entire series is telegraphing (at least based on the first season) that humanity is not that special in the grand scheme of things and that the difference between us and the robots is really an aesthetic distinction at best.

Stephen Totilo: No, because the Westworld robots have shown that they might have sentience and be capable of personal agency. That is not true for the enemies I can kill in Red Dead Redemption. This does get uncomfortably close to abortion discussions about when life begins, of course.

Jason Schreier: I think that it'd be incredibly disturbing for anyone to take the life of a facsimile that's so uncannily close to being human, and given what we've learned over the series about the Westworld robots' evolving consciousnesses, I'd say it's 100% not OK.

Patricia Hernandez: Sure, if they're trying to kill you - which they might be, in the current season! If they're just being chill, I'd say no.

I turn it over to you, dear readers. Given what we know about them so far, is it morally ok to kill a Westworld robot?


Comments

    If I could go to Westworld I'd be doing one thing and one thing only.

    You'll find me at Maeve's saloon.


    Does the robot have a soul? If yes, then it's murder. If not, then it's vandalism.

      Do people have souls?

        Yes.

          Clearly Neo has died, been to heaven, and come back to tell us all about it. Well done sir....

          Can you point me to where it is located? I'm not a doctor, but I don't ever remember seeing it on an anatomy chart.

    Realistically the answer seems pretty obviously 'no, if they have sentience.'

    Westworld spoilers:
    In terms of Westworld, the sentience thing is a bit tricky; for most of S1 they were shown to clearly not be sentient. Even Maeve, who seemed to be showing signs of sentience, was later revealed to be just following her programming (until she stepped off the train, anyway). As for the S2 stuff, it's tough to really say yet how much is them acting from their own free will and how much is based on their coding; I assume we'll learn that later in the season.

    I guess you could argue that even at their basic level, the Westworld robots are designed to dream, think, solve problems, work together, fulfill needs, etc. They might not meet the human standards of sentience or consciousness, but they would have to be around the same standard as some animals at least.

    Also, slightly off-topic;
    Heather Alexandra: No. I think it was immoral from the moment they could feel pain, let alone achieving proper sentience. I don't hit bugs.
    I do, fuck bugs.

    OK? Dunno, probably not.

    Legally permissible? Absolutely.

    I like Patricia's answer. If some Westworld robot is trying to kill me, I don't care how realistic they are, they're going down with me (because let's face it, they're probably stronger and more resilient than a normal human, heh).

    Ford spelled it out in Season 1: they don't feel anything except for what they are programmed to.

      He also said he later decided he was wrong about that, though, which is why he wrote his new narrative.

    You strike a robot and its reaction tells you it is experiencing pain and suffering. It screams, it winces, it clutches the area hit.

    But is it actually experiencing pain in any form, or is it following a programmed reaction to stimuli? If it is experiencing pain, what is that pain actually like?

      You can make this argument about everything else in the world that isn't yourself. Do animals feel pain like you do? Do other people? Isn't the way you react to pain just programming from your biology?

        Except that in the case of an AI, how it reacts to stimuli is directly chosen by the programmers.

