Dolores seems pretty sure where she stands on the question.
The HBO show Westworld is largely focused on a single question: what does it mean for people to kill and abuse robots that look, act, and possibly even feel like humans?
Westworld challenges viewers by showing "robots" who are played by flesh-and-blood actors. The show tells us they're not real, but of course we know Evan Rachel Wood and Thandie Newton are, which makes it easier for us to take the jump into the show's moral quagmire.
If a robot looked and talked like Dolores, Teddy, or Maeve, would anyone feel ok about shooting that robot in the face? What if the robot was trying to kill them? What if they knew the robot only looked like it was feeling pain? And as we in the real world make our voyage from "1993 Doom Monster" to "2013 Grand Theft Auto Civilian" to, presumably, "2027 Westworld Android," how will we know if we've crossed a line?
Sean Illing at Vox published a cool conversation with Yale moral psychologist Paul Bloom about robo-morality and Westworld, in response to Bloom's New York Times essay on the subject, which he co-authored with Sam Harris.
Their dialogue starts out with Bloom's clear-cut take on the idea. "If we create machines that are just like us, and feel pain and anguish and suffering and shame and all of that stuff, then it would be as wrong to hurt them as it would be to hurt each other," he says. "That's the low-hanging fruit in all of this." Then things immediately become more complicated:
Sean Illing: Well, it's easy if you assume they can actually feel pain, but it's not so easy if it's a machine that's been programmed to replicate human suffering. In that case, is it actually feeling pain or is it just mechanically signalling the experience of pain?
Paul Bloom: Yeah, that's when the hard questions arise. I think it really matters whether the robot is feeling pain or signalling the experience of pain. There's all the difference in the world between a creature that feels pain and really suffers versus something that has no more sentience than a toaster.
So it's possible that you could have Westworld-like robots that look and talk like us but literally have nothing going on inside. They're no more conscious than my iPhone. In that case, there's nothing morally wrong about mistreating [them].
On the flip side, it's possible that we could create machines that don't seem conscious, they don't have human faces and bodies, but might actually be fully conscious organisms, in which case, making them suffer would be as wrong as making a person suffer.
If you've been watching Westworld and have thought about this stuff at all, check out the full conversation in the Vox post. It probably echoes conversations you've been having in your own living room.
I thought I'd ask some of my Kotaku colleagues for their thoughts on the matter. Is it morally ok to kill a Westworld robot? Here's what they said, and bear in mind that I asked them to keep their answers short:
Gita Jackson: No. This is a lot like the question that my vegan ex used to pose to me: is it ok to eat honey? Are invertebrates ok to eat? But the show has pretty decisively shown that they have consciousness and that killing them is at least a cousin to murder. I'm not a vegan and eat honey and shellfish, but that's How I Feel.
Heather Alexandra: No. I think it was immoral from the moment they could feel pain, let alone after they achieved proper sentience. I don't hit bugs.
Riley MacLeod: Yes ... an answer which surprises me.
Chris Person: It's super not OK to kill a Westworld robot. Like the entire series is telegraphing (at least based on the first season) that humanity is not that special in the grand scheme of things and that the difference between us and the robots is really an aesthetic distinction at best.
Stephen Totilo: No, because the Westworld robots have shown that they might have sentience and be capable of personal agency. That is not true for the enemies I can kill in Red Dead Redemption. This does get uncomfortably close to abortion discussions about when life begins, of course.
Jason Schreier: I think that it'd be incredibly disturbing for anyone to take the life of a facsimile that's so uncannily close to being human, and given what we've learned over the series about the Westworld robots' evolving consciousnesses, I'd say it's 100% not OK.
Patricia Hernandez: Sure, if they're trying to kill you - which they might be, in the current season! If they're just being chill, I'd say no.
I turn it over to you, dear readers. Given what we know about them so far, is it morally ok to kill a Westworld robot?