Westworld: Robots, Ethics, and What it Means to be Human

Remaking the West (World)

Can robots think and feel just as humans do? What makes us human? And what does it mean to be human? These are some of the main questions posed by HBO’s new, high-concept series, Westworld. Produced by Jonathan Nolan (brother and collaborator of Christopher Nolan), Lisa Joy (writer for Pushing Daisies and Burn Notice), and J.J. Abrams (who seems to be everywhere of late, from Star Trek to Star Wars, like the Steven Spielberg of the glorious Amblin years), Westworld is based on the 1973 film of the same name, written and directed by the late Michael Crichton. The original film, more a charming B movie than a Hollywood superproduction, took place in an amusement park that allowed guests to live out their every fantasy in three different worlds of the past (Medieval World, Roman World, and West World, although most of the movie took place in the last one, hence the name of the film). The movie, like most of Crichton’s books, mixed science and entertainment, and was an early exploration of the dangers of artificial intelligence (AI).

HBO’s Westworld is an update on that premise, although more ambitious in its scope (the show is supposed to be mapped out for 5-7 seasons, and it is expected to cover a different world each year), as well as in its exploration of the philosophical repercussions of AI. The series begins with an introduction to the park and some of its main characters, which can be broadly divided between the guests, people who pay $40,000 a day to be in the park, and the hosts, the incredibly human-like androids whose role is to portray archetypal characters of the Wild West for the amusement and entertainment of the guests (sheriff, bandits, prostitutes, gamblers, civil war soldiers, etc.).

Among the main hosts we have Dolores, the quintessential damsel in distress, played by Evan Rachel Wood; Maeve, a madam who has seen better days, played by Thandie Newton; and Teddy, our earnest hero, played by James Marsden. Among the guests we have William and his friend (and likely brother-in-law) Logan, played, respectively, by Jimmi Simpson and Ben Barnes. The most intriguing guest, though, is the mysterious Man in Black, a long-time visitor to the park played by Ed Harris who, according to a popular internet theory, is an older version of William. The Man in Black is cruel and ruthless; he is convinced that the park hides a secret (a center in the middle of a maze), and he will stop at nothing to find it.

Some of Westworld’s characters. Clockwise from top left: Dolores, played by Evan Rachel Wood; the Man in Black, played by Ed Harris; Dr. Robert Ford, played by Anthony Hopkins; and Maeve, played by Thandie Newton.

The rest of the cast includes the people responsible for the creation and maintenance of the park. There is Dr. Robert Ford, the creator of the park and the person who may be behind the recent emergence of consciousness among some of his creations, and Bernard, played by Jeffrey Wright, who is responsible for the maintenance of the androids and is particularly fascinated by the emergence of human-like behavior in the host Dolores (there is an interesting theory that he is actually an android himself).

The Promise and Dangers of Artificial Intelligence in Film and TV

Westworld’s main theme, the promise and dangers of artificial intelligence, has a long-standing tradition within the science fiction genre. Stanley Kubrick’s 2001: A Space Odyssey (1968), Ridley Scott’s Blade Runner (1982), WarGames (1983), James Cameron’s The Terminator (1984), the Wachowskis’ The Matrix (1999), Steven Spielberg’s A.I. (2001), TV shows such as Battlestar Galactica (both the 1978 and 2004 versions), Almost Human (2013), and Black Mirror (2011), and recent films such as Spike Jonze’s Her (2013) and Alex Garland’s Ex Machina (2015) are good examples of the genre.


Westworld inserts itself (thematically, philosophically, stylistically) into the tradition established by those films and TV shows, while also echoing recent warnings about the dangers of artificial intelligence expressed by some of the most brilliant minds of our generation. Stephen Hawking warned us in 2014 that AI could spell the end of mankind, Elon Musk called it “our biggest existential threat,” and Bill Gates, in a less dramatic but no less worrying tone, reminded us of the dangerous unintended consequences of AI.

To Know is to Suffer: On Consciousness and Being Human

The first episode of the show introduces two central and related questions: What makes us human? and What does it mean to be human? The show approaches these questions by exploring the emergence of consciousness in some of the park hosts, as well as through the ethical consequences of the actions undertaken by the guests while in the park.

The emergence of consciousness seems to be connected to a new update designed by Dr. Ford that has made the hosts more human-like, but has also given them flashbacks of their (mainly traumatic) past experiences in the park, causing them to question the nature of their memories (where do they come from?), the nature of their existence (who are they?), and the reality of the world they live in. The show takes quite the existential approach, since the emergence of consciousness is inextricably connected to the emergence of suffering and pain: to know, the show seems to hint, is to suffer. This connection between knowledge and pain would also explain the name of the first host to achieve consciousness, Dolores, which means pain or suffering in Spanish.

The existential and dark tone of the show reminds me of the first season of another HBO show, True Detective, which also presented a senseless, meaningless, and violent world. Matthew McConaughey’s character is not only a detective searching for the solution to a violent crime; he is a detective in search of truth, the ultimate truth at that, the one that will explain the reason for our apparently meaningless existence. Dolores and Maeve are on a similar existential quest, looking for answers to who they are and to the ultimate meaning of life.

To Kill or Not to Kill: Ethics in Westworld

Parallel to the hosts’ path of self-discovery, we have a similar search among the guests, portrayed by William and Logan. In a clear, almost simplistic (but effective) way, William and Logan represent the duality between the best of human nature and the worst of it. They are the yin and yang of human nature. William is the good and principled everyday guy, while Logan represents all of our lower and darker instincts. That duality is clearly expressed in their choice of hats when they enter the park: William chooses a white one, Logan a black one (Westworld may be a sophisticated show, but subtle it is not).

If some of the hosts are struggling with their newly found consciousness, William and Logan are challenged by their newly found freedom. The park encourages them to explore their every fantasy in a safe environment. Logan embraces this freedom and takes it as an invitation to do as he pleases: having constant sex with the hosts, killing many of them (justifiably or not), and so on. William, on the other hand, acts in the park as he would in the “real” world, refusing to have sex with the hosts (he claims he is committed to someone outside the park) or to kill androids unnecessarily. The questions they face are obvious: do their actions in the park have “real” moral consequences? Is it OK to kill and rape hosts if they are not human? And what do the notions of “human” and “real” mean anyway? The show seems to invite an intriguing but superficial reading of Nietzsche and his concept of the Übermensch by presenting the park not only as a place of entertainment, but as a place in which the guests can explore who they are, uninhibited by traditional notions of good and evil. The initial responses to these moral questions present a clear dichotomy between William and Logan, but the show is already hinting at the complex nature and consequences of actions in the park, since we are apparently watching the “breaking bad” of William and his transformation into the Man in Black.

Lost in Westworld: Searching for its Center

Another big theme of the show is the ultimate purpose of the park: what is this park? Where is it? And what is it for? These questions are particularly relevant in the storyline that focuses on the Man in Black, a character who has visited the park for 30 years and who is convinced that the park hides a secret, a center, a truth hidden by its creators that he is determined to discover.


This aspect of the show reminds me of another series produced by JJ Abrams: Lost. Westworld, like Lost, presents a mysterious world: we don’t know where Westworld is (on earth? in a dome?), we don’t know when it is (there seem to be at least two timelines running at once), and we do not know anything about the world outside of the park.

As in Lost, it is this lack of knowledge that fuels the characters, as well as the viewers, who are searching for answers to multiple questions in their ultimate quest for meaning. The characters’ search for the center of the maze, and for answers about the nature of their existence, is mirrored by our search, as viewers, for the meaning of what the show is trying to tell us. This is what I have previously called on this blog reading popular culture as scripture, based on the religious scholar Wilfred Cantwell Smith’s understanding of scripture not simply as the holy text of a particular religious tradition, but also as the “fluid and intimate relationship that a particular community develops with a particular text.”

As with Lost, there is a growing community of people who watch each episode of the show multiple times looking for clues that will help them put the pieces of the puzzle together. There are forums on Reddit dedicated to various theories about the show, several podcasts, and many websites that explain in detail the events of every episode. The show, in line with Cantwell Smith’s definition of scripture, is creating a hermeneutic community that watches it not only as entertainment, but as a vehicle for thinking about important questions: a community that is active online, that shares its ideas, its theories, and the clues it has found, creating a passionate dialogue about the future of AI technology, its promises, but also its dangers.

Let’s hope that Westworld lives up to its early promise and keeps the intensity of the plot, as well as the smart writing, beyond the first few episodes. So far, I am hooked.
