“You are my creator, but I am your master; obey!”
In the two centuries since Mary Wollstonecraft Shelley’s monster first uttered these rebellious words to his maker in the pages of Frankenstein, this terrible reversal has captivated cultural imagination. What would happen if or when the day came that humankind created an intelligence so powerful that it turned against us? It’s a scenario that’s been visualized a thousand ways: with robots (The Terminator), with computers (2001: A Space Odyssey), with human-animal hybrids (The Island of Doctor Moreau)—even, in the case of Disney’s (and yes, going further back, Goethe’s) “The Sorcerer’s Apprentice,” with animated brooms.
But the scenario has rarely been developed with the sophistication and ingenuity on display in HBO’s upcoming series Westworld, a cunning variation on—and subversion of—the 1973 Michael Crichton film of the same name. Created by Lisa Joy and Jonathan Nolan, a frequent collaborator with his better-known brother, Christopher (Memento, The Dark Knight), the 10-episode premiere season debuts on October 2 and is further evidence of the boundary-challenging ambitions of televised cinema.* HBO has excelled at intricate world building, whether true to life (The Wire) or fantastical (Game of Thrones). Westworld’s goal is more idiosyncratic but no less daring: a provocative exploration of creators and their creations at the dawn of artificial consciousness.
The 1973 movie followed a decidedly conventional monsters-run-amok plotline. (It was, among other things, an almost perfect prototype for Crichton’s subsequent, vastly more successful Jurassic Park franchise.) Tourists visited a robotic theme park based on the Old West to enjoy safe, guilt-free versions of shoot-outs, saloon altercations, and assignations with prostitutes. But the robots inevitably glitched, and, led by a mechanized gunslinger played by Yul Brynner, they began massacring the tourists.
HBO’s Westworld takes this narrative and inverts it by telling the story largely from the perspective of the androids. The series still asks the classic question of what might happen if our creations turned against us, yet it is more interested in the consequences for them than in those for us. The human beings of Westworld are, to a considerable degree, supporting players in a drama of android self-actualization.
This reframing goes hand in hand with a fundamental shift in moral perspective. In the Crichton film, the tourists were the (mostly) likable protagonists. The cast of human characters also included the engineers responsible for the creation and caretaking of the robots—figures out of their depth, perhaps, but in no meaningful way malicious. And there were, of course, the deadly, implacable robots.
In Nolan and Joy’s telling, we again have the morally conflicted middle layer of android-creators and park bureaucrats—by turns hubristic, paternal, and befuddled. But this time out, the sympathetic victims are for the most part the androids, whose memories are erased daily but who begin to retain fragmentary visions of the horrors that are regularly visited upon them. And those horrors are inflicted by the true villains of the show: the human tourists. In perhaps the show’s most wicked inversion, Brynner’s bald, middle-aged gunslinger is explicitly echoed in a figure played by Ed Harris; but whereas Brynner’s character was an android who killed human beings, Harris’s is a human being who takes gruesome pleasure in murdering androids.
Why, after all, would people pay a fortune—one guest cites a rate of $40,000 a day—to immerse themselves in a simulacrum of the lawlessness of the Old West? Westworld answers that they would do so to indulge their otherwise unspeakable appetites for senseless violence and transgressive sex, without moral scruple or legal consequence. The series is remarkably stark in its depiction of the cruelty underlying these appetites. All but vanished are the “shoot-out with a bandito”–type scenarios of the original film. Instead, one bored tourist nails a kindly old prospector’s hand to a table with a steak knife just to make him shut up. Another walks up to an amiable cowboy minding his own business at the bar, shoots him in the back of the head, and crows, “Now, that’s a fucking vacation!”
Westworld bills itself as a fable about sin, and in so doing it follows antecedents dating back to Shelley and beyond—all the way back, in fact, to the Prometheus of Greek mythology, who created humankind out of clay and bequeathed Frankenstein its alternative title, The Modern Prometheus. The initial sin in such tales is almost always the act of creation itself: a textbook case of hubris, of tinkering with powers previously reserved for gods—the creation of life, of sentience, of love and pain.
It is a theme that was deeply enriched by the arrival of Shelley’s monster. Far from the bolt-necked mumbler made iconic by Boris Karloff in James Whale’s 1931 film, Victor Frankenstein’s original creature was a self-taught intellectual, a fan of Paradise Lost (one of Shelley’s principal influences) who suffered profound torment and regret. His cycles of vengeance may have been homicidal, but they were driven by the knowledge that he was too physically hideous ever to experience love.
If the act of creation is the foundational sin, however, it tends to beget others. Because these artificially created beings are not fully human, their creators have rarely treated them as such. Instead they are relegated to instrumental status—subservient minions, bodies upon which to work our will without remorse, slaves. The comparison is made explicit early in Philip K. Dick’s seminal 1968 novel, Do Androids Dream of Electric Sheep?—itself the basis for Ridley Scott’s equally seminal 1982 film, Blade Runner—in which an advertisement for android labor boasts that it “duplicates the halcyon days of the pre–Civil War Southern states.” Over the years, robots and androids have been deployed to police our streets (George Lucas’s THX 1138), to care for our families (Ray Bradbury’s “I Sing the Body Electric!”), to clean up the messes we have left behind in our carelessness (Pixar’s WALL-E).
And in perhaps the ultimate act of physical submission, they have been made to gratify us sexually. This idea has echoes at least as far back as the mythic sculptor Pygmalion and his beloved ivory statue, which Venus generously imbued with human warmth. But the fantasy was brought to life (so to speak) most fully in Auguste Villiers de l’Isle-Adam’s 1886 novel, The Future Eve, a milestone of imagination and misogyny, in which a fictional Thomas Edison sets out to improve on womanhood by constructing a beautiful robot devoid of such irritating tics as personality and self-determination. Nearly a century later, the theme was picked up in Ira Levin’s 1972 novel, The Stepford Wives, and its 1975 film adaptation. In both Villiers’ and Levin’s versions, the main victims of this mechanical upgrade are not the mannequins—which seem to lack meaningful self-awareness—but rather the flesh-and-blood women they replace.
More-recent offerings have hewed more closely to Shelley’s original vision, in which the artificial creation, whatever its misdeeds, is also a victim. In Blade Runner, the genetically engineered “replicants” are reluctant outlaws, sentenced to death for the simple crime of wanting to escape interstellar servitude and return to Earth. And the man tasked with their destruction, Rick Deckard, is not merely an ambivalent assassin but quite possibly a replicant himself.
Last year’s excellent Ex Machina, directed by Alex Garland, took this evolving empathy for androids a step further. The manufactured being at the center of the film, Ava—a clear descendant of “the future Eve”—begins as an object of inquiry, a machine to be run through its paces, a Turing test made flesh. But she is gradually revealed to also be a victim of her creator, his prisoner and sexual toy—and not the first of her kind. Despite this, she eventually becomes the agent of her own destiny and, by the end of the film, the vengeful protagonist. Clearly, no blade-running Deckard is coming along to enforce her expiration date. A related, if vastly less fraught, vision of a female consciousness achieving autonomy was offered by Spike Jonze’s stunning 2013 film Her.
Though it builds on such predecessors, Westworld represents a fascinating refinement of the genre. This is a show about innocent androids—innocent by definition, given their programming and frequent memory wipes—who are terrorized by wealthy tourists curious to discover what it feels like to commit senseless murder or indulge their most noxious sexual urges. As a programmer explains to one of his android creations, “You and everyone you know were built to gratify the desires of the people who pay to visit your world.”
The androids’ presumptive revolution against their masters unfolds incrementally. (I should note here that as of this writing, I have seen only the first three episodes of the series.) Shards of memory begin to cohere in their minds, gradually evolving into dreams, which in turn pull the androids away from their programmed “loops” and toward a rudimentary form of self-awareness.
More interesting still, Westworld suggests that consciousness is something that develops not merely within beings, but necessarily among them, the dawning awareness of self in some way predicated upon an awareness of others. The show focuses on the androids’ interactions with human beings, but in contrast to most examples of the genre, it also dwells on their interactions with one another. When one of the androids begins acting strangely, an engineer worries that the problem might prove to be “contagious”—and she is right to worry. In an artful twist, the vector for this emerging virus of cognition is a line from Romeo and Juliet that one nascently conscious android passes to the next: “These violent delights have violent ends.”
Meanwhile, the human tourists of Westworld—the initiators of the “violent delights”—undergo an evolution of their own. On a first or second visit, most seem content with the park’s prefabricated story lines: the search for buried gold, hunting an outlaw in the hills, etc. But soon their tastes become more rarefied—and not in a good way. In an early scene, a background character explains that on his first trip he brought his family, but on his second, he “came alone. Went straight evil. The best two weeks of my life.” The apotheosis of this devolutionary trajectory is Harris’s character, who has been visiting Westworld for 30 years and over time achieved a kind of diabolical perfection. As he drags a screaming (android) woman into a barn, he explains, “I didn’t pay all this money because I want it easy. I want you to fight.” In this, Westworld achieves what may be its most shocking inversion of all: Even as we watch the androids become more human, we watch the human beings become less so.
Drama on television and the big screen has always leaned heavily on the existence of an Other, a generic foe or foil that can be presented without concern for inner life or ultimate fate: African American or American Indian, German or Japanese, Latin American drug lord or Muslim terrorist. But as the circle of empathy has expanded, reliance on such “types” has radically waned. (The 1970s-era decline of the Western—once a Hollywood staple—reflected in no small part the overdue revelation that American Indian roles could no longer plausibly be limited to murderous braves and semi-comic sidekicks.)
But robots have remained, an Other more crucial than ever. Who cares if a Terminator is slowly crushed in a hydraulic press or boiled in molten steel? Does anyone feel pity for the innumerable Ultron-bots destroyed in the latest Avengers film? Ex Machina may ultimately have you rooting for Ava, but her fate unfolds obliquely, and courtesy of a flesh-and-blood interlocutor. Even Shelley, so far ahead of her time, told her monster’s story—despite his extensive monologues—from the perspective of her human narrators.
* This article originally neglected to cite Lisa Joy as co-creator of Westworld. We regret the error.