This story contains spoilers for Season 3 of Westworld.
Every season of Westworld, HBO’s glossy sci-fi show about robot cowboys and the perils of overly ambitious theme-park design, diagnoses a new villain in society. In Season 1, the evil was humanity itself, as illustrated by how easily people unleashed their violent tendencies when allowed to roam free in a grown-up playground. In the second season, the true evil was Westworld’s corporate overlords, who were revealed to be profiting from the park by surveilling its visitors’ every move. The third season, which aired its finale on Sunday, took the show’s plot to its logical conclusion by leaving the park and exploring a dystopian future ruled by big data—a world where every inch of society is predicted and controlled by a room-size algorithmic computer.
The pervasiveness of surveillance and data-gathering in the tech sector is a worthy and timely subject, one that was the central theme of another big sci-fi show this year, Alex Garland’s slickly made yet ponderous Devs. And as Westworld’s latest season went on, it reflected some of the ongoing battles over social and civil liberties happening around the world—battles that have only intensified in light of the coronavirus pandemic. Even though this season was full of expensive-looking locations, intense action set pieces, and A-list cast additions such as Aaron Paul and Vincent Cassel, none of that could disguise a fundamental dramatic shortcoming: No matter how unsettling it may be to think about conglomerates having tons of information on everyone, big data simply doesn’t make for a compelling onscreen villain.
That flaw nagged at me through each of the season’s eight episodes, which dispensed gory action with practiced efficiency but felt hollow nonetheless. The show’s villain problem was crystallized in the season finale, “Crisis Theory,” in which the vengeful robots Dolores (played by Evan Rachel Wood) and Maeve (Thandie Newton) faced off against the trillionaire mogul Engerraund Serac (Cassel) and Rehoboam, the colossal AI he used to impose order on human affairs. Rehoboam, a glowing red sphere that pulsated with energy and spoke in philosophical riddles, was impressive; its demise, which involved flicking a few switches to turn it off, was less so. For the most part, the threat of big data remained amorphous and faceless.
The storytelling impulses that led Westworld creators Jonathan Nolan and Lisa Joy down this narrative path are understandable. Despite its sci-fi trappings (futuristic robots and the like), Westworld has been fundamentally concerned with human behavior from its first episode, analyzing the depravity of Westworld’s attendees alongside the developing consciousness of the “hosts” (the robots built to populate the park). Designed only to mimic reality, the hosts ended up shattering it, rising up against Westworld’s visitors and owners to assert their newly developed independence. The first season turned out to be a curious and arresting parable about how violence is a universal language.
Ever since that robot revolution, Westworld has been stuck trying to find newer angles on the same question: What happens when you combine advanced machine learning with humanity acting on its worst instincts? The second season responded to that prompt by advancing the show’s world-building. It revealed that Delos, the park’s corporate owner, was gathering information on visitors and perhaps looking to make robotic clones out of them, thus inventing a form of artificial immortality. But the plotting was molasses-slow, charting the further disintegration of the theme park over the course of just a few days, and only teasing any exploration of the outside world at the very end.
So Season 3 went full speed ahead with its narrative, starting with Dolores finally being set loose on society and wreaking havoc, and ending with the world descending into an apocalypse, with cities engulfed in rioting and chaos. Westworld the park was largely forgotten. Rehoboam exploded the creepy implications of Delos’s surveillance a thousandfold by functioning as a global oracle of sorts. Because the machine was plugged into everyone’s public and private lives, Serac could use algorithms to literally shape the future, partly by murdering any humans who might rebel. Caleb (Paul) was one such human, enlisted by Dolores to serve as an ally in her quest to topple Rehoboam.
Westworld’s “real world” was perhaps television’s most detailed depiction yet of a horrifyingly plausible future—the data dystopia, where people’s lives are kept in check by corporate forces they’re barely aware of. Through gene editing, stock-market manipulation, and cold-blooded assassination, Serac and his corporate lackeys exerted godlike mastery over society. Throughout the season, he claimed it was all in service of civilization, to help the planet stave off catastrophes like nuclear war; in doing battle with him, Dolores was fighting for free will, and for humanity’s collective right to make those terrible mistakes. Unfortunately, both she and Caleb were dull enigmas throughout the season, failing to demonstrate the independent streaks that supposedly made them different from the reptilian Serac.
It’s too bad, because Nolan and Joy tried to use this season to pose a fascinating question: Is it worth sacrificing some element of free will to live in a relatively safe society? Serac’s world was chilly in its futurism, all glassy skyscrapers and clean, empty streets, but after Dolores turned off his machines and revealed the extent of his domination to the public, those streets erupted into war. In a postmortem interview, Nolan and Joy said they were inspired by the Hong Kong protests of 2019. But they also acknowledged the strange symmetry the show had with more recent protests against government-mandated social distancing, at a time when the tensions between public safety and personal freedoms have been especially profound.
“I never anticipated in a million years anyone would be fucking stupid enough to protest, you know, a disease, right?” Nolan told Variety, referring to the pandemic. “This is the tricky thing about making a show. The idea of social revolution—that’s an idea that can be interpreted and reinterpreted by people however they want.” When Westworld’s third season was being made, it was imagining a bright, tech-led future with a sinister undercurrent. Now even the villainy of big data feels quaint, and Dolores’s bloody fight for free will against any notion of government control may seem more fraught than originally intended.