Forest (Nick Offerman) is the delphic overlord of a “quantum AI” company called Amaya in Devs. (FX)

Devs, the new eight-part drama written and directed by Alex Garland (Ex Machina, Annihilation), is the kind of series that signals its grandiosity from the word go, with an abstract montage featuring choral music, saxophone interruptions, and fragmented scenes of San Francisco. In the opening seconds, the camera pushes in slowly on the darkened features of Forest (played by Nick Offerman), a tech-company CEO with a bedraggled beard and a frozen expression, like a GEICO caveman who’s seen some stuff. Then it cuts to a triptych of video installations featuring a small girl blowing puffy white seeds off a dandelion. Devs is immediately ponderous, alienating, and full of unintentionally funny details: Why is there a 100-foot-high sculpture of that same small girl in the middle of the redwoods? Has the Golden Gate Bridge always seemed so IKEA-poster generic? Why is the most high-tech coding campus in Silicon Valley as gilded and blandly opulent as a Mandarin Oriental business center?

With Devs (one of the first shows to air on Hulu under the “FX on Hulu” mantle), and with the third season of Westworld, which debuts on HBO on Sunday, TV seems to be entering its age of algorithmic anxiety. There are no robots in Devs, but the characters are so flatly preoccupied with determinism—and with data’s potential ability to assess and contain the complexity of human lives within lines of code—that there may as well be. Every character in the show seems oddly muted in some way, tranquilized into mechanical acquiescence. It’s not that Offerman doesn’t have the range to play Forest, the delphic overlord of a “quantum AI” company called Amaya, with its unspecified products and creepy child logo. It’s that on-screen, the actor practically bursts with ebullience, and this is a whimsy-free zone. I burst out laughing when, in one scene, Forest shoved salad into his face without using any utensils, like a combless Amy Klobuchar. It was the one scene in eight plodding hours when Devs, for a minute, seemed as if it were in on the joke.

Sonoya Mizuno (Ex Machina) plays Lily, a young employee at Amaya who commutes cozily to work from San Francisco each day on the company bus with her boyfriend, Sergei (Karl Glusman). In the first episode, Sergei is handpicked by Forest to join “devs,” Amaya’s top-secret development initiative. The story unfolds in oblique layers: Sergei’s walk down a (literal) garden path toward devs’ concrete-sealed headquarters; his introduction to the code that spells out what devs actually does; his visceral shock in response. By the time Sergei goes missing, viewers have seen enough to know that the “official” security footage of his dramatic self-immolation at the feet of Amaya’s enormous child idol is entirely unreliable.

Devs is one of several recent shows whose characters are flatly preoccupied with data’s ability to contain the complexity of human lives within lines of code. (FX)

Devs is only the latest in a series of puzzle-box shows more preoccupied with their own cleverness and their labyrinthine twists than with the burden of watchability. The past two seasons of Westworld have prized complexity over coherence; the work of Sam Esmail, specifically USA’s Mr. Robot and Amazon’s Homecoming, has set a tone for jarring, dour auteur-driven drama. Garland’s own style is distinct (think the chilling, philosophical agitations of Ex Machina or the vivid eco-horror of Annihilation), and yet the director seems to have come to television, like so many of his film peers, with little sense of what the medium offers other than extra time. The mysteries of Devs don’t unspool so much as leak out in a torturously slow drip. And the show’s aesthetic details—the score by Ben Salisbury and Portishead’s Geoff Barrow, the Kubrickian jumps and color-blocked portrait shots—feel so detached from the story that they’re often insufferable.

The overarching theme within Devs is the relationship between data and determinism. The more data tell us about ourselves, the more we can predict human behavior and the more free will is eroded, Garland suggests. Forest and his deputy, Katie (Alison Pill), talk about the delineations and divides within determinist theory in surprising detail (although the show’s casual treatment of quantum computing and the meaning of the multiverse might send you straight to Google). The universe, Forest explains to Sergei in the first episode, is “godless and neutral and defined only by physical laws.” Humans “fall into an illusion of free will,” he argues, “because the tramlines are invisible.” But they’re there, all the same.

Oddly, Garland seems flummoxed by simpler dialogue. “Sir, you’ve got more money than God,” an employee tells Forest. “You think I care about money?” he replies. “You did once.” “Well, now I don’t.” Lily’s ex-boyfriend tells her, “I know you. You do stuff. The stuff other people only think about, you go ahead and do it.” By the time a character in the final episode trots out the old “Don’t blame me; it was predetermined” excuse, it’s hard to believe that these characters are human beings at all.

The more data tell us about ourselves, the more we can predict human behavior and the more free will is eroded, Garland suggests. (FX)

Garland was an art-history major, and Devs leans heavily on the idea that art in particular is what separates humans from advanced artificial intelligence. A pivotal character quotes Larkin and Yeats and cites Bach and Coltrane as the pinnacle of human significance. So it’s ironic that Devs is so robotic. Interactions between characters are as languid and ambiguous as those in a Harold Pinter play, without the accompanying tension. If Garland is trying to make the point that working in tech has robbed these people of their souls, he’s succeeded. But he has also left his show devoid of animation, of passion, of any emotion that might pull the story out of the automated culture he’s trying to indict.

The determinism-driven paranoia of Devs dances through Season 3 of Westworld, which for the first time leaves the park of the show’s title to explore what real life looks like in a world that abuses robots. The first two seasons of Westworld explored the idea that the AI “hosts” were being tortured—not only by the physical and sexual violence they endured at the hands of the park’s glumly sadistic visitors, but by the scripted narratives, or story loops, that suppressed their agency, their ability to think or feel for themselves outside of their coding. In Season 3, whose tagline is “Free will is not free,” the show suggests for the first time that the robots aren’t the only ones whose lives conform to existing scripts. One of the new villains in the first four episodes is Serac (played by the French actor Vincent Cassel), a reclusive trillionaire who’s found an algorithm that uses data to predict the future for every human on Earth.

The show’s creators, Lisa Joy and Jonathan Nolan, seem bruised by some of the criticism of the second season, with its indecipherable strata of conceits involving the Man in Black, the mythical game-within-a-game, the potential immortality of both humans and robots, and a timeline jumpier than a flea circus. So in Season 3, everything is markedly different. In a futuristic, Blade Runner–esque Los Angeles, a new character, Caleb (played by Breaking Bad’s Aaron Paul), tries to navigate life as a veteran, unfulfilled by his day job as a construction worker and disaffected by his nighttime activities of app-enabled petty crime. Through Caleb, Westworld suggests that humans outside the park are being manipulated into following the same prewritten paths, the same “tramlines,” as Dolores (Evan Rachel Wood), Maeve (Thandie Newton), and the other hosts.

Determinism aside, this is a zanier, sillier Westworld, and much more entertaining for it. When Caleb successfully commits crimes, his app tells him, “You made bank, now get drank.” Self-flying drones transport the ludicrously wealthy from one rooftop poolside martini bar to another. Dolores apparently escaped Westworld at the end of Season 2 to joyride on motorcycles in scarlet bandage dresses, as if she’d somehow teleported into the Fast and the Furious franchise. Marshawn Lynch plays a new character whose T-shirt acts as a mood ring, spelling out whether he’s angry or amused or bored.

But through it all, the show’s anxiety about free will—and its extension of that conundrum to its human characters—is apparent. That everything takes place in a near-future landscape complete with product placement for Coach and Tory Burch only makes the show’s interrogation of subliminal manipulation more ironic. How much do we actually decide things for ourselves, Westworld wonders, and how much are we steered by the systems around us? “You and I are a lot alike,” Dolores tells Caleb in one scene. “They put you in a cage, Caleb. Decided what your life should be. They did the same thing to me.” If Devs is angsting over the moral and existential significance of technology that still seems—for now—out of reach, Westworld is taking the data mining and user profiling of contemporary life to its logical, dystopian extreme. The requisite “We’re not so different, you and I” speech here comes not between hero and villain, but between human and robot.
