Pop Culture Is Finally Getting Hacking Right

Movies and TV shows have long relied on lazy and unrealistic depictions of how cybersecurity works. That’s beginning to change.


The idea of a drill-wielding hacker who runs a deep-web empire selling drugs to teens seems like a fantasy embodying the worst of digital technology. It’s also, in the spirit of CSI: Cyber, completely ridiculous. So it was no surprise when a recent episode of the CBS drama outed its villain as a video-game buff who lived at home with his mother. For a series whose principal draw is watching Patricia Arquette yell, “Find the malware!”, that sort of stereotypical characterization and lack of realism is to be expected.

But CSI: Cyber is something of an anomaly when it comes to portraying cybersecurity on the big or small screen. Hollywood is putting more effort into creating realistic technical narratives and thoughtfully depicting programming culture, breaking new ground with shows like Mr. Robot, Halt and Catch Fire, and Silicon Valley, and films like Blackhat. It’s a smart move, in part because audiences now possess a more sophisticated understanding of such technology than they did in previous decades. Cyberattacks, such as the 2013 incident that affected tens of millions of Target customers, are a real threat, and Americans generally have little confidence that their personal records will remain private and secure. The most obvious promise of Hollywood investing in technically savvy fiction is that these works will fuel a grassroots understanding of digital culture, including topics such as adblockers and surveillance self-defense. But just as important is a film and TV industry that sees the artistic value in accurately capturing a subject that’s relevant to the entire world.

In some ways, cyberthrillers are just a new kind of procedural—rough outlines of the technical worlds only a few inhabit. But unlike shows about lawyers, doctors, or police officers, shows about programmers deal with especially timely material. Perry Mason, the TV detective from the ’50s and ’60s, would recognize the tactics of Detective Lennie Briscoe from Law & Order, but there’s no ’60s hacker counterpart to talk shop with Mr. Robot’s Elliot Alderson. It’s true that what you can hack has changed dramatically over the past 20 years: The amount of information is exploding, and expanding connectivity means people can program everything from refrigerators to cars. But beyond that, hacking itself looks pretty much the same, thanks to the largely unchanging appearance and utility of the command line—a text-only interface favored by developers, hackers, and other programming types.

Laurelai Storm / GitHub

So why has it taken so long for television and film to adapt and accurately portray the most essential aspects of programming? The usual excuse from producers and set designers is that it’s ugly and translates poorly to the screen. As a result, the easiest way to portray code in a movie has long been to shoot a green screen pasted onto a computer display, then add technical nonsense in post-production. Faced with dramatizing arcane details that most viewers at the time wouldn’t understand, the overwhelming temptation for filmmakers was to amp up the visuals, even if it meant creating something utterly removed from the reality of programming. That’s what led to the trippy, Tron-like graphics in 1995’s Hackers, or Hugh Jackman bravely assembling a wire cube made out of smaller, more solid cubes in 2001’s Swordfish.

A scene from Hackers (MGM)
A scene from Swordfish (Warner Bros.)

But more recent depictions of coding are much more naturalistic than previous CGI-powered exercises in geometry. Despite its many weaknesses, this year’s Blackhat does a commendable job of representing cybersecurity. A few scenes show malware reminiscent of this decompiled glimpse of Stuxnet—the cyber superweapon created as a joint effort by the U.S. and Israel. The snippets look similar because they’re both variants of C, a popular programming language commonly used in memory-intensive applications. In Blackhat, the malware’s target was the software used to manage the cooling towers of a Chinese nuclear power plant. In real life, Stuxnet was used to target the software controlling Iranian centrifuges to systematically and covertly degrade the country’s nuclear enrichment efforts.

An image of code used in Stuxnet (GitHub)
Code shown in Blackhat (Universal)

In other words, both targeted industrial machinery and monitoring software, and both seem to be written in a language compatible with those ends. That means Hollywood producers took care to research what real-life malware might look like and how it’d likely be used, even if the average audience member wouldn’t know the difference. Compared to the flashy visuals of navigating a virtual filesystem in Hackers, where early-CGI wizardry was thought the only way to retain audience attention, Blackhat’s commitment to the terminal and actual code is refreshing.

Though it gets the visuals right, Blackhat highlights another common Hollywood misstep when it comes to portraying computer science on screen: It uses programming for heist-related ends. For many moviegoers, hacking is how you get all green lights for your getaway car (The Italian Job) or stick surveillance cameras in a loop (Ocean’s Eleven, The Score, Speed). While older films frequently fall into this trap, at least one action hacker flick sought to explore how such technology could affect society more broadly, even if it fumbled the details. In 1995, The Net debuted as a cybersecurity-themed Sandra Bullock vehicle that cast one of America’s sweethearts into a Kafkaesque nightmare. As part of her persecution at the hands of the evil Gatekeeper corporation, Bullock’s identity is erased from a series of civil and corporate databases, turning her into a fugitive thanks to a forged criminal record. Technical gibberish aside, The Net was ahead of its time in tapping into the feeling of being powerless to contradict an entrenched digital bureaucracy.

It’s taken a recent renaissance in scripted television to allow the space for storytellers to focus on programming as a culture, instead of a techy way to spruce up an action movie. And newer television shows have increasingly been able to capture that nuance without sacrificing mood and veracity. While design details like screens and terminal shots matter, the biggest challenge is writing a script that understands and cares about programming. Mr. Robot, which found critical success when it debuted on USA this summer, is perhaps the most accurate television show ever to depict cybersecurity. In particular, programmers have praised the show’s use of terminology, its faithful incorporation of actual security issues into the plot, and the way its protagonist uses real applications and tools. The HBO comedy series Silicon Valley, which was renewed for a third season, had a scene where a character wrote out the math behind a new compression algorithm. It turned out to be fleshed out enough that a fan of the show actually recreated it. And even though a show like CSI: Cyber might regularly miss the mark, it has its bright spots, such as an episode about car hacking.

There’s a more timeless reason for producers and writers to scrutinize technical detail: because it makes for good art. “We’re constantly making sure the verisimilitude of the show is as impervious as possible,” said Jonathan Lisco, the showrunner for AMC’s Halt and Catch Fire, a drama about the so-called Silicon Prairie of 1980s Texas. The actress Mackenzie Davis elaborated on the cachet such specificity could lend a show: “We need the groundswell of nerds to be like, ‘You have to watch this!’” The rise of software development as a profession means a bigger slice of the audience can now tell when a showrunner is phoning it in, and pillory the mistakes online. But it’s also no coincidence that Halt and Catch Fire is on the same network that was once home to that other stickler for accuracy—Mad Men.

Rising technical literacy and a Golden Age of creative showrunners have resulted in a crop of shows that infuse top-notch storytelling with an easy but granular technical understanding. Coupling an authentic narrative with technical aplomb can allow even average viewers to intuitively understand high-level concepts that hold up under scrutiny. And even if audiences aren’t compelled to research on their own, the rough shape of a lesson can still seep through—like how cars are hackable, or the importance of guarding against phishing and financial fraud. But above all, more sophisticated representations of hacking make for better art. In an age of black mirrors, the soft glow of an open terminal has never radiated more promise.