What Motivated the YouTube Shooter?

The suspect, Nasim Aghdam, was driven by animus toward the video site’s policies.

Police officers stand guard outside YouTube headquarters following the shooting on April 4. (Elijah Nouvelage / Reuters)

On Tuesday, a woman armed with a 9-millimeter handgun entered the YouTube headquarters in San Bruno, California, during lunchtime. She found a courtyard where employees were eating, and she began shooting, injuring three people before killing herself. One of the victims remains in critical condition.

Police have identified the suspect as Nasim Aghdam, a 39-year-old woman from Southern California. They have not yet identified a motive.

But press coverage has filled in some of the gaps. Aghdam’s brother told CNN that he was worried about his sister in the days before the attack. After she stopped answering her phone this weekend, and after her car turned up near Google’s headquarters in Mountain View, he telephoned the police.

“I googled ‘Mountain View,’ and it was close to YouTube headquarters. And she had a problem with YouTube,” he told the network. He told police: “She went all the way from San Diego, so she might do something.”

According to the Los Angeles Times, investigators are looking at a website bearing Nasim Aghdam’s name. (Authorities haven’t yet confirmed the website was hers.) The garish and poorly organized site rails against YouTube and its workers. “There is no equal growth opportunity on YOUTUBE or any other video sharing site, your channel will grow if they want to!!!!!” it says.

In one section, the writer alleges that “close-minded youtube employees ... began filtering my videos to reduce views & suppress & discaurage [sic] me from making videos.”

In another erratic paragraph, the writer warns that “dictatorship exists in all countries,” then alludes to Hitler’s “big lie” theory of propaganda. “There is no free speech in real world & you will be suppressed for telling the truth that is not supported by the system. Videos of targeted users are filtered & merely relegated, so that people can hardly see their videos!” she says.

She also appears to claim on the site that “anti-vegan ... criminals” inserted a nail into her car tire.

She embeds a number of videos from other YouTube users. (There is no reason to believe that any of these individuals has a connection to Aghdam.) One video, from an editor of the conspiracy-theory site Infowars, alleges that pop culture is manipulated to encourage “a toxic identity that the music industry force-feeds young people.” Another, from a social-media vegan advocate, is titled “I’m Being Censored | YouTube’s War Against Vegans.” In the video, the advocate claims that YouTube placed an inappropriate age restriction on one of her videos, asserting that it showed violence.

The website’s author also embeds a video from Casey Neistat, a popular YouTube personality, explaining and criticizing the social network’s move toward “demonetizing” videos.

“If YouTube determines your video to be not advertiser friendly, it pulls all monetization so you don’t make any money from that video. That sort of, kind of, almost makes sense,” Neistat says. “Where it doesn’t make sense is in just how nebulous the terms are that determine just what ‘advertiser-friendly’ is.”

He argues that YouTube’s algorithms, which filter the site, interpret terms like “excessive violence” and “excessive swearing” too broadly.

Neistat’s video is about an ongoing and well-documented phenomenon: the “Adpocalypse” that struck YouTube early in 2017. After a number of major corporations, including Amazon and Coca-Cola, discovered their ads ran next to hateful content on the site, they stopped advertising on the service. YouTube responded by releasing algorithmic filters across the site, which analyzed videos and stripped all advertising from offending ones. But the filters were imperfect, and some of YouTube’s categories seemed overbroad: Some videos about “sensitive social issues” or “tragedy and conflict” were stripped of ads, according to New York magazine.

Immediately after the new policy took hold, some popular YouTubers told TubeFilter, an industry publication, that they lost almost two-thirds of their income.

The Atlantic has had some experience with a version of this phenomenon: Last month, YouTube removed a video depicting a white-nationalist rally, captured by Atlantic journalists, from its search and recommendation engines. The social network seemed to believe the video was extremist in nature, not journalism about extremists. It restored the video to its search pages after the publication protested.

Taken together, the two interests that unite Aghdam’s apparent online presence are an aggressive support for veganism and a suspicion of YouTube’s policies. Neither of these subjects is outside the mainstream by itself, of course, though Aghdam’s writing about them was especially confused and rambling.

It’s a cliché, at this point, to call the major social networks “communities.” The term is inaccurate, but it implies something true: that the networks are more than just their software, that they’re full of people who have a real or imagined stake in their structure. Those people care about what the networks are like, arguing endlessly over who participates in them, who wields power in them, and what their ideal form would look like.

This is another way of saying that social networks like YouTube don’t just affect national electoral politics. Instead, they have a strange and internal politics all their own. Aghdam, as apparently misguided and unwell as she was, seems to have tried to affect those politics. Terrorism is, in any form, for any cause, spectacular violence for political ends. It seems appropriate to call Aghdam the first terrorist of a major social network.