What to Make of Google's Decision to Block the 'Innocence of Muslims' Movie

The attacks on U.S. missions abroad this week have been a test for Google's "bias in favor of free expression."

The inside of the U.S. consulate in Benghazi following the attack earlier this week. (Reuters)

Wednesday morning must have been a nightmare for the people who work at YouTube. Late the night before, angry demonstrators had attacked the U.S. missions in Cairo and Benghazi, killing four Americans, purportedly provoked by an American-made video that vilified and mocked Muhammad. That video, like pretty much all videos these days, was available on YouTube, a site where Google (which owns YouTube) has the power to block access to content on a country-by-country basis. By midday Wednesday, the company had decided to do just that.

A YouTube spokesperson explained via email:

We work hard to create a community everyone can enjoy and which also enables people to express different opinions. This can be a challenge because what's OK in one country can be offensive elsewhere. This video -- which is widely available on the Web -- is clearly within our guidelines and so will stay on YouTube. However, given the very difficult situation in Libya and Egypt we have temporarily restricted access in both countries. Our hearts are with the families of the people murdered in Tuesday's attack in Libya.

YouTube is in a tough spot here. It certainly doesn't want to play any part, even an indirect one, in fueling violence that has already resulted in four American deaths. But censoring the video also cuts against Google's stated ideology, which has a "bias in favor of free expression -- not just because it's a key tenet of free societies, but also because more information generally means more choice, more power, more economic opportunity and more freedom for people." Google's top leaders have championed the power of the Internet to make society more free by making the Internet more free, and the company has been a vocal and constant critic of China's efforts to control what people do and say online. In certain instances, Google has prominently defied a government's request to remove content, such as when it protected videos documenting police brutality here in the United States.

This is not to say that Google is absolutist about free expression. Quite the contrary: Google has made a point of its position that it takes takedown requests very seriously, and has released a Transparency Report every six months over the past two years, detailing just how much content it has removed and where. Google has a careful but somewhat opaque process by which it determines whether to comply with a government's request, taking into account a country's local laws (e.g., laws prohibiting pro-Nazi content in Germany) and whether the request is appropriately narrow, as Google analyst Dorothy Chou explained to me earlier this year. Additionally, YouTube has a pretty reasonable set of "Community Guidelines" that prohibit sexually explicit content, "bad stuff like animal abuse, drug abuse, under-age drinking and smoking, or bomb making," and hate speech.

So content removal is nothing new to Google, and it works hard and throws a lot of people at managing takedown requests. But, even considering that context -- or, perhaps, especially considering that context -- Google's decision to block access to the "Innocence of Muslims" video is unprecedented. Even by Google's own assessment, the video is "clearly within our guidelines" -- meaning it is not hate speech and did not otherwise violate the website's terms of service. (An update from a YouTube spokesperson by email late today added that the video additionally had been blocked in India and Indonesia, where it was, in fact, in violation of local law.)

Why did Google make such an extraordinary decision? Perhaps it felt somehow culpable for the deaths or feared that if the video spread further, more violence would come; perhaps -- as the Los Angeles Times thinly suggested -- it felt pressured to do so by Obama administration officials. Google, for its part, won't say, and we are left guessing. (I have twice criticized Google's Transparency Report for, ironically, not being very transparent on this key question of how Google makes decisions regarding what content to remove.)
