What Does Google's Subtle Censorship Say About Us?

The latest news from Google is that searches related to online piracy through torrent downloads are being dropped from the much-touted 'autocomplete' and 'instant search' functions. As we have noted in the past, the notion of Google shaping or even suggesting searches is a strange concept on its own. And now terms like 'BitTorrent' and 'Vodo' will no longer show up automatically in your search bar. Yet the filter seems somewhat arbitrary: an autocomplete search for 'how to pirate music' still yields the large torrent site the Pirate Bay, and as plenty of critics have pointed out, a significant number of torrent downloads are perfectly legal. What gives? Did Google buckle to pressure from the MPAA, RIAA, and other entertainment industry interests?

Even if we were to pretend that all torrent downloads were illegal, Google's blocking raises some interesting questions about its relationship with potentially criminal activities. Last I checked, making an explosive is a pretty serious crime, but when we type 'how to make a bomb' in the search bar, Google suggests 'out of household items' to complete the phrase. Write 'where to buy drugs' and 'where to buy crack in D.C.' is the instant result. Enter 'how to kill a person' and 'and get away with it' is what Google recommends. Gosh, it's really swell of Google to do its part to shut down all of the menacing downloading out there! I'm all for the freedom of potentially scandalous, even illegal information, but shouldn't it be consistent? Autocomplete has even blocked the phrase 'Google and crime.'

Or what about autocomplete's questionable assistance with terms that may be legal but are still offensive? Type in 'Asians have' and autocomplete is right there with 'no souls.' Try 'Jews have' instead and 'horns' is the result that the search giant recommends. Enter 'Black people are' and Google spits out 'lazy.' And why do we get help with 'sexual predators' but not 'sexual positions'? There are no obvious answers.

It's well known that Google's autocomplete filters out searches for things like curse words, pornographic terms, and a handful of potentially illegal activities. And what it does and doesn't filter could simply be a result of various company biases and decisions. But if, as the leader in digital queries, Google's search is more a reflection of ourselves and the times we live in, we might want to wonder why it's easier to get information about 'giving drugs' (you) 'to a minor' (Google) than 'giving a massage' (Google goes blank). Is the mirror broken, or is it us, darling?


Eli Rosenberg is a writer in Washington, D.C. whose work has appeared on Esquire.com, Salon, and The Onion's A.V. Club.


