What Does Google's Subtle Censorship Say About Us?

The latest news from Google is that searches related to online piracy through torrent downloads are being dropped from the much-touted 'autocomplete' and 'instant search' functions. As we have noted in the past, the notion of Google shaping or even suggesting searches is a strange concept on its own. And now terms like 'BitTorrent' and 'Vodo' will no longer show up automatically in your search bar. Yet the filter seems somewhat arbitrary: an autocomplete search for 'how to pirate music' still yields the large torrent site the Pirate Bay, and as plenty of critics have pointed out, a significant number of torrent downloads are perfectly legal. What gives? Did Google buckle to pressure from the MPAA, RIAA, and other entertainment-industry interests?

Even if we were to pretend that all torrent downloads were illegal, Google's blocking raises some interesting questions about its relationship with potentially criminal activities. Last I checked, making an explosive is a pretty serious crime, yet type 'how to make a bomb' in the search bar and Google suggests 'out of household items' to complete the phrase. Write 'where to buy drugs' and 'where to buy crack in D.C.' is the instant result. Enter 'how to kill a person' and 'and get away with it' is what Google recommends. Gosh, it's really swell of Google to do its part to shut down all of the menacing downloading out there! I'm all for the freedom of potentially scandalous, even illegal information, but shouldn't the filtering be consistent? Autocomplete has even blocked the phrase 'Google and crime.'

Or what about autocomplete's questionable assistance with what may be legal, but still offensive terms? Type in 'Asians have' and autocomplete is right there with 'no souls.' Try 'Jews have' instead and 'horns' is the result that the search giant recommends. Enter 'Black people are' and Google spits out 'lazy.' And why do we get help with 'sexual predators' but not 'sexual positions'? There are no obvious answers.

Google's autocomplete is known to filter out searches for things like curse words, pornographic terms, and a handful of potentially illegal activities, and what the company does and doesn't filter may simply reflect its internal biases and decisions. But if, as the leader in digital queries, Google's search is more a reflection of ourselves and the times we live in, we might want to wonder why it's easier to get information about 'giving drugs' (you) 'to a minor' (Google) than 'giving a massage' (Google goes blank). Is the mirror broken, or is it us, darling?

Eli Rosenberg is a writer in Washington, D.C. whose work has appeared on Esquire.com, Salon, and The Onion's A.V. Club.
