What Does Google's Subtle Censorship Say About Us?

The latest news from Google is that searches related to online piracy through torrent downloads are being dropped from the much-touted 'autocomplete' and 'instant search' functions. As we have noted in the past, the notion of Google shaping or even suggesting searches is a strange concept on its own. Now terms like 'BitTorrent' and 'Vodo' will no longer show up automatically in your search bar. Yet the filter seems somewhat arbitrary: an autocomplete search for 'how to pirate music' still yields the large torrent site The Pirate Bay, and as plenty of critics have pointed out, a significant number of torrent downloads are perfectly legal. What gives? Did Google buckle to pressure from the MPAA, RIAA, and other entertainment industry interests?

Even if we were to pretend that all torrent downloads were illegal, Google's blocking raises some interesting questions about its relationship with potentially criminal activity. Last I checked, making an explosive is a pretty serious crime, but when we type 'how to make a bomb' in the search bar, Google suggests 'out of household items' to complete the phrase. Write 'where to buy drugs' and 'where to buy crack in D.C.' is the instant result. Enter 'how to kill a person' and 'and get away with it' is what Google recommends. Gosh, it's really swell of Google to do its part to shut down all of the menacing downloading out there! I'm all for the freedom of potentially scandalous, even illegal information, but shouldn't the filtering at least be consistent? Autocomplete has even blocked the phrase 'Google and crime.'

Or what about autocomplete's questionable assistance with terms that are legal but still offensive? Type in 'Asians have' and autocomplete is right there with 'no souls.' Try 'Jews have' and 'horns' is the result the search giant recommends. Enter 'Black people are' and Google spits out 'lazy.' And why do we get help with 'sexual predators' but not 'sexual positions'? There are no obvious answers.

Google's autocomplete is known to filter out searches for things like curse words, pornographic terms, and a handful of potentially illegal activities, and what it does and doesn't filter may simply reflect the company's own biases and decisions. But if, as the leader in digital queries, Google's search is more a reflection of ourselves and the times we live in, we might want to wonder why it's easier to get information about 'giving drugs' (you) 'to a minor' (Google) than 'giving a massage' (Google goes blank). Is the mirror broken, or is it us, darling?


Eli Rosenberg is a writer in Washington, D.C. whose work has appeared on Esquire.com, Salon, and The Onion's A.V. Club.
