The latest news from Google is that searches related to online piracy through torrent downloads are being dropped from the much-touted 'autocomplete' and 'instant search' functions. As we have noted in the past, the notion of Google shaping or even suggesting searches is a strange concept on its own. Now terms like 'BitTorrent' and 'Vodo' will no longer show up automatically in your search bar. Yet the filter seems somewhat arbitrary: an autocomplete search for 'how to pirate music' yields the large torrent site the Pirate Bay, and as plenty of critics have pointed out, a significant number of torrent downloads are perfectly legal. What gives? Did Google buckle to pressure from the MPAA, RIAA, and other entertainment industry interests?
Even if we were to pretend that all torrent downloads were illegal, Google's blocking raises some interesting questions about its relationship with potentially criminal activities. Last I checked, making an explosive is a pretty serious crime, but when we type 'how to make a bomb' in the search bar, Google suggests 'out of household items' to complete the phrase. Type 'where to buy drugs' and 'where to buy crack in D.C.' is the instant result. Enter 'how to kill a person' and 'and get away with it' is what Google recommends. Gosh, it's really swell of Google to do its part to shut down all of the menacing downloading out there! I'm all for the freedom of potentially scandalous, even illegal information, but shouldn't it be consistent? Autocomplete has even blocked the phrase 'Google and crime.'
Or what about autocomplete's questionable assistance with terms that may be legal but are still offensive? Type in 'Asians have' and autocomplete is right there with 'no souls.' Try 'Jews have' instead and 'horns' is the result that the search giant recommends. Enter 'Black people are' and Google spits out 'lazy.' And why do we get help with 'sexual predators' but not 'sexual positions'? There are no obvious answers.
It's a known fact that Google's autocomplete filters out searches for things like curse words, pornographic terms, and a handful of potentially illegal activities. And what it does and doesn't filter could simply be a result of various company biases and decisions. But if, as the leader in digital queries, Google's search is more a reflection of ourselves and the times we live in, we might want to wonder why it's easier to get information about 'giving drugs' (you) 'to a minor' (Google) than 'giving a massage' (Google goes blank). Is the mirror broken, or is it us, darling?
- The Atlantic Wire: "Is Google Censoring the Search Function?"