The Tradeoffs in Google's New Crackdown on Child Pornography

With new algorithms to make it difficult to find abusive images, the search engine ends up blocking a lot of legitimate content, too.
David Cameron led an 'Internet Safety' summit after calling this summer for Google and Microsoft to implement new policies to limit the spread of child pornography. (Reuters)

You don’t need permission to put a website online, and, once it’s there, anyone in the world can see it. Much of what makes the web incredible can also make it a harsh, unkind place: Just as photographers and musicians can find a global audience, child abusers can, too.

Today, Google and Microsoft announced they are taking technological steps to make the Internet less hospitable to child abusers. The news comes after more than 300 people were arrested worldwide last week—and 400 children rescued—in one of the largest-ever crackdowns on child pornographers. It also comes months after the U.K.’s Prime Minister David Cameron called on the search engines to do more to obstruct child pornographers.

Google has long had “zero tolerance” for child pornography. But in practice, its algorithms cannot perfectly detect every piece of it, so enforcement requires tradeoffs. Today’s announcements suggest that, under pressure from the British government, the search giant has opted to accept many more “false positives”—blocking some non-pornographic content in order to make it harder to find child pornography.

In an op-ed in the Daily Mail, Google’s chairman, Eric Schmidt, announced the company’s specific actions. The search engine has moderated search results worldwide to make it even harder to find child pornography. The company has “cleaned up” the results of more than 100,000 searches in 150 languages, he says. (Details about Microsoft’s plans remain less certain.)

Not only has Google amended its search results; Schmidt says it will also now display “warnings–from both Google and charities–at the top of our search results for more than 13,000 queries.”

“These alerts make clear that child sexual abuse is illegal and offer advice on where to get help,” he writes. The search results themselves now direct users to news reports about the phenomenon, for instance, and ways to get help, rather than pornographic material itself.

The company also announced a new technology to detect pornographic material in YouTube videos. Though YouTube prohibits any kind of pornography on its site, the new algorithm is meant to make such material easier for the company to find and remove.

Though prodded by Cameron, Google has in fact long worked to stop the distribution of child pornography on the Internet. In 2006, it joined a financial and technical coalition to fight the material, and in 2008, it began using “hashing” to detect known child-abuse images. This June, Google’s chief legal officer, David Drummond, described the technique in the British Telegraph:

Since 2008, we have used “hashing” technology to tag known child sexual abuse images, allowing us to identify duplicate images which may exist elsewhere. Each offending image in effect gets a unique fingerprint that our computers can recognize without humans having to view them again. Recently, we have started working to incorporate these fingerprints into a cross-industry database.
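The idea Drummond describes can be illustrated with a short sketch. Google has not published the details of its system, which reportedly uses perceptual fingerprints robust to resizing and re-encoding; the minimal Python example below assumes only a plain cryptographic hash, which catches exact byte-for-byte copies of known images, and all of its names (known_fingerprints, register_known_image, is_known_duplicate) are hypothetical.

```python
import hashlib

# Hypothetical database of fingerprints for images that human reviewers
# have already confirmed as abusive.
known_fingerprints = set()


def fingerprint(image_bytes: bytes) -> str:
    """Return a fixed-length fingerprint for an image's raw bytes.

    A cryptographic hash such as SHA-256 matches only exact duplicates;
    production systems use perceptual hashes that survive resizing and
    re-compression.
    """
    return hashlib.sha256(image_bytes).hexdigest()


def register_known_image(image_bytes: bytes) -> None:
    """Record a confirmed image's fingerprint so later copies are caught."""
    known_fingerprints.add(fingerprint(image_bytes))


def is_known_duplicate(image_bytes: bytes) -> bool:
    """Check an uploaded image against the database without a human
    having to view it again."""
    return fingerprint(image_bytes) in known_fingerprints
```

The design goal is the one Drummond outlines: once an image has been reviewed and fingerprinted, every later copy can be flagged automatically, and the fingerprints, rather than the images themselves, can be shared in a cross-industry database.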

The risk of measures like the new YouTube algorithm is that the software will flag “safe” results as porn. As search engine expert Danny Sullivan wrote back in June:

The difficulty is that this approach will produce false positives. There will absolutely be images that are not child porn that will be blocked, because understanding what images really are is a tough search challenge. It’s even harder when you get into judgment calls of “art” versus “porn.”

Don’t get me wrong. I’m not trying to argue for anything that grants a loophole for actual child porn. I’m just saying that a politician thinking there’s some magic wand that can be waved is a politician doing what politicians do best, making grand statements that aren’t always easily backed up.

Indeed, while Google can initially detect or flag content via algorithm, only a human being can separate benign family pictures from abusive content. According to some reports, Google employs hundreds of people to do this work, sorting through images all day and separating the lawful from the unlawful.

Robinson Meyer is an associate editor at The Atlantic, where he covers technology.
