The Tradeoffs in Google's New Crackdown on Child Pornography

With new algorithms to make it difficult to find abusive images, the search engine ends up blocking a lot of legitimate content, too.
David Cameron led an 'Internet Safety' Summit after calling for Google and Microsoft to implement new policies to limit the spread of child pornography this summer. (Reuters)

You don’t need permission to put a website online, and, once it’s there, anyone in the world can see it. Much of what makes the web incredible can also make it a harsh, unkind place: Just as photographers and musicians can find a global audience, child abusers can, too.

Today, Google and Microsoft announced they are taking technological steps to make the Internet less hospitable to child abusers. The news comes after more than 300 people were arrested worldwide last week—and 400 children rescued—in one of the largest-ever crackdowns on child pornographers. It also comes months after the U.K.’s Prime Minister David Cameron called on the search engines to do more to obstruct child pornographers.

Google has long had “zero tolerance” for child pornography. But in practice, Google’s algorithms cannot perfectly detect every piece of pornography; detection requires tradeoffs. Today’s announcements suggest that, under pressure from the British government, the search giant has opted to accept many more “false positives”: blocking some non-pornographic content in order to make child pornography harder to find.
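To make that tradeoff concrete, here is a minimal, hypothetical sketch of how a filtering threshold trades misses against false positives. Nothing here reflects Google’s actual system; the image names, scores, and threshold values are invented for illustration.

```python
# Hypothetical classifier output: (name, score, actually_abusive).
# All names, scores, and labels are invented for illustration.
scored_images = [
    ("family_photo.jpg", 0.10, False),
    ("art_nude.jpg",     0.55, False),  # legitimate but ambiguous
    ("abusive_01.jpg",   0.60, True),
    ("abusive_02.jpg",   0.92, True),
]

def evaluate(threshold: float) -> tuple[int, int]:
    """Return (abusive images caught, legitimate images wrongly blocked)."""
    caught = sum(1 for _, score, abusive in scored_images
                 if abusive and score >= threshold)
    false_positives = sum(1 for _, score, abusive in scored_images
                          if not abusive and score >= threshold)
    return caught, false_positives

for threshold in (0.9, 0.5):
    caught, blocked = evaluate(threshold)
    print(f"threshold={threshold}: caught {caught} abusive, "
          f"wrongly blocked {blocked} legitimate")
# threshold=0.9: caught 1 abusive, wrongly blocked 0 legitimate
# threshold=0.5: caught 2 abusive, wrongly blocked 1 legitimate
```

Lowering the threshold from 0.9 to 0.5 catches both abusive images, but it also blocks the ambiguous, legitimate one. That is the direction Google appears to have shifted.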

In an op-ed in the Daily Mail, Google’s chairman, Eric Schmidt, announced the company’s specific actions. The search engine has moderated search results worldwide to make it even harder to find child pornography. The company has “cleaned up” the results of more than 100,000 searches in 150 languages, he says. (Details about Microsoft’s plans remain less certain.)

Not only has Google amended its search results, Schmidt says it will now display “warnings–from both Google and charities–at the top of our search results for more than 13,000 queries.”

“These alerts make clear that child sexual abuse is illegal and offer advice on where to get help,” he writes. The search results themselves now direct users to news reports about the phenomenon, for instance, and ways to get help, rather than pornographic material itself.

The company also announced a new technology to detect pornographic material in YouTube videos. Though YouTube prohibits pornography of any kind on its site, the new algorithm is said to make such material easier for the company to find and remove.

Though prodded by Cameron, Google has in fact long worked to stop the distribution of child pornography on the Internet. In 2006, it joined a financial and technical coalition to fight the material, and in 2008 it began using “hashing” to detect known child-abuse images. This June, Google’s chief legal officer, David Drummond, described the technique in The Telegraph:

Since 2008, we have used “hashing” technology to tag known child sexual abuse images, allowing us to identify duplicate images which may exist elsewhere. Each offending image in effect gets a unique fingerprint that our computers can recognize without humans having to view them again. Recently, we have started working to incorporate these fingerprints into a cross-industry database.
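A minimal sketch of how that fingerprinting might work, assuming exact cryptographic hashes (SHA-256) and an in-memory set of known fingerprints; Google has not published its implementation, and the hash value below is a placeholder:

```python
import hashlib

# Hypothetical database of fingerprints for known abusive images.
# In practice this would be the shared, cross-industry database
# Drummond describes; the value below is a placeholder.
KNOWN_FINGERPRINTS = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def fingerprint(image_bytes: bytes) -> str:
    """Compute a unique fingerprint for an image's raw bytes.

    The hash lets machines recognize duplicates of a known image
    without a human ever having to view it again.
    """
    return hashlib.sha256(image_bytes).hexdigest()

def is_known_abusive(image_bytes: bytes) -> bool:
    """Check an image against the database of known fingerprints."""
    return fingerprint(image_bytes) in KNOWN_FINGERPRINTS
```

An exact hash like this catches only byte-identical copies; a single changed pixel yields a different fingerprint, which is why deployed systems generally rely on perceptual hashes that survive resizing and re-encoding.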

The risk of measures like the new YouTube algorithm is that the software will flag “safe” content as porn. As search engine expert Danny Sullivan wrote back in June:

The difficulty is that this approach will produce false positives. There will absolutely be images that are not child porn that will be blocked, because understanding what images really are is a tough search challenge. It’s even harder when you get into judgment calls of “art” versus “porn.”

Don’t get me wrong. I’m not trying to argue for anything that grants a loophole for actual child porn. I’m just saying that a politician thinking there’s some magic wand that can be waved is a politician doing what politicians do best, making grand statements that aren’t always easily backed up.

Indeed, while software can initially detect or flag content algorithmically, only a human being can separate benign family pictures from abusive content. According to some reports, Google employs hundreds of people to do this work, sorting through images all day and separating the lawful from the unlawful.

Those workers do the hardest work in the process—and they’re also the hardest to find.

“They're precluded from speaking to the media, and it is difficult to reach out and find them,” Sarah Roberts, a professor at Western University in Canada, told NPR’s Rebecca Hersher in an excellent report on these workers published yesterday.

“I think there's an aspect of trauma that can often go along with this work, and many workers would rather go home and tune out, not talk about it,” Roberts said.

Cameron first pressured the two search giants to suppress “child abuse content” this summer. In that speech, he called for broader censorship of Internet pornography, specifically announcing plans to block pornography by default in U.K. homes. (He later backtracked on that specific idea.) 

According to Cameron, Google and Microsoft initially resisted these suggested policies. They’ve now, obviously, accepted them.

It’s hard, on the one hand, to see how the public benefit of these policies could be outweighed by any public harm. Child pornography is the very sort of thing a government should afflict.

But, as Danny Sullivan showed this summer, googling “child porn” didn’t really turn up child abuse—it returned charities and news articles. Google’s new algorithms seem to make it easier to find commentary around child abuse while making it harder to find the thing itself, but legality—and morality—have no algorithm. Writing and art about child abuse is the very sort of thing a government might promote, yet some of it might now be blocked, mathematically labeled as pornographic. 

I’m not sure anyone goes to Google to find work like that. But Google also operates all sorts of other tools, including academic indices, and code—like law—spins off consequences its authors never intended. Today’s news should be welcomed. But the policies and technologies put in effect—and those still promoted by Cameron—should be inspected.

Robinson Meyer is an associate editor at The Atlantic, where he covers technology.
