In an op-ed in the Daily Mail, Google’s chairman, Eric Schmidt, announced the company’s specific actions. The search engine has moderated search results worldwide to make it even harder to find child pornography. The company has “cleaned up” the results of more than 100,000 searches in 150 languages, he says. (Details about Microsoft’s plans remain less certain.)
Not only has Google amended its search results; Schmidt says it will also display “warnings – from both Google and charities – at the top of our search results for more than 13,000 queries.”
“These alerts make clear that child sexual abuse is illegal and offer advice on where to get help,” he writes. The search results themselves now direct users to news reports about the phenomenon and to resources for getting help, rather than to the pornographic material itself.
The company also announced a new technology to detect pornographic material in YouTube videos. Though YouTube prohibits pornography of any kind on its site, the new algorithm is said to make that prohibited material easier for the company to find.
Though prodded by Cameron, Google has in fact long worked to stop the distribution of child pornography on the Internet. In 2006, it joined a financial and technical coalition to fight the material, and in 2008 it began using “hashing” to detect known child-pornography images. This June, Google’s chief legal officer, David Drummond, described the technique in the British newspaper The Telegraph:
Since 2008, we have used “hashing” technology to tag known child sexual abuse images, allowing us to identify duplicate images which may exist elsewhere. Each offending image in effect gets a unique fingerprint that our computers can recognize without humans having to view them again. Recently, we have started working to incorporate these fingerprints into a cross-industry database.
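Drummond’s description maps onto a simple pattern: compute a fingerprint for each image, then look it up in a database of fingerprints of known abusive material. The sketch below is a minimal illustration in Python, not Google’s actual system; it uses a SHA-256 digest of the raw file bytes as the fingerprint, and the KNOWN_BAD_HASHES set is a hypothetical stand-in for the cross-industry database Drummond mentions. (Production systems rely on perceptual hashes, such as Microsoft’s PhotoDNA, that survive resizing and re-encoding, which a plain cryptographic digest does not.)

```python
import hashlib
from pathlib import Path

# Hypothetical database of fingerprints of known abusive images,
# standing in for the cross-industry database (assumption for illustration).
KNOWN_BAD_HASHES = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

def fingerprint(path: Path) -> str:
    """Compute a SHA-256 digest of the raw file bytes.

    This stands in for the 'unique fingerprint' Drummond describes;
    real systems use perceptual hashes robust to re-encoding.
    """
    return hashlib.sha256(path.read_bytes()).hexdigest()

def is_known_abusive(path: Path) -> bool:
    """Flag a file whose fingerprint matches the known-bad database,
    so duplicates are caught without a human viewing the image again."""
    return fingerprint(path) in KNOWN_BAD_HASHES
```

The key property, as Drummond notes, is that matching happens on fingerprints alone: once an image has been tagged, its duplicates can be flagged automatically, with no one having to view it again.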
The risk of measures like the new YouTube algorithm is that the software will misfile “safe” content as porn. As search engine expert Danny Sullivan wrote back in June:
The difficulty is that this approach will produce false positives. There will absolutely be images that are not child porn that will be blocked, because understanding what images really are is a tough search challenge. It’s even harder when you get into judgment calls of “art” versus “porn.”
Don’t get me wrong. I’m not trying to argue for anything that grants a loophole for actual child porn. I’m just saying that a politician thinking there’s some magic wand that can be waved is a politician doing what politicians do best, making grand statements that aren’t always easily backed up.
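Sullivan’s worry is a general property of any scoring classifier, not a quirk of Google’s tool. The toy sketch below (hypothetical scores and labels, purely for illustration) shows the tradeoff: lowering the detection threshold catches more genuinely abusive images but also misfiles more safe ones, and no threshold eliminates both kinds of error.

```python
# Toy illustration of the false-positive tradeoff Sullivan describes.
# Scores are hypothetical classifier outputs in [0, 1]; labels mark
# which images are actually abusive (1) or safe (0).
samples = [
    (0.95, 1), (0.90, 1), (0.80, 0),  # a safe image scored high: "art" vs. "porn"
    (0.60, 1), (0.40, 0), (0.10, 0),
]

def rates(threshold: float) -> tuple[float, float]:
    """Return (true-positive rate, false-positive rate) at a given threshold."""
    tp = sum(1 for score, label in samples if score >= threshold and label == 1)
    fp = sum(1 for score, label in samples if score >= threshold and label == 0)
    pos = sum(label for _, label in samples)
    neg = len(samples) - pos
    return tp / pos, fp / neg

for t in (0.9, 0.7, 0.5):
    tpr, fpr = rates(t)
    print(f"threshold={t}: catches {tpr:.0%} of abuse, misfiles {fpr:.0%} of safe images")
```

At a strict threshold the safe image is spared but some abuse slips through; at a lax one everything abusive is caught, and the safe image is blocked along with it.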
Indeed, while content can be detected and flagged algorithmically, only a human being can separate benign family pictures from abusive material. According to some reports, Google employs hundreds of people to do this work, sorting through images all day and separating the lawful from the unlawful.
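That division of labor can be summarized as a two-stage pipeline: the algorithm only nominates candidates, and a person makes the final call. The sketch below is a hypothetical structure, assuming a simple review queue; it is not a description of Google’s internal tooling.

```python
from collections import deque
from typing import Callable

review_queue: deque[str] = deque()

def algorithmic_triage(image_id: str, score: float, threshold: float = 0.7) -> None:
    """Stage 1: the algorithm only nominates candidates for review;
    it never classifies anything as unlawful on its own."""
    if score >= threshold:
        review_queue.append(image_id)

def human_review(decide: Callable[[str], bool]) -> dict[str, bool]:
    """Stage 2: a human reviewer makes the lawful/unlawful call on every
    flagged image; `decide` stands in for that human judgment."""
    return {image_id: decide(image_id) for image_id in review_queue}

# A family photo scored 0.75 gets queued, then cleared by the reviewer.
algorithmic_triage("IMG_1234", score=0.75)
results = human_review(lambda image_id: False)  # reviewer: lawful
```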