Google, the company that once declared "No humans were harmed or even used in the creation of this page," is now soliciting a human touch when necessary. Google Places, for example, crowdsources information about the locations on its service. "Because we can’t be on the ground in every city and town, we enable our great community of users to let us know when something needs to be updated," explains an official blog post. In this case, enlisting humans to do the work sounds peachy. That is, until it goes all wrong. "In recent months, plenty of perfectly healthy businesses across the country have expired--sometimes for hours, other times for weeks--though only in the online realm cataloged and curated by Google," writes The New York Times's David Segal. "The reason is that it is surprisingly easy to report a business as closed in Google Places, the search giant’s version of the local Yellow Pages." Seeking input from the masses might seem like a good idea, but at this scale, with so much information and so many people, it doesn't quite work.
It's not that crowdsourcing never works, but for companies with such large user bases, things can get messy. Smaller search engines Bing and Yahoo "are the scene of far less mischief," but the social media giant Facebook has run into similar issues. Earlier this year, Facebook removed a photo of a fully clothed gay couple kissing, sending the following statement to the posters: "Shares that contain nudity, or any kind of graphic or sexually suggestive content, are not permitted on Facebook." The photo showed not even a bare elbow--the flagging had been a mistake, as Facebook later conceded, a result of human input. As in the Places debacle, Facebook relies on its millions of users to flag inappropriate content and then acts accordingly. It doesn't always work.
Since anyone can flag anything, these crowdsourcing practices can facilitate malicious behavior. In the case of Places, a competitor has an incentive to report a nearby business as closed, Segal explains. "But like any open system, this one can be abused. Search engine consultants say that 'closing' a business on Google has become an increasingly common tactic among unscrupulous competitors." Facebook posters have had similar experiences. Though flagged content may not have cost anyone business, reporting comments can get out of hand, as user Trip Affleck noted on a Facebook message board: "Excuse me, Facebook? why are you allowing every random douchebag on the planet to "Flag" my comments on my Friends' posts? if my Friends have a problem with the comment, they can Delete it...if they don't have a problem with my comment, i don't effing care if Milicent from Mosquito Wing, MO is offended by it and feels the need to "Flag" it!" And as we saw with the kissing couple, those who don't morally or politically agree with a group or action wield a lot of power.
Of course, these companies understand that with any human input there's bound to be error--they just haven't accounted for those errors well enough. Google anticipated some false reporting with a "not true" button, but the effort wasn't enough, Segal continues. "Other owners ... say that the button doesn’t work, or that it takes a week to have any effect. Still others say that immediately after clicking the 'not true' button, their business is immediately 'closed' again." Given that crowdsourcing opens the process up to the world, Google should have built better responses to this kind of abuse. Facebook does not remove every flagged picture, as the company explains in its policy, but maybe that's not enough, argues Richard Metzger, the man who posted the kissing men. "There shouldn't be a human being making that determination," he said, as we reported.
Google says it's working on fixing the issue--"That being said, we apologize to both business owners and users for any frustration this recent issue of spam labeling has caused, and we’re committed to making sure that users and potential customers continue to have the most up-to-date and accurate information possible"--which perhaps means limiting user influence just a tad.
This article is from the archive of our partner The Wire.