Facebook Workers Try to Spend Less Than 1 Second Determining Whether Content Is 'Appropriate'


Emily Bazelon's deeply reported piece on bullying at Woodrow Wilson Middle School in Middletown, Connecticut, is full of important information. But for me, the most telling moment occurred when she traveled to Facebook headquarters to see how they dealt with so-called "third party" reports of inappropriate content.

Once there, she found someone who scans through the requests, and she asked him how long he might spend deciding whether a page should stay up or come down. In the section below, he tells her that they "optimize for half a second." Half a second! As it happens, in this story, a bullying Facebook group at Woodrow Wilson was incorrectly labeled as appropriate twice, even though it hadn't been organized under a real name, which put it clearly in violation of Facebook's terms of service, among other problems. The mislabeling by two reps also meant that any further requests to take the page down would be ignored -- and no one making those requests, like, say, the people working at the school, would be informed that they were being ignored.

Facebook's come up with some remarkable tools for managing conflict on its site. They've pioneered efforts to identify suicidal users. And they've got a really, really tough problem on their hands when it comes to kids and bullying. Middle- and high-schoolers are all on Facebook and that means all their drama is on Facebook, too. (I should also note that no other social network handles these issues particularly well, either.)

But Facebook purports to be a safer, real-namey Internet. That's part of the pitch, right? And they claim to have ways of handling problems like this, which serves as a defense against the suggestion that perhaps a government agency should try to regulate them, especially around minors' use of the service. Those methods, though, are almost always opaque. Facebook claims everything is working well, and we have to take the company's word for it, because there are no independent audits.

And that's why Bazelon's account from inside Facebook is so important. She got to see the tools and management practices at work. And what she saw dismays me. Facebook could clearly provide better customer service, but they don't want to. Why not? It costs money. Even when they do hire people -- and they clearly don't have enough of them -- they force those people to work at a pace that ensures mistakes. It strikes me that they're optimizing not for actual responsiveness to real concerns but for the appearance of responsiveness to real concerns.

This probably isn't surprising. This is how businesses work. Facebook itself recognizes that this is a risk -- and what addressing it might cost. Here's one of the disclosures in the company's annual report to the SEC:

We have in the past experienced, and we expect that in the future we will continue to experience, media, legislative, or regulatory scrutiny of our decisions regarding user privacy or other issues, which may adversely affect our reputation and brand. We also may fail to provide adequate customer service, which could erode confidence in our brand. Our brand may also be negatively affected by the actions of users that are deemed to be hostile or inappropriate to other users, or by users acting under false or inauthentic identities. Maintaining and enhancing our brand may require us to make substantial investments and these investments may not be successful. If we fail to successfully promote and maintain the Facebook brand or if we incur excessive expenses in this effort, our business and financial results may be adversely affected.

The bet Facebook is making is this: if they give their reviewers half a second to look at each page, they'll catch most baldly inappropriate content. Sure, they'll miss some, but that's good enough to keep users on the platform and to keep operating costs low enough for investors.
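To get a feel for what that pace implies, do the back-of-envelope arithmetic: at half a second per report, a single reviewer could theoretically clear 7,200 reports an hour, or well over 50,000 in a full workday. Facebook doesn't publish its real throughput or staffing numbers, so those figures are only illustrative -- but they make plain that the half-second target is a staffing decision as much as a design one.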

That's reasonable, at least until Facebook's consumers demand more accountability from the company.

The whole story should serve as a reminder that when we talk about "online bullying," we need to specify where that bullying is occurring and identify the actors involved. Decrying "the Internet" does little. Identifying the people who are responsible for reviewing bullying pages on Facebook -- and the processes that limit their effectiveness -- could do a lot.

Here's the full anecdote from Bazelon's piece:

Sullivan cycled through the complaints with striking speed, deciding with very little deliberation which posts and pictures came down, which stayed up, and what other action, if any, to take. I asked him whether he would ever spend, say, 10 minutes on a particularly vexing report, and Willner raised his eyebrows. "We optimize for half a second," he said. "Your average decision time is a second or two, so 30 seconds would be a really long time." (A Facebook spokesperson said later that the User Operations teams use a process optimized for accuracy, not speed.) That reminded me of Let's Start Drama. Six months after Carbonella sent his reports, the page was still up. I asked why. It hadn't been set up with the user's real name, so wasn't it clearly in violation of Facebook's rules?

After a quick search by Sullivan, the blurry photos I'd seen many times at the top of the Let's Start Drama page appeared on the screen. Sullivan scrolled through some recent "Who's hotter?" comparisons and clicked on the behind-the-scenes history of the page, which the Common Review Tool allowed him to call up. A window opened on the right side of the screen, showing that multiple reports had been made. Sullivan checked to see whether the reports had failed to indicate that Let's Start Drama was administered by a fake user profile. But that wasn't the problem: the bubbles had been clicked correctly. Yet next to this history was a note indicating that future reports about the content would be ignored.

We sat and stared at the screen.

Willner broke the silence. "Someone made a mistake," he said. "This profile should have been disabled." He leaned in and peered at the screen. "Actually, two different reps made the same mistake, two different times."

There was another long pause. Sullivan clicked on Let's Start Drama to delete it.
