Facebook Tries to Find the Right System for Flagging Suicidal Behavior

You're on Facebook one day when you notice that an acquaintance -- not someone really close, but a person with whom you're friendly -- posts a status update that seems despondent. Something like, "Man, life doesn't seem worth it. I can't take it." You look for an explanation on the person's profile and wonder if it's some kind of inside joke. But it dawns on you that it might be an honest expression of emotional pain, perhaps a cry for help.

What do you do?

It's a difficult social problem. It's not like you're a close friend of the person and would feel comfortable asking him to pour his heart out to you. Maybe you've only met him once. You very well might do nothing.

Facebook is trying to offer a new way to handle this dilemma of the digital world: they have added the ability to anonymously flag someone who might be suicidal. This is a very delicate piece of user-interaction design, obviously. On the one hand, Facebook wants people to be able to report real suicidal behavior, but they also don't want to create an obvious target for people looking to make mischief. Where they place the reporting mechanism, as well as the behind-the-scenes processes for dealing with user reports, could have very real consequences.

Let me spell out the compromise Facebook has come to. I think it is debatable, but there is probably no perfect answer here. It's just strange to find ourselves seeing expressions of suicidal ideation from people we don't know well.

The suicidal-behavior reporting button is located within the normal mechanism for reporting questionable content. But it's *not* on the first menu of options, as the image below shows. Instead, you have to mark something as "Violence or harmful behavior" before you see the option to report "Suicidal Content." That seems suboptimal to me; I wouldn't think to put suicidal behavior into that category. A Facebook spokesperson told me, "We have been, and will continue to, work with the suicide prevention community and iterate on the placement of the Suicidal Content button."
[Image: Suicidemenus.jpg -- Facebook's reporting menus, showing the path to the "Suicidal Content" option]

Nonetheless, after someone makes this kind of report, they receive a follow-up email from an actual human being to whom they can respond. It reads like this:

We will do our best to assist you with this matter. Please describe the problem you are experiencing with Facebook in as much detail as possible and include any relevant web addresses (URLs). More detailed information will help us investigate the issue further.

Thanks for contacting Facebook,

[Name]

According to Facebook, they have internal "systems to prioritize the most serious reports, and a trained team of reviewers who respond to reports and escalate them." That's good, because on this issue it doesn't seem like an algorithm could make the subtle discriminations necessary to offer people the kind of help they need while filtering out pranks.

If that internal review finds that the reported person is exhibiting suicidal behavior, that person will be offered a private chat session with someone from the National Suicide Prevention Lifeline, as well as the organization's phone number.

Alexis C. Madrigal
