Facebook Tries to Find the Right System for Flagging Suicidal Behavior

You're on Facebook one day when you notice that an acquaintance -- not someone really close, but a person with whom you're friendly -- posts a status update that seems despondent. Something like, "Man, life doesn't seem worth it. I can't take it." You look for an explanation on the person's profile, wonder if it's some kind of inside joke. But it dawns on you that it might be an honest expression of emotional pain, perhaps a cry for help.

What do you do?

It's a difficult social problem. It's not like you're a close friend of the person and would feel comfortable asking him to pour his heart out to you. Maybe you've only met him once. You very well might do nothing.

Facebook is trying to offer a new avenue for resolving this dilemma of the digital world. The site has added the ability to anonymously flag a person who might be suicidal. This is a delicate piece of user interaction design, obviously. On the one hand, Facebook wants people to be able to report genuine suicidal behavior, but it also doesn't want to create an obvious target for people looking to make mischief. Where the reporting mechanism is placed, as well as the behind-the-scenes processes for handling user reports, could have very real consequences.

Let me spell out the compromise Facebook has come to. I think it is debatable, but there probably is no perfect answer here. It's just strange to find ourselves seeing expressions of suicidal ideation from people we don't know well.

The suicidal behavior reporting button is located within the normal mechanism for reporting questionable content. But it's *not* on the first menu of options, as you can see in the screenshots below. Instead, you have to mark something "Violence or harmful behavior" before you see the option to report "Suicidal Content." This seems suboptimal to me, as I wouldn't think to put suicidal behavior into that category. A Facebook spokesperson told me, "We have been, and will continue to, work with the suicide prevention community and iterate on the placement of the Suicidal Content button."
[Image: Suicidemenus.jpg, screenshots of Facebook's content-reporting menus]

Nonetheless, after someone makes this kind of report, they receive a follow-up email from an actual human being to whom they can respond. It reads like this:

We will do our best to assist you with this matter. Please describe the problem you are experiencing with Facebook in as much detail as possible and include any relevant web addresses (URLs). More detailed information will help us investigate the issue further.

Thanks for contacting Facebook,

[Name]

According to Facebook, the company has internal "systems to prioritize the most serious reports, and a trained team of reviewers who respond to reports and escalate them." That's good, because on this issue it doesn't seem like an algorithm could make the subtle discriminations necessary to offer people the kind of help they need while filtering out pranks.

If the reviewers find that the person reported is exhibiting suicidal behavior, that person will be offered a private chat session with someone from the National Suicide Prevention Lifeline, as well as the organization's phone number.
