Facebook Tries to Find the Right System for Flagging Suicidal Behavior
You're on Facebook one day when you notice that an acquaintance -- not someone really close, but a person with whom you're friendly -- posts a status update that seems despondent. Something like, "Man, life doesn't seem worth it. I can't take it." You look for an explanation on the person's profile and wonder if it's some kind of inside joke. But it dawns on you that it might be an honest expression of emotional pain, perhaps a cry for help.
What do you do?
It's a difficult social problem. It's not like you're a close friend of the person and would feel comfortable asking him to pour his heart out to you. Maybe you've only met him once. You very well might do nothing.
Facebook is trying to offer a new avenue for handling this dilemma of the digital world. They have added the ability to anonymously flag a person who might be suicidal. This is obviously a delicate piece of user interaction design. On the one hand, Facebook wants people to be able to report genuine suicidal behavior, but they also don't want to create an obvious target for pranksters. Where they place the reporting mechanism, as well as the behind-the-scenes processes for dealing with user reports, could have very real consequences.
Let me spell out the compromise Facebook has come to. I think it is debatable, but there probably is no perfect answer here. It's just strange to find ourselves in the position of seeing expressions of suicidal ideation from people we don't know well.

Nonetheless, after someone makes this kind of report, they receive a follow-up email from an actual human being to whom they can respond. It reads like this:
We will do our best to assist you with this matter. Please describe the problem you are experiencing with Facebook in as much detail as possible and include any relevant web addresses (URLs). More detailed information will help us investigate the issue further.
Thanks for contacting Facebook,
[Name]
According to Facebook, they have internal "systems to prioritize the most serious reports, and a trained team of reviewers who respond to reports and escalate them." That's good, because on this issue it doesn't seem like an algorithm could make the subtle discriminations necessary to offer people the kind of help they need while filtering out pranks.
If the internal infrastructure finds that the person reported is exhibiting suicidal behavior, they'll be offered a private chat session with someone from the National Suicide Prevention Lifeline, as well as the organization's phone number.
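To make that flow a little more concrete, here is a minimal sketch of how a report pipeline like the one Facebook describes might be structured: automated prioritization up front, a human reviewer making the actual judgment, and an escalation step that offers the Lifeline chat. Everything here -- the class names, the priority scoring, the review logic -- is my own assumption for illustration; Facebook has not published its actual implementation.

```python
from dataclasses import dataclass, field
from enum import Enum
import heapq


class Outcome(Enum):
    NO_ACTION = "no_action"            # e.g. a prank or misunderstanding
    OFFER_LIFELINE = "offer_lifeline"  # offer the private chat and phone number


@dataclass(order=True)
class Report:
    # Lower number = more urgent; heapq pops the smallest item first.
    priority: int
    post_url: str = field(compare=False)
    reporter_note: str = field(compare=False)


class ReportQueue:
    """Hypothetical priority queue: the most serious reports get reviewed first."""

    def __init__(self):
        self._heap = []

    def submit(self, report: Report):
        heapq.heappush(self._heap, report)

    def next_for_review(self) -> Report:
        return heapq.heappop(self._heap)


def human_review(report: Report) -> Outcome:
    """Stand-in for the trained reviewer's judgment call.

    A real reviewer reads the post in context; the point of the article is
    that an algorithm alone can't make this discrimination reliably.
    """
    if "inside joke" in report.reporter_note.lower():
        return Outcome.NO_ACTION
    return Outcome.OFFER_LIFELINE


def escalate(outcome: Outcome):
    if outcome is Outcome.OFFER_LIFELINE:
        # Offer a private chat with someone from the National Suicide
        # Prevention Lifeline, along with the organization's phone number.
        print("Offering private Lifeline chat and phone number.")
    else:
        print("No action taken on this report.")


if __name__ == "__main__":
    queue = ReportQueue()
    queue.submit(Report(priority=1,
                        post_url="https://facebook.com/...",
                        reporter_note="Status says life doesn't seem worth it."))
    escalate(human_review(queue.next_for_review()))
```

The design point the sketch tries to capture is the one Facebook emphasizes: machines can sort the queue, but a person decides what a post actually means.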