Facebook Doesn't Want You to Commit Suicide

Parsing the company's attempts to become more benevolent

This article is from the archive of our partner The Wire.

Facebook doesn't want you to kill yourself: that seems to be the message behind the company's decision to launch a partnership with Samaritans, a suicide watch and prevention organization in the UK. The way it works is that if anyone notices a Facebook friend posting potentially suicidal thoughts, he or she will be able to report the message to Facebook through its help center. According to the BBC, Facebook will then decide whether to report the issue straight to the police or contact Samaritans, and the twenty-four-hour organization would reach out to the party in question. Samaritans is based in Britain, so the change probably won't mean that much to those accessing the site from the U.S. Right now the "help center" page simply encourages U.S. users to call the National Suicide Prevention Hotline and suggests those in the U.K. contact Samaritans. The BBC reports that Facebook says it has always been its policy to notify police if a user is at risk of "imminent bodily harm."

So what's behind the move? There's the moral concern, of course. But there are also plenty of other reasons why Facebook might be taking this step now. Mark Zuckerberg may be man of the year to some, but the company's reputation has still taken a hit: aside from a negative film portrayal and a recent backlash in the press, Facebook's once squeaky-clean image has been blemished by its connection to a handful of suicides and bullying-related incidents. Last September, Rutgers freshman Tyler Clementi posted on Facebook "Jumping off the gw bridge sorry" before taking his own life, after his roommate had posted videos on the web of an intimate encounter between the 18-year-old and another male student.

More recently, a charity shop worker named Simone Back killed herself on Christmas in the UK, telling her 1,048 Facebook friends that she had taken all her pills and would "be dead soon"--yet no one raised the alarm until the following day, according to the Guardian. The case seemed to reveal the cynical side of Facebook friendships: it's inconceivable that 1,000 real friends would sit by after hearing a friend was trying to kill herself, yet on Facebook this is what transpired. In fact, "instead of trying to save her or get help, some of her online contacts left messages taunting her and arguing among themselves," the Guardian reports.

On a certain level, the change is unobtrusive. It may simply seek to mimic the concern people show for each other's safety in the public sphere. But it also raises questions about Facebook's potential policing of its users. It's one thing to report someone who is harassing you directly, as you currently can do in the Help Center for a variety of reasons. It's something else entirely to report other people for the content of their own messages, as is already done a bit in the Facebook Help Center's section on "Law Enforcement." In other words, even if unintentional, there may be a slight dose of Big Brother in that benevolence.

This article is from the archive of our partner The Wire.