January 11, 2012

Do You Know How to Report Suicidal Posts on Facebook?

by Kary Delaria


Last week, in my network, a man's Facebook post alluding to suicide caused an entire community to spring into action. Hundreds of people – many of them strangers – reached out in comments and shares, and contacted local law enforcement to help locate a man who said goodbye in a Facebook post and then went missing. Within 24 hours, the story was covered on the local news and spread nationally via social media. Fortunately for this man, his family, and his friends, he was found and is receiving the help he needs.

This is not the first post of this kind, and it certainly will not be the last. And not all instances will end as positively. Facebook knows this, and last December it launched a suicide-prevention program that lets Facebook friends alert Facebook to a post expressing suicidal intentions.

While I commend Facebook for taking this initiative (Google has a similar suicide-prevention effort), last week’s situation was a stark reminder that most people really aren’t aware that Facebook can, and will, get involved…

Are you aware of Facebook’s Suicide Prevention Program? 

Using the example from last week: hundreds of people, several of whom are heavy Facebook users and even work directly with social media, were involved in the outreach to this person. Did any of us know about this policy or how to use it? I’ve asked a few of my contacts, and awareness is minimal, at best. My colleague, Jen, was one of the first to respond during last week’s incident, and while she remembered reading about Facebook’s program, she had no idea what she should do in the moment.

Do you know what you can do to notify Facebook of a potentially suicidal post? 

From what I can tell, here’s how Facebook’s “flagging” procedure works (using a test post from Jen as an example only, so the content is obviously not something one would actually flag):

One of your friends posts something indicating they are depressed or suicidal:

Then, if you feel the post warrants a “flag,” go to the post on your friend’s wall/Timeline (this will not work within your news feed), hover over the right side of the post, and click the “x” so that the option to “report/mark as spam” appears:


Now, here’s where I have to admit I get a bit skeptical. Nothing about this tells me that I’m alerting Facebook that I am concerned for the health and well-being of this individual. But it gets more confusing. After clicking “Report/Mark as Spam,” this is what appears:

What does that mean, exactly? Did anything “happen” with the first click? Is a second click required? And still, this feels much more like reporting spam than reporting suicidal behavior. It suggests that a potentially suicidal post gets dumped into the same queue as potential spam, to be reviewed by someone at Facebook. Then, according to Facebook’s most recent announcement, if the reviewer agrees the person needs help, they’ll send her an email inviting her to chat with a counselor.

While this procedure might work in some instances, it worries me for any instance that truly requires immediate attention. In doing research for this post, I was made aware of another avenue…

Type “suicide” into the Facebook search box, and the first result is Facebook’s Help Center topics on suicide and suicide prevention. (Again, it bears repeating that I had no idea this existed, and I suspect I’m not alone.)

Expanding the first topic, you’ll see that Facebook encourages people who have seen suicidal content to contact law enforcement. Additionally, the content can be reported directly to Facebook, which will contact law enforcement and other professionals:

While I have not used this reporting feature myself, I did hear from one person who has, and she said it worked quite effectively.

Facebook is here to stay. Cries for help are no longer whispered to a few; they can be published to hundreds, if not more. If you haven’t already encountered this, chances are you will, soon. While the first, most immediate action in a dangerous situation is contacting the authorities, it’s important to know that other processes are in place, and no one should be afraid to use them to get people the help they need.


Tags

Facebook, suicidal content, suicide, suicide post, suicide prevention
