Protect Young Eyes

What is Suicide Porn? Instagram Blamed in Young Teen’s Death.

**WARNING – the stories, words, and images included in this post are extremely graphic. But every child who has Instagram has instant access to photos like these. With zero parental controls. 

“Instagram helped kill my daughter.”

Molly Russell, 14, took her own life in 2017 after viewing disturbing content about suicide on Instagram and Pinterest.

Who’s to blame when parents allow access to online content that they later discover is doing great harm to their children?

Speaking to the BBC, her father said he believed Instagram “helped kill my daughter”.

In his interview with UK periodical The Sunday Times, Molly’s father said that “the more I looked, the more there was that chill horror that I was getting a glimpse into something that was unknown to me and had such profound effects on my lovely daughter. We went to one [account] Molly was following and what we found was just horrendous. They seemed to be completely encouraging of self-harm, linking depression to self-harm and to suicide, making it seem inevitable, normal, graphically showing things like cutting, biting, burning, bruising, taking pills. It was there, hiding in plain sight. We only looked at two sites because they were so harrowing and that’s what began it.”

What is suicide porn?

According to Urban Dictionary, suicide porn means to look at dangerous objects or places and fantasize about causing yourself harm or death using them. The warm feelings this brings up are psychologically gratifying rather than sexual.

A Huffington Post article referred to the Netflix show 13 Reasons Why as nothing more than suicide porn because of the way it made a spectacle of suicide, promoted victimization, and glamorized revenge against the bullies.

A teen named Libby recently told the BBC that at the height of her involvement, at age 12, she was sharing pictures of her fresh cuts with 8,000 followers, who offered advice on how to make certain cuts that would produce the most blood.

Libby’s father was stunned by what he read in the comments.

“You shouldn’t have done it this way, you should have done it like that. Don’t do it here, do it there because there’s more blood.”

“That is not someone trying to help you – that is someone getting off on it,” according to her father.

Libby says she was easily drawn into a tribe where she found comfort among others who also struggled with anxiety, depression, suicide, and self-harm. But was this affinity positive or negative? In Libby’s words: “You start becoming a part of it – you get almost stuck to it. I was very hooked on it. It was almost like you had to keep up with it otherwise people would turn away and stop caring.”

Further on, Libby stated that her involvement with these individuals was an enabler because she would see them do horrible things and survive. In her mind, this told her, “I’m not that bad.”

What is Instagram doing to combat suicide porn?

During its investigation, The Sunday Times created a fake 14-year-old’s account on Pinterest, seeded with the right triggers to test the algorithm. Sure enough, pictures relating to suicide came through. And when the same newspaper made more than twenty complaints to Instagram about similar images, all of them were rejected.

Libby’s family had a similar experience. They attempted to report images to Instagram but received a response that the photos did not breach their community standards.

Although not related to self-harm, the Protect Young Eyes team has performed extensive testing of Instagram’s reporting function, flagging pornographic content that violates its stated community guidelines. Most requests are denied.

Related Post: Instagram Porn is Everywhere

Justifying its stance, Instagram said: “We do not allow content that promotes or glorifies eating disorders, self-harm or suicide and work hard to remove it.

“However, for many young people, discussing their mental health journey or connecting with others who have battled similar issues, is an important part of their recovery.

“This is why we don’t remove certain content and instead offer people looking at, or posting it, support when they might need it most.”

This just isn’t good enough. And the British Parliament agrees, stating this week that “social media giants face [a] legal duty of care to protect children from online harm.”

According to BBC correspondent Angus Crawford, Instagram has started making changes: restricting hashtags and no longer “recommending” self-harm accounts. Soon it will blur images of self-harm.

February 4, 2019 update – the UK health minister has requested a meeting with the head of Instagram, Adam Mosseri. Instagram has promised to start blurring self-harm images. We’ll see how well it does. It’s heartbreaking that a young girl had to die for Instagram to notice its content issues.

Examples of self-harm content found on Instagram 

If Instagram has taken steps to decrease or remove suicide porn, we have found no evidence of those efforts yet. The ability to find graphic, bloody images currently appears limitless. Some are included below. Be warned: the following images are gut-wrenching and dark. They also represent real humans who are suffering. Proceed with caution for yourself, compassion for the broken, and disgust toward social media organizations that can clearly do more. Every kid with Instagram has unrestricted access to these images.

**WARNING – proceed with care. We’ve applied a blue overlay to mute a bit of the impact of the photos.

[Blurred image: example of self-harm content found on Instagram]

5 ways parents can help kids use social media well. 

Good advice is worth repeating. These steps for parents are practical and have been shared in previous posts:

1. For detecting inappropriate content (like suicide porn) – Instagram doesn’t provide any parental controls. The only solution we recommend for monitoring Instagram and other social platforms is BARK. On Android, BARK can even alert parents to inappropriate searches in Explore (not on iOS yet). It’s a start and honestly, the best we can do for now, until Instagram gives us some help.


2. For mitigating bullying activity – get in your kid’s business, follow their account(s) from your own Instagram account (yes), and in their settings, enable Instagram’s “Comment Controls” to block abusive words and emojis. BARK also helps here with its algorithm that can detect hurtful and cruel words by sending a notification to a parent.

3. For finding fake accounts – get in your kid’s business, follow their account, review their following (because kids almost always follow their secret accounts), and read about Finsta accounts.

Related post: Finsta – How Well do You Know Instagram?

4. For protecting your child from predators – talk to them openly about predators and grooming, review their direct messages, use a private Instagram account, and get privacy and location settings locked down.

Related post: Digital Kidnapping – Your Kids and Social Media Privacy 

5. For doing something about social media anxiety, depression, or addiction – this one really comes down to parenting by creating healthy guardrails around your children who use social media and sticking to them. Note that Instagram has been named the worst app for mental health.

Conclusion – look them in the eye. Often.

If you suspect your child might be struggling emotionally, don’t be shy about getting the right help. With decades of combined experience working with teens at churches and schools, the Protect Young Eyes team is unfortunately very familiar with this issue.

The National Suicide Prevention Lifeline is 1-800-273-8255, and it also has a chat feature. Information about the signs and risk factors of suicide can be found at the American Foundation for Suicide Prevention.

If your child has social media, even if you’re uncomfortable getting into their account, then at least talk to them openly and honestly and use a service like Bark to sniff for problems.

Parents, we love BARK and how it helps parents AND kids. Here’s a real story…

“We knew our son was having some issues with school and in his social circle but he doesn’t talk to us about anything…he googled “What is it called when there’s a war going on inside your brain?”…The fact that he used the word “war” prompted BARK to mark it as violence…Call it depression or anxiety or regular mood swings teens experience, he wasn’t opening up to anyone about this and never mentioned it…I have a psych evaluation setup for him in a few days and I just have to say how grateful I am that BARK caught this. I would otherwise have no idea that this was even an issue for him and we can now get some professional help to ensure that it doesn’t become a true problem.”

Parents, do you want a better idea of what your kids are doing on social media? What about the comments on your daughter’s Instagram photos? Or, iMessage activity on your son’s iPhone? Then, look no further than Bark. You can start a 14-day free trial today.

*Note – links in this post might connect to affiliates who we know and trust. We might earn a small commission if you decide to purchase their services. This costs you nothing! We only recommend what we’ve tested on our own families. Enjoy!

2 Comments
  • Christine Stephens
    Posted at 13:38h, 02 February

    This article is right on. You could have described our daughter 6 years ago. I reported over and over to Instagram… nothing was done about it.
    Churches were barely aware of this 6 years ago let alone willing to talk about it even though then 1 in 10 girls were cutting.

    The glorification and romanticized way it is portrayed to developing preteen and teen eyes has got to be stopped at all levels.

    I’m reading this in line at the airport and just want to scream at the top of my lungs in pain for this generation. This darkness and deception is getting worse and worse. We need to be far more proactive and louder advocates for mental health and also self harm.

  • Chris McKenna
    Posted at 19:54h, 02 February

    Oh, I’m so sorry about your situation. I agree with the “scream at the top of my lungs in pain” part. You sound like a mom who gets it. You have my prayers.

    Chris
