4 Ways Pedophiles Exploit Instagram to Groom Kids
Pedophiles Trade Child Porn through Dropbox Links on Instagram
(Updated March 2, 2021) The Atlantic first reported that teenagers stumbled upon a network of Instagram accounts sharing Dropbox links of child porn (Atlantic article). Pedophiles were attaching certain hashtags to images that advertised how to get in touch. Teens discovered this and proceeded to spam the offending hashtags with hundreds of memes, making it difficult for pedophiles to find each other and trade illegal content.
Brilliant. Kids defending other kids!
Although it was an admirable diversion, these criminals are unfortunately resourceful. And with over a billion monthly users, it's impossible for Instagram to keep pace with nefarious activity.
Maybe your kid already uses Instagram. Great! I’m not saying you need to rip it away. In fact, that is often counterproductive. Instead, we hope this post will help you better understand that the way the app is designed creates risks.
Because remember, not all kids using Instagram end up being groomed and abused.
But if grooming and child exploitation are easy on the app, my guess is you would want to know. Even CNN recently reported that Instagram is the #1 app for child grooming.
If your son or daughter receives a private DM (direct message) from a stranger, does he or she know how to respond? It happens more easily than you think. Remember, wherever the kids are is where the predators are.
We simply want this post to shine a light in dark places. Since Apple's App Store description doesn't say anything about predatory activity, it's our job to tell the truth.
**Warning:** Some of the screenshots you will see in this post are not safe for work (NSFW) and include some of the most disturbing content we've encountered in over four years of researching social media. Nothing has been censored.
Four Grooming Paths on Instagram - Comments, Hashtags, Likes, and DMs
If Instagram leadership reads this post, they’ll try really hard to point to their community guidelines and their reporting channels, saying that they don’t allow predatory activity. But we would argue that the very way in which Instagram is designed creates grooming pathways. In other words – no amount of moderation or guidelines can change Instagram’s features. Allow us to explain.
Oh, and one more thing. Many parents who read this might think, "my child has a private account, so they're fine." That's a common but incorrect conclusion. None of the four feature issues we discuss below is impacted in any way by the privacy of an account. Anyone, whether their account is private or not, can post comments and search hashtags, and anyone can be seen in a photo's like count and sent a message via DM.
Pedophiles exploit Instagram’s comments to network with each other and fish for victims.
Within the comments, pedophiles find other pedophiles and peddle their illegal, disgusting content to each other. Here are a few samples from an endless stream of comments (warning: these comments are extremely disturbing).
You also see comments aimed directly at young people, a form of "fishing" for victims, waiting for a kid to bite.
Pedophiles exploit Instagram’s hashtags to drop horrible content into good, clean places.
Almost all social media platforms use #hashtags. Think of them as a card catalogue for social media content, a way to categorize millions and millions of images into groups so you can find exactly what you're looking for. We love them! Some people use them as a sort of witty second language.
But the problem is that they can be used by anyone.
Let’s say for a minute that I’m a teen girl who’s interested in modeling. Or cheerleading. And my mom even made me have a private Instagram account (good job, mom!).
I take a photo at the beach with my friends, and I attach the hashtags #teen #teengirl #teenmodel #snapchat. Fabulous. Later on, with my girlfriends, I’m thumbing through the #teenmodel and #snapchat hashtags, and I see this:
See, any predator can attach #teenmodel and #snapchat to their photo. This allows that photo to show up in front of millions of teen girls thumbing through #snapchat photos, and the predator waits for one of them to "bite."
Notice in one of the photos how part of the "sell" is to convince a girl to join him on Snapchat, a very secure environment for secretive activity. After all, >75% of teens have Instagram and >76% (AP Article) have Snapchat, so if a kid has one, they probably have the other.
In other words, #hashtags allow predators to hover over good places like a drone and drop their smut whenever they want. Pay attention to those screenshots: there's nothing pornographic about them. There are no swear words. No use of "sex." But the very nature of #hashtags as a feature creates this grooming path.
And if someone reports the "daddy" posts you see above and Instagram takes them down, no problem. Since Instagram doesn't require any identity verification (birthday, real email, credit card, NOTHING), a predator can create another fake account in seconds. This is yet another huge design flaw, one that creates a situation where pedophiles don't mind taking great risks and getting shut down. Their attitude is, "I'll just start over." [Note: we experienced this with "daddy," whom we reported multiple times. His account would be shut down, and then he would pop up with a slightly different username seconds later, posting the same horrifying images of himself masturbating and asking kids to connect with him "live."]
Related post: We Tested Instagram’s “No Nudity” Rule. We Can’t Show You the Results.
Predators exploit Instagram’s likes (the heart) to identify potential victims.
Going back to our #teenmodel example, if you click on one photo, you might find that it has hundreds of likes (hearts), similar to the photo of the young boy below (sorry, but if you don't want your photo in blog posts, then keep your account private).
Predators can click on the likes and see everyone who has liked the photo. Everyone. Even people with private accounts. From that list, a predator can identify someone young who looks interesting and send him or her a direct message (DM); we'll explain the DM feature in more detail next. But note how the "likes" feature creates a target audience for sexual predators. This is shown in the image below.
Again, it's a design flaw. The very nature of the likes feature creates a pool of young people for predators to target (to Instagram's credit, they are considering dropping the "like" count attached to photos, but so far, that remains speculation).
Which leads us to DMs. Direct Messages.
Predators exploit Instagram’s DMs (direct messages) to groom kids. And they’re doing it very successfully.
Two weeks ago, PYE created a test Instagram account. The account was clearly for a young girl, who posted two selfies on its first day. Tagged on these photos were the hashtags #teen, #teengirl, and #teenmodel. The account then "liked" a few photos with similar hashtags and followed similar accounts.
Not much happened for the first six days of the account.
Then, one week later, something in Instagram's algorithm was triggered. It was as if some combination of the test account's activity unleashed a tsunami of DM activity that hasn't let up over the past four days, averaging over 10 DMs per day. The screenshots below show some of the activity, including a very creative porn link. Note: PYE is the one who scribbled out the man masturbating in the image below. The photo was sent to our test account as a DM, completely exposed.
What if I have more questions? How can I stay up to date?
Two actions you can take!
- Subscribe to our tech trends newsletter, the PYE Download. About every 3 weeks, we’ll share what’s new, what the PYE team is up to, and a message from Chris.
- Ask your questions in our private parent community called The Table! It’s not another Facebook group. No ads, no algorithms, no asterisks. Just honest, critical conversations and deep learning! For parents who want to “go slow” together. Become a member today!