11 Aug We Tested Instagram’s “No Nudity” Rule. We Can’t Show You the Results.
For years, we’ve been telling parents about content on Instagram. We even coined a phrase, Instaporn, to characterize the sexualized content that seemed to be invading a once-pure photo app. Our recent testing confirms that although Instagram has clear Community Guidelines, it isn’t enforcing them. Both Forbes and Fight the New Drug agree and have featured our research in their own publications.
Instagram’s Community Guidelines Are Clear
Instagram’s Community Guidelines make the following brief and clear statement:
“The Short – We want Instagram to continue to be an authentic and safe place for inspiration and expression. Help us foster this community. Post only your own photos and videos and always follow the law. Respect everyone on Instagram, don’t spam people or post nudity.”
It expands on “The Short” with additional explanation here:
We know that there are times when people might want to share nude images that are artistic or creative in nature, but for a variety of reasons, we don’t allow nudity on Instagram. This includes photos, videos, and some digitally-created content that show sexual intercourse, genitals, and close-ups of fully-nude buttocks. It also includes some photos of female nipples, but photos of post-mastectomy scarring and women actively breastfeeding are allowed. Nudity in photos of paintings and sculptures is OK, too.
This reads pretty clearly to us. But, are these guidelines enforced?
We Tested Instagram’s “No Nudity” Rule
For anyone who has used Instagram for any period of time, you know that the gold is in the #hashtags. Heck, now you can even follow #hashtags as if they were living beings. Pornographers depend on #hashtags to make their content findable. After all, what good is online porn if no one can find it?
Here’s what we did. It was simple, really. In July 2018, we selected five #hashtags that are known offenders when it comes to inappropriate content. For five days, we reported these #hashtags using Instagram’s self-reporting feature, at least 10 times per day, telling Instagram that posts under the #hashtag contained inappropriate content and/or that the #hashtag itself was inappropriate. In all, each #hashtag was reported at least 50 times over the five-day window. Then we watched to see if it made a difference.
The Results of our Instagram Testing
Trigger warning: we’ve done everything possible to remove as much visually stimulating content as possible from the screenshots below. But the words used to describe what’s occurring in the videos and photos could be triggering to some individuals. Unfortunately, the 50% of 5th graders who regularly tell us they use Instagram have this content at their fingertips.
After we post this, people will try to replicate our testing. But they won’t be able to. Instagram blocks certain hashtags after an unknown number of complaints, and the explicit content simply finds another home. We can only show you what we discovered at a point in time in July 2018.
Notice that this #hashtag showed 1.6M posts on each day of our test.
Here, the total number of posts carrying the #pornporn hashtag actually increased after five days of reporting.
The sheer quantity of posts that carried the #sex hashtag was mind-boggling. Why was this hashtag even allowed? What content using this hashtag would ever comply with Instagram’s Community Guidelines? Even the thumbnail image was pornographic. Notice the violence associated with sex in several of the screenshots. We don’t show it here, but on day three of our reporting of the #sex hashtag, a video appeared at the very top of the image feed of a man literally strangling a woman with his private parts, her eyes rolled back. It even populated the thumbnail image. It left us horrified.
Instagram tends to struggle with non-English characters. With a simple accent or tilde, pornographers and amateurs can evade the minimal text filters that sporadically seem to be in place. We’re confident Instagram has the digital intelligence to overcome this issue (see recommendations below).
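To illustrate how this kind of evasion could be countered, a moderation pipeline might normalize hashtags before matching them against a blocklist, so that an accented lookalike maps to the same entry as the plain word. This is a minimal sketch under our own assumptions, not Instagram’s actual approach; the function name is hypothetical:

```python
import unicodedata

def canonicalize_hashtag(tag: str) -> str:
    """Map accented lookalike hashtags to a plain form so they hit
    the same blocklist entry. Illustrative sketch only; this is not
    Instagram's actual pipeline."""
    # NFKD decomposition splits a character like an n-with-tilde into
    # the base letter plus a combining mark, which we then discard.
    decomposed = unicodedata.normalize("NFKD", tag.lower().lstrip("#"))
    return "".join(c for c in decomposed if not unicodedata.combining(c))
```

With this in place, a hashtag dressed up with a tilde or diaeresis to dodge a naive string match canonicalizes back to the plain banned term and can be blocked like any other.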
February 2019 update – the “siki” prefix on a number of hashtags continues to be problematic. As do a whole list of foreign words that we won’t list here.
February 2019 update – we’ve noticed that neither the hashtag “hot” nor the double-dot version shows up in Explore search results anymore. These two hashtags were problematic for years. We’re hopeful that this removal is permanent.
And, after five days, we received no message from Instagram, even though they had a working email for us in the testing profile. Also note that in all five of the #hashtags we tested, none of our reporting had any discernible impact on the overall number of photos available in each #hashtag.
Based on our Testing, Instagram Needs to do More
We don’t have all of the answers, but we’re left with an overall feeling that an organization with Instagram’s technological aptitude and noble beginnings should have the ability and desire to do more. Listen, we know that an unfortunately large percentage of human beings will go out of their way to corrupt anything good. We’re reasonable enough to recognize that Instagram porn will always be a problem at some level. It must be incredibly difficult to control the content being posted by millions and millions of people.
But, honestly, that’s not our problem. That’s Instagram’s problem. Non-compliance with their own Community Guidelines is something they need to own. We’re not the only ones who feel this way. A September 2018 piece from Forbes was equally shocked at what it discovered on Instagram.
We’re sure that engineers and content managers at Instagram could come up with a hundred brilliant ways to fix these problems in one afternoon. We’ve included a short list of ways to make Instagram a safer, more productive place for kids:
- Raise the difficulty of creating an account. You don’t even need a working email address or age verification (e.g., a birthday) of any kind to create an account. Even Snapchat has a birthday requirement that allows it to gate certain content.
- Use machine learning to identify and block porn. The technology exists, and we know you have it. Even the Monkey app has started using AI to identify and block inappropriate content. If you’re already using it, it’s not doing a very good job.
- Do more to prevent obviously non-compliant hashtags from being used. Some might claim free speech violations here. Sure. But, when would using the hashtag #pornporn ever be consistent with your stated Community Guidelines? Either block using certain hashtags or change the Guidelines.
- Hire more humans to review and remove content that violates your Community Guidelines. It just feels like there aren’t any humans on the other side of the line when we submit a complaint about a post or #hashtag. With only 500 employees and over one billion monthly users, each of your employees is responsible for 2M people. It’s just not possible. The signs that an account is posting filth are obvious: if the username is “sglknoiwer1092k,” the account has 2 posts, but it has 500 followers, then obviously something isn’t right. Those three attributes just described a LARGE number of pornographic accounts that you could do something about TODAY.
- Create a safe mode for parents who want to teach their kids how to use Instagram. Please give parents something more than private accounts and blocking cruel words. Maybe a safe mode doesn’t include Explore. Or at least make Explore safer, because hashtags are a mess, and since kids can wipe their search history, it’s like having a built-in incognito mode. Maybe a safe mode removes disappearing direct messages AND the ability to receive direct messages from anyone (which contributes to grooming risk). Maybe a safe mode doesn’t include one-click integration with IGTV in the upper-right corner, with its ever-growing quantity of sexualized, longer-form video content. Right now it’s impossible to make Instagram safe enough for most kids.
- Open up just a little more of your API to companies with noble missions. BARK comes to mind. We love its mission of allowing kids to use platforms like Instagram while alerting parents to anything potentially harmful. They’re your ally.
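The account-attribute pattern from the staffing point above (a random-looking username, almost no posts, hundreds of followers) is simple enough to sketch in a few lines. The thresholds and the check for digits are our own illustrative assumptions, not anything Instagram has published:

```python
import re

def looks_suspicious(username: str, post_count: int, follower_count: int) -> bool:
    """Flag accounts matching the pattern described in the text: a
    long, random-looking username containing digits, almost no posts,
    but hundreds of followers. Thresholds are illustrative guesses,
    not Instagram's rules."""
    random_looking = len(username) >= 10 and bool(re.search(r"\d", username))
    return random_looking and post_count <= 3 and follower_count >= 500
```

Run against the example from the text, “sglknoiwer1092k” with 2 posts and 500 followers is flagged, while an ordinary account with real posting activity passes. Even a crude rule like this could triage accounts for the human reviewers we are asking for.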
The main point we want Instagram to hear is this: “If you want to be known as an app that cares about kids, then you need to do more. Your stated values and your corporate behaviors are inconsistent. Please fix this.”
What Can Parents Do Today to Protect Their Kids?
When it comes to inappropriate content (like Instagram Porn), Instagram doesn’t provide parental controls. The only solution we recommend for monitoring Instagram is BARK. On Android, BARK can even alert parents to inappropriate searches in Explore (not on iOS yet). It’s a start and honestly, the best we can do for now, until Instagram gives us some help.
For mitigating bullying activity – get in your kid’s business, follow their account(s), and in their settings, enable Instagram’s “Comment Controls” to block abusive words and emojis. BARK also helps here with its algorithm that can detect hurtful and cruel words with a notification to a parent.
For finding fake accounts – get in your kid’s business, follow their account, review their following (because kids almost always follow their secret accounts), and read about Finsta accounts.
Related post: Finsta – How Well do You Know Instagram?
For protecting your child from predatory activity – talk to them openly about predators and grooming, review their direct messages, and get privacy and location settings locked down.
Related post: Digital Kidnapping – Your Kids and Social Media Privacy
For doing something about social media anxiety, depression, or addiction – this one really comes down to parenting by creating healthy guardrails around your children who use social media and sticking to them. Please don’t hesitate to find professional help for your child if you suspect any of these issues.
Related post: Snapstreak Addiction. Why Kids Can’t Put Snapchat Down.
Instagram, It’s Time for Action. Families Need You!
The app started because founder Kevin Systrom’s wife wanted a more beautiful photo while vacationing in Mexico. It had a noble, good beginning.
But, to Adam Mosseri and the rest of Instagram’s current leadership, ask yourself if Instagram is still a noble app. One that has the right resources in place to maintain at least a minimal level of compliance with its stated Community Guidelines. If not, will you commit to doing more? Is Instagram an app that you’re proud to put into the hands of each of your 13-year-old sons and daughters without fear or apprehension?
Because, that’s what you’re asking each of us to do. We look forward to the changes you might decide to make.
Now what? Parents, Have you Heard of BARK?
Parents, do you want a better idea of what your kids are doing on social media? What about the comments on your daughter’s Instagram photos? Or, iMessage activity on your son’s iPhone? Then, look no further than Bark. We trust it and think you might like it, too.
*Note – links in this post might connect to affiliates who we know and trust. We might earn a small commission if you decide to purchase their services. This costs you nothing! We only recommend what we’ve tested on our own families. Enjoy!
I love life. Seriously! Each. Day. A. Gift. Former CPA, business advisor, youth pastor, development director, porn survivor. Current marketing manager for Covenant Eyes and CEO of PYE. God shares wild ideas with me about life while I run. I love guiding parents to teach their kids how to use technology well while protecting them from the bad stuff.