We Tested Instagram’s “No Nudity” Rule. We Can’t Show You the Results.


For years, we’ve been telling parents about content on Instagram. We even coined a phrase, Instaporn, to characterize the sexualized content that seemed to be invading a once-pure photo app. Our recent testing confirms that although Instagram has clear Community Guidelines, it isn’t enforcing them. Both Forbes and Fight the New Drug agree and have featured our research in their own publications.

Instagram’s Community Guidelines Are Clear

Instagram’s Community Guidelines make the following brief and clear statement:

“The Short – We want Instagram to continue to be an authentic and safe place for inspiration and expression. Help us foster this community. Post only your own photos and videos and always follow the law. Respect everyone on Instagram, don’t spam people or post nudity.”

It expands on “The Short” with additional explanation here:

We know that there are times when people might want to share nude images that are artistic or creative in nature, but for a variety of reasons, we don’t allow nudity on Instagram. This includes photos, videos, and some digitally-created content that show sexual intercourse, genitals, and close-ups of fully-nude buttocks. It also includes some photos of female nipples, but photos of post-mastectomy scarring and women actively breastfeeding are allowed. Nudity in photos of paintings and sculptures is OK, too.

This reads pretty clearly to us. But, are these guidelines enforced?

We Tested Instagram’s “No Nudity” Rule

For anyone who has used Instagram for any period of time, you know that the gold is in the #hashtags. Heck, now you can even follow #hashtags as if they were living beings. Pornographers depend on #hashtags to make their content discoverable. After all, what good is online porn if no one can find it?

Here’s what we did. Simple, really. In July 2018, we selected five #hashtags that are known offenders when it comes to inappropriate content. For five days, we reported each of these #hashtags at least 10 times per day using Instagram’s self-reporting feature, telling Instagram that the #hashtag was hosting inappropriate content and/or that the #hashtag itself was inappropriate. That means each #hashtag was reported at least 50 times over the five-day timeframe. We wanted to see if it made a difference.

The Results of our Instagram Testing

Trigger warning: we’ve done everything possible to remove as much visually stimulating content as possible from the screenshots below. But the words used to describe what’s occurring in the videos and photos could be triggering to some individuals. Unfortunately, the roughly 50% of 5th graders who regularly tell us they use Instagram have this content at their fingertips.

After we post this, people will try to replicate our testing. But you can’t. Instagram blocks certain hashtags after an unknown number of complaints, and the explicit content simply finds another home. We can only show you a point-in-time snapshot of what we discovered in July 2018.


Instagram Porn Problem Sexyvideo

Notice that the post count for this #hashtag held at 1.6M on each day of our testing.


Instagram Pornporn

Here, the total number of posts tagged with #pornporn actually increased after five days of reporting.

Instagram Porn - sex

The sheer quantity of posts that boasted the #sex hashtag was mind-boggling. Why was this hashtag even allowed? What content using this hashtag would ever comply with Instagram’s Community Guidelines? Even the thumbnail image was pornographic. Notice the violence associated with sex in several of the screenshots. We don’t show it here, but on day three of our reporting of the #sex hashtag, a video of a man literally strangling a woman with his genitals, her eyes rolled back, sat right at the top of the image feed. It even populated the thumbnail image. It left us horrified.

Instagram tends to struggle with non-English characters. Meaning, with a simple accent or tilde, pornographers and amateurs can evade the minimal filtering controls that sporadically seem to be in place. We’re confident Instagram has the digital intelligence to overcome this issue (see recommendations below).

February 2019 update – the “siki” prefix on a number of hashtags continues to be problematic, as does a whole list of foreign words that we won’t list here.
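Our best guess at how Instagram could close this loophole: normalize every hashtag to a canonical form before checking it against a blocklist. Here’s a minimal Python sketch of the idea — the blocklist and function names are ours and purely illustrative, not anything Instagram has published:

```python
import unicodedata

# Illustrative blocklist -- ours, not Instagram's actual list.
BLOCKED_TAGS = {"sex", "porn", "hot"}

def canonical(tag: str) -> str:
    """Lowercase the tag and strip accents/tildes, so evasive
    spellings like 'séx' or 'hõt' collapse to their base form."""
    decomposed = unicodedata.normalize("NFKD", tag.casefold())
    return "".join(ch for ch in decomposed if not unicodedata.combining(ch))

def is_blocked(tag: str) -> bool:
    """A tag is blocked if its accent-stripped form is on the list."""
    return canonical(tag) in BLOCKED_TAGS
```

With something like this in place, #séx, #hõt, and similar accent-swapped variants would all be caught by the same rule that blocks the plain-ASCII hashtag.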

Instagram Porn Problem - Hot

February 2019 update – we’ve noticed that neither the hashtag “hot” nor its accented “double-dot” (umlaut) variant shows up in Explore search results anymore. These two hashtags were problematic for years. We’re hopeful that this removal is permanent.

And after five days, we received no message from Instagram, even though we had a working email in the testing profile. Also note that for all five of the #hashtags we tested, none of our reporting had any discernible impact on the overall number of photos available under each #hashtag.

Based on our Testing, Instagram Needs to do More

We don’t have all of the answers, but we’re left with an overall feeling that an organization with Instagram’s technological aptitude and noble beginnings should have the ability and desire to do more. Listen, we know that an unfortunately large percentage of human beings will go out of their way to corrupt anything good. We’re reasonable enough to recognize that Instagram porn will always be a problem at some level. It must be incredibly difficult to control the content being posted by millions and millions of people.

But honestly, that’s not our problem. That’s Instagram’s problem. Non-compliance with its own Community Guidelines is something Instagram needs to own. We’re not the only ones who feel this way. A September 2018 piece in Forbes expressed similar shock at what its writers discovered on Instagram.

We’re sure that engineers and content managers at Instagram could come up with a hundred brilliant ways to fix these problems in one afternoon. We’ve included a short list of ways to make Instagram a safer, more productive place for kids:

  • Raise the difficulty of creating an account. Right now you don’t need a working email address or age verification (e.g., a birthday) of any kind to create an account. Even Snapchat has a birthday requirement that allows it to gate certain content.
  • Use machine learning to identify and block porn. The technology exists; we know you have it. Even the Monkey app has started using AI to identify and block inappropriate content. If you’re using it, it’s not doing a very good job.
  • Do more to prevent obviously non-compliant hashtags from being used. Some might claim free speech violations here. Sure. But, when would using the hashtag #pornporn ever be consistent with your stated Community Guidelines? Either block using certain hashtags or change the Guidelines.
  • Hire more humans to review and remove content that violates your Community Guidelines. It feels like there aren’t any humans on the other side of the line when we submit a complaint about a post or #hashtag. With only 500 employees and over one billion monthly users, each of your employees is responsible for roughly 2 million people. It’s just not possible. The signs that an account is posting filth are obvious – if the username is “sglknoiwer1092k” and the account has 2 posts but 500 followers, then something clearly isn’t right. Those three attributes just described a LARGE number of pornographic accounts that you could do something about TODAY.
  • Create a safe mode for parents who want to teach their kids how to use Instagram. Please give parents something more than private accounts and blocking cruel words. Maybe a safe mode doesn’t include Explore – or at least make Explore safer, because hashtags are a mess, and since kids can wipe their search history, it’s like having a built-in incognito mode. Maybe a safe mode removes disappearing direct messages AND the ability to receive direct messages from anyone (which contributes to grooming risk). Maybe a safe mode doesn’t include a one-click integration with IGTV in the upper-right corner, with its ever-growing quantity of sexualized, longer-form video content. Right now it’s nearly impossible to make Instagram safe enough for most kids.
  • Open up just a little more of your API to noble companies. BARK comes to mind. We love its mission of allowing kids to use platforms like Instagram while alerting parents to anything potentially harmful. They’re your ally.
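The account-attribute pattern described in the list above (a keyboard-mash username, a couple of posts, hundreds of followers) is simple enough to express in a few lines of code. This is our own illustrative sketch — the thresholds and the gibberish pattern are guesses, not anything Instagram uses:

```python
import re

def looks_suspicious(username: str, post_count: int, follower_count: int) -> bool:
    """Flag accounts matching the pattern above: a keyboard-mash
    username, almost no posts, but hundreds of followers.
    Thresholds are illustrative, not Instagram's."""
    # Crude "gibberish" test: a run of letters, then 2+ digits, then anything.
    gibberish = re.fullmatch(r"[a-z]+\d{2,}[a-z0-9]*", username) is not None
    return gibberish and post_count <= 5 and follower_count >= 500

# The example account attributes from our list above:
looks_suspicious("sglknoiwer1092k", 2, 500)   # True
```

A real system would combine dozens of signals, but even a crude rule like this one would surface a large slice of the accounts we stumbled across during testing.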

The main point we want Instagram to hear is this: “If you want to be known as an app that cares about kids, then you need to do more. Your stated values and your corporate behaviors are inconsistent. Please fix this.”

What Can Parents Do Today to Protect Their Kids?

When it comes to inappropriate content (like Instagram porn), Instagram doesn’t provide parental controls. The only solution we recommend for monitoring Instagram is BARK. On Android, BARK can even alert parents to inappropriate searches in Explore (not yet on iOS). It’s a start and, honestly, the best we can do for now, until Instagram gives us some help.

Bark Parental Controls

For mitigating bullying activity – get in your kid’s business, follow their account(s), and in their settings, enable Instagram’s “Comment Controls” to block abusive words and emojis. BARK also helps here with its algorithm that can detect hurtful and cruel words with a notification to a parent.

For finding fake accounts – get in your kid’s business, follow their account, review their following (because kids almost always follow their secret accounts), and read about Finsta accounts.

Related post: Finsta – How Well do You Know Instagram?

For protecting your child from predatory activity – talk to them openly about predators and grooming, review their direct messages, and get privacy and location settings locked down.

Related post: Digital Kidnapping – Your Kids and Social Media Privacy 

For doing something about social media anxiety, depression, or addiction – this one really comes down to parenting by creating healthy guardrails around your children who use social media and sticking to them. Please don’t hesitate to find professional help for your child if you suspect any of these issues.

Related post: Snapstreak Addiction. Why Kids Can’t Put Snapchat Down.

Instagram, it’s Time for Action. Families Need You!

The app started because founder Kevin Systrom’s wife wanted a more beautiful photo while they were vacationing in Mexico. It had a noble, good start.

But, to Adam Mosseri and the rest of Instagram’s current leadership, ask yourself if Instagram is still a noble app. One that has the right resources in place to maintain at least a minimal level of compliance with its stated Community Guidelines. If not, will you commit to doing more? Is Instagram an app that you’re proud to put into the hands of each of your 13-year-old sons and daughters without fear or apprehension?

Because, that’s what you’re asking each of us to do. We look forward to the changes you might decide to make.

Now what? Parents, Have you Heard of BARK?

Parents, do you want a better idea of what your kids are doing on social media? What about the comments on your daughter’s Instagram photos? Or, iMessage activity on your son’s iPhone? Then, look no further than Bark. We trust it and think you might like it, too.

*Note – links in this post might connect to affiliates who we know and trust. We might earn a small commission if you decide to purchase their services. This costs you nothing! We only recommend what we’ve tested on our own families. Enjoy!


18 thoughts on “We Tested Instagram’s “No Nudity” Rule. We Can’t Show You the Results.”

  1. Thank you for this enlightening article. I appreciate your desire to protect not only your kids but ALL kids from inappropriate and life-damaging web content!

  2. Instagram is making excuses other platforms, like Whisper, have chosen not to make. Here’s what other platforms have done:

    IP block porn accounts. This blocks the COMPUTER being used to post pornography not the content itself

    Allow users to report illegal content directly to law enforcement. When you flag a post on whisper multiple reasons are given. If you’re reporting child endangerment, illegal activities or threats law enforcement is contacted directly.

    Credit card verification. Yes, the community will be smaller. That might not be a bad idea.

    Enlist the help of legitimate pornographers. Most licensed porn studios do not post pornographic content. A union could collectively sell Instagram and other sites that give away free porn as damaging their business.

  3. It’s ridiculous the number of actual porn clips I’ve reported only to receive a reply that they find no fault with what I reported. Are they actually looking at what I reported? My daughter was using it, but I told her not to log in due to the pollution of porn. To my surprise, she actually uninstalled it from her tablet.

  4. Good on you for trying to do something about this. How do you report accounts that could be deemed not appropriate? I have come across images that I don’t think are appropriate for instagram, they are generally of scantily clad women leaving little to the imagination. They are not in anyway art. I understand a person working as a model may have a lingerie shot or 2, but it should be tasteful. Is this really what instagram is going to become, getting followers by wearing very little.

  5. You know what, Instagram isn’t for little kids. Stop pushing your stupid ethics on everyone else. If you can’t handle life, get a commune and move all your bible thumping sheep friends and stay there with no TV, no internet, no phones and stay out of the real world.

    1. Right, not for little kids. But still full of middle school students. Is what you see there appropriate for 13-year-olds? If you want to view it as an adult, that’s your choice. We’re not advocating censorship. Just duty of care toward kids. We do this in every other place where kids spend time. Quit pretending digital spaces get some kind of free pass. Not anymore.

  6. Well, I’ll say I’m totally against Porn anywhere. It’s ruined a 30 yr marriage. It needs to go back to stores to purchase. I see the mention of Instagram. Yahoo Lifestyle we’ll blow your mind which will connect to Instagram. I seen my husband look up pawg and was I in shock. You can also put that in on Facebook. You can pick what type of gender, hair color, race, size and age, yes Teens to Milfs. Your child or spouse can interact with them. Made me sick to my stomach. I would love to learn everything about it being blocked on any device here at home and it not be accessible when he walks out the door. It’s worse than drugs and these poor young kids are subjected to it everywhere you look on the internet and will never understand the true meaning of a relationship because of what they see. Please check out that Pawg site..It needs to go away. I’m reading your articles of stopping devices from seeing it but it’s a lot of homework. Thank you for helping us out and I needed to share the site I sent.

  7. Here’s a thought…instead of calling for the world and the internet to become this big boomer esque 1950s safe space, why don’t you just parent your kids. A start would be oh I don’t know, NOT LETTING THEM USE INSTAGRAM IF YOU’RE AFRAID THEY WILL SEE NUDITY!!! You can’t shelter your kids forever, sooner or later they have to grow up and see adult things in the world. I just got one of my posts on IG removed, do you know what it was, I’ll let you guess… it was a joke where an overweight guy was sucking his man boob. IG removed it because it was “sexual” apparently, although you people would probably condone removing a post because a girl shows a small amount of clevage in a non sexual way through her shirt, and she God forbid isn’t wearing a turtleneck!

  8. This author is a vagina. If you don’t look for it you won’t find it on Instagram. So don’t look at it. BTW police your own kids. It’s not up to everyone else to do it for you. That’s called bad parenting if you leave it up to other people.

  9. This is actually very important. You can’t just surveil or police your kids like that and expect things to just fall into place. Even if you do that, there are times when you’re absolutely NOT going to be with them to tell them when stuff isn’t a good idea. And it’s very likely for kids to stumble upon inappropriate content through tags or from the idea of possible content being introduced directly. So there does need to be a compromise here. You can literally go to any other porn/adult site to find the content you want, so let kids be able to surf the internet worry-free. The internet is a great place for people when used and monitored responsibly. My niece used to watch porn since the second grade until recently, when she started having intolerable views toward sex. Even a joke or a notion toward it bothers her, and I truly believe it was from the unfortunate exposure to such content at a time when she was NOT READY for that. Plus, if content such as somebody shaking their man tiddies as a joke is flagged, 9/10 that’s just an AI error… it’s not people reforming the internet to the freaking ’20s…

  10. I have reported several accounts for nudity and inappropriate content, and I always get the same message back from Instagram stating that the accounts do not violate guidelines. Well, they obviously do, so what’s the deal? And for those of you who say don’t go looking for it – these constantly come up in feeds and suggestions to follow, even though I block everybody’s content I do not wish to see. What happened to teaching our children modesty? If you pervs want to see porn, then go to a porn website and keep this crap off Instagram and other platforms that should be family friendly.

  11. First off, how about you police your children instead of letting the internet do it for you. Secondly, most people who may show partial nudity have their profiles set to private. So not just anyone can see it. Third, and I know this is going to be difficult for you super puritanical types to grasp but NOT ALL NUDITY IS PORN.

  12. They claim you can’t show nipples. I just reported a woman wearing a top that can best be described as a fishing net. With huge open holes that showed her nipples literally popping through. Whole breast and nipples showing while looking all cheesy and overly posed. Instagram determined that was not nudity. However if an artist draws or paints a picture of a woman that has her breasts exposed, they must fuzz the nipples out or face having their photo being taken down.

    I submitted it for review again and again it was turned down as not going against the nudity rules. I don’t get it. She’s nude, her breast are completely exposed. There is no hiding of anything. Instagram must be run by a bunch of 20 year old dorks.

  13. Literally pathetic. Get a life. Try spending more time with your kids instead of policing the internet for what YOU believe is wrong. You’re demonization of nudity is what’s actually wrong with this world.
