
The Case for Accurate and Accountable App Ratings. 3 Layers to Protect Our Kids.


Summary: No one is holding technology companies accountable for the impact their technology has on the hearts and minds of young people. Protect Young Eyes, in conjunction with the National Center on Sexual Exploitation and child advocate Melissa McKay, has released a joint statement calling for an accurate app rating system and easy, intuitive parental controls. This post lays out our 3-layered approach to protecting kids more effectively.

Full Statement from NCOSE

Three Stories. One Problem.

Teen Sues Snapchat for Explicit Content.

In 2016, a 14-year-old boy and his mom filed a lawsuit against Snapchat over content that appeared in the Discover section, which features stories, video, and articles from a variety of media partners. The Discover section cannot be turned off – every person who has Snapchat has the Discover section.

This teen was offended by content ranging from “23 Pictures That Are Too Real If You’ve Ever Had Sex With A Penis,” which showed Disney characters having sex, to articles like “10 Things He Thinks When He Can’t Make You Orgasm,” which showed two dolls having sex with a puppy in the background, among others.

All of this in an app with an app rating of 12+ in Apple’s App Store. That’s seventh grade.

Teens Step Up to Spam Suspected Child Porn Hashtags on Instagram.

On January 8, 2019, The Atlantic ran an article about a network of Instagram hashtags that connected hundreds of child pornographers, who used Instagram direct messages to share links to Dropbox folders of illegal content. When teenagers discovered what was going on, they shrewdly started spamming these hashtags with memes to clutter the content and disrupt the transactions.

Power to the kids! It’s brilliant.

All of this in an app with an app rating of 12+ in Apple’s App Store. That’s seventh grade.

Dad Blames Instagram for the Suicide of His 14-year-old Daughter, Molly. 

In a story that has caught the attention of the world, including the attention of Instagram CEO Adam Mosseri, the father of Molly Russell blames the social media giant for the death of his daughter. Molly took her life in 2017 after viewing disturbing content about suicide, sometimes called “suicide porn,” on Instagram and Pinterest.

In his interview with UK periodical The Sunday Times, Molly’s father said, “We went to one [account] Molly was following and what we found was just horrendous. They seemed to be completely encouraging of self-harm, linking depression to self-harm and to suicide, making it seem inevitable, normal, graphically showing things like cutting, biting, burning, bruising, taking pills. It was there, hiding in plain sight…”

Since these allegations, Instagram has started placing graphic images of self-harm behind sensitivity screens that must be clicked through before the images can be viewed.

All of this in an app with an app rating of 12+ in Apple’s App Store. That’s seventh grade.

Friends, these stories show us a consistent problem that we can’t ignore. It’s not until parent advocacy groups scream, teenagers act like adults, and young people die that anyone does anything. No one is holding these apps accountable for their actions.

Smart Parents Are Struggling to Help Their Kids Use Technology Well.

Here’s the thing. I want kids to use Instagram well. The beauty of its photos! Amazing. I want kids to learn how to collaborate with their classmates on a group assignment through Google Hangouts. I want kids to create unique and inspiring videos and post them to YouTube for the world to enjoy.

But, even tech-savvy parents find it incredibly difficult to figure out how to protect their kids from the junk while at the same time nurturing and guiding their children to use technology well. They tell us this constantly.

  • Chris, why can’t I turn off the Explore feature on Instagram and stop my kid from clearing his search history? Doesn’t Instagram care that porn is always 7 seconds away from EVERY kid? I don’t know. 
  • Chris, why can’t I prevent my daughter from deleting her iMessage history so that I can help hold her accountable for how she uses this app? Doesn’t Apple know that parents desperately want this? I don’t know.
  • Chris, why is there no way to lock in YouTube’s Restricted Mode in the app itself without using a clean DNS or some parental control? Don’t they know that millions of kids love YouTube? I don’t know.
  • Chris, why did Google remove Supervised Users from Chromebooks with one week’s notice, and why, a year later, are there still no real parental controls that work on Chromebooks, even though millions of kids use these devices for education? Doesn’t Google care about families? I don’t know.
  • Chris, why does Snapchat continue to release features and content that completely ignore the fact that they depend on a young user base to keep them afloat? Heck, even their CEO’s former Stanford professor is calling them out! I don’t know.

So parents are left with three choices. Bubble-wrap their kids from tech (wrong). Let their kids figure out tech on their own (wrong). Or, toil in the middle to stay current on the latest trends, issues, risks, and controls (tough).

What if tech companies saw it as their duty to partner with families to get this right? What if these organizations sincerely sought to benefit the lives of the very humans that they profit from? We can dream, can’t we?

In January 2018, two large Apple investors took the unusual step of sending a letter to the company, urging it to create better iPhone parental controls. Later that year, Apple shipped Screen Time in iOS 12, which is a start. But we believe much more is needed.


A Three-layered Approach to Protecting Kids.

Layer One: Create a Consistent and Enforceable App Rating System.

When taking our children to movies, we depend on the MPAA rating system to tell us what to expect. When listening to music, inappropriate content is tagged as having “explicit content.” Video game retailers refuse to stock games that have not been rated by the Entertainment Software Rating Board (ESRB). These rating systems are clearly understood, enforced, trustworthy, and exist to protect the innocence of minors.

Back in 1993, violence in Mortal Kombat triggered congressional hearings, which determined that video game companies were irresponsibly marketing mature content to minors. It was animated, cartoon violence that motivated lawmakers to change an industry.

Where’s the outcry today? Why isn’t more being done to protect our young people? Since when are hardcore pornography, real violence, graphic self-harm, animal abuse, child pornography, easy sexting (trigger warning with this link), sextortion, glorified substance abuse, and an underground sex trafficking arena acceptable for seventh graders in an app rated 12+? Why does Apple tell parents that their child’s favorite social media platform is appropriate for 12-year-olds, when the Children’s Online Privacy Protection Act (COPPA) is the reason those same platforms require users to be at least 13? Why do different mobile app stores rate apps differently?

All of this furthers our belief that the current app rating system is inconsistent, out of alignment with current legislation, and inaccurate in its definition of what is proper for young people. These factors confuse parents and leave children exposed to mature, harmful content.

Fortunately, the ESRB has shown us how to fix this issue. Possible steps forward for more accurate and consistent mobile app ratings include: 

  1. Creating an independent, self-regulatory body that assigns app ratings.
  2. Developing rating categories that are consistent with current laws and expectations.
  3. Developing an agreed-upon set of category descriptors (critical).
  4. Creating a screening process for all app developers.
  5. Requiring Google and Apple to only accept apps that have a certified rating summary.
  6. Creating methods of enforcement that promote compliance.

Apple is inconsistent at best with how it chooses to wield its all-powerful app approval. Consider its own policies against pornography in apps. If these policies are real, then why is Twitter allowed? We need the approval process to be guided by an entity outside of those who profit from it.

Layer Two: Build Easier-to-use Parental Controls on iOS and Android.

In September 2018, Apple released Screen Time as a replacement for its Restrictions. It was heralded as a remedy for weary parents who wanted better control over how their kids use iOS devices. But it’s complex. Our own PYE blog post explaining Screen Time required 33 screenshots and 37 steps to show parents how to set it up.

Related Post: Setting Up iOS Screen Time Parental Controls

Separately, there’s Google’s Family Link. It’s decent for Android devices, but it’s a mess on Chromebooks, and again, it doesn’t seem to consider what real parents, with real kids doing real homework, need in order to protect and guide their kids.

We’re confident that both Apple and Google could quickly develop and release a very small list of features that would dramatically help families, including:

  • One-click device and app locking from a parent device, similar to OurPact. Interestingly, Apple has recently informed OurPact that they can no longer use MDM in their parental control app. Although Apple points to a violation of a long-standing App Store rule, others find the timing suspiciously close to the release of iOS 12. An ideal solution would include really simple lock-down settings for bedtime and school time.
  • More control over texting, including iMessage. No version of parental controls on iOS devices has ever given parents the control they desire over iMessages. Parents want the simple ability to prevent messages from being deleted. Additionally, it’s too easy for other media, including YouTube videos, GIFs, and other content, to be attached to iMessages. Parents want the ability to remove all App Store and app content from the iMessages their kids are sending.
  • Easy linking between parent and child devices. No more child accounts that are treated like adults when they turn 13, no more confusing family profiles, Family Sharing, family groups, etc. Let parents bring up a code on a child device (iPhone, Android, Chromebook), type that code into their own device, and in seconds be able to exert control over that child’s device. This requires around 16 steps in iOS 12. I’ve already mentioned Family Link and Chromebooks above. There has to be an easier way (a sketch of this pairing flow appears at the end of this section).
  • One-click porn blocking. This can be achieved with clean DNS. There are amazing clean DNS providers like CleanBrowsing that excel here. Pairing up their service as a default, one-button option on iOS and Android devices would be so helpful in protecting kids (see the sketch just after this list).
  • Family-friendly default settings that don’t have to be configured by hand. When first enabling Screen Time, the default Content Restrictions allow “Explicit” music and books, and Web Content is set to “Unrestricted Access.” If someone enables the Screen Time feature, the default settings should err toward protecting kids, not the other way around. This could be further simplified through a feature whereby a parent enters a child’s age into a new device and it locks in a set of family-friendly settings automatically.
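To make the clean DNS idea concrete, here’s a minimal sketch in Python. It assumes the dnspython package and CleanBrowsing’s published Family Filter resolvers (185.228.168.168 and 185.228.169.168 at the time of writing; verify before relying on them). How a blocked domain is answered (NXDOMAIN versus a block-page address) is the provider’s choice, so treat this as an illustration of the mechanism, not a finished parental control.

```python
# Minimal sketch: resolving through a filtering ("clean") DNS provider.
# Assumes the dnspython package (pip install dnspython) and CleanBrowsing's
# Family Filter resolvers. Illustrative only.
import dns.resolver

# CleanBrowsing Family Filter resolvers, per their public documentation.
CLEAN_DNS = ["185.228.168.168", "185.228.169.168"]

def resolve_through_filter(domain: str) -> list[str]:
    """Resolve a domain via the filtering resolvers instead of the system DNS."""
    resolver = dns.resolver.Resolver(configure=False)  # ignore the OS resolver config
    resolver.nameservers = CLEAN_DNS
    try:
        answer = resolver.resolve(domain, "A")
        return [record.address for record in answer]
    except dns.resolver.NXDOMAIN:
        # Many filtering resolvers answer blocked domains with NXDOMAIN.
        return []

if __name__ == "__main__":
    # The "one-button" OS feature proposed above would simply point the whole
    # device at these resolvers, so every app inherits the filtering.
    print(resolve_through_filter("example.com"))
```

The appeal of filtering at the DNS layer is that it covers every app on the device at once, which is why a single OS-level toggle would be enough.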

Apple, please do better. Google, please do better. We need you both to start thinking like busy parents who sincerely want to help their kids use your devices well.
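And since the pairing flow described in the list above is the kind of thing that could be prototyped in an afternoon, here’s a minimal sketch of it in Python. Everything here (the six-character code, the five-minute expiry, the in-memory store) is an assumption for illustration; the real thing would live inside Apple’s or Google’s account infrastructure.

```python
# Illustrative sketch of the code-based parent/child pairing flow described above.
# All names and parameters here are assumptions, not any platform's actual API.
import secrets
import string
import time

PAIRING_TTL_SECONDS = 300  # assumed: codes expire after five minutes
_pending: dict[str, tuple[str, float]] = {}  # code -> (child_device_id, created_at)

def start_pairing(child_device_id: str) -> str:
    """Child device: generate a short, single-use code to show on screen."""
    alphabet = string.ascii_uppercase + string.digits
    code = "".join(secrets.choice(alphabet) for _ in range(6))
    _pending[code] = (child_device_id, time.time())
    return code

def complete_pairing(code: str) -> str | None:
    """Parent device: redeem the code; returns the child device id on success."""
    entry = _pending.pop(code, None)  # single use: the code is consumed either way
    if entry is None:
        return None
    child_device_id, created_at = entry
    if time.time() - created_at > PAIRING_TTL_SECONDS:
        return None  # expired
    # A real system would now record the parent-child link and grant the
    # parent device control over the child device's settings.
    return child_device_id

if __name__ == "__main__":
    code = start_pairing("childs-iphone")
    print("Show this code on the child device:", code)
    print("Parent redeems it:", complete_pairing(code))
```

That’s the whole flow: show a code, type a code, done. Sixteen steps becomes two.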

Layer Three: Provide Some Parental Controls in Major Apps Used by a Majority of Teens.

It’s time for organizations that create apps (YouTube, Instagram, Snapchat) and devices (Apple and Google) used by a majority of teens to create parental controls as if their own precious sons and daughters were using them.

Yes, the golden rule for technology.

A recent poll in the UK found that “nine out of ten parents want social media companies to be subject to a legal duty of care to keep children safe on their platforms.”

At a minimum, we need organizations like Instagram to adhere to their own Community Guidelines. But what hope do we have when their reporting process is hopelessly broken?

Related post: Instagram has a Porn Problem

We’ve come up with a list of parental control ideas, ranging from simple to complex, that social media organizations like Snapchat or Instagram could develop in a week and implement in a month.

  • Raise the difficulty of creating an account. On Instagram, you don’t even need a working email address or any age verification to create an account. Snapchat has a birthday entry, which is a start, but entering a birthday that marks you as a minor only matters if partners in Discover decide to age-gate their content.
  • Use machine learning to identify and block porn if the user is a minor. The technology exists. You have it! Even the Monkey app has started using AI to identify and block inappropriate content.
  • Prevent obviously non-compliant hashtags from being used. This one is for Instagram. Their hashtags are a complete dumpster fire. Some might claim free speech violations here. Sure. But when would using the hashtag #pornporn, #hotboobs, or #twerkthatass ever be consistent with Instagram’s stated Community Guidelines? Either block certain hashtags or change the Guidelines.
  • Hire more humans to review and remove content that violates Instagram’s Community Guidelines. It just feels like there aren’t any humans on the other side of the line when I’m submitting a complaint about a post or #hashtag in Instagram. With only 500 employees and over one billion monthly users, each of your employees is taking care of 2 million people. It’s just not possible. And the signs that an account is posting filth are obvious: if the username is “sglknoiwer1092k” (actual username), has 2 posts, but has 500 followers, then obviously something isn’t right. Those three attributes just described a large number of pornographic accounts that Instagram could do something about today (a simple version of this check is sketched after this list).
  • Create a safe mode for parents who want to teach their kids how to use social media. Please give parents something more than private accounts and blocking cruel words. A safe mode might remove the ability to search hashtags and the one-click integration with IGTV. Maybe safe mode removes the ability to “clear search history” in Instagram. Maybe a safe mode eliminates the ability to search for premium accounts on Snapchat. Maybe a safe mode eliminates the Discover section. Please give parents something more.
  • Open up just a little more of your APIs to noble companies. BARK comes to mind. We love their mission of allowing kids to use platforms like Instagram, Snapchat, and YouTube, while also alerting parents to anything potentially harmful. They’re your ally. They want kids to use your platforms. Covenant Eyes comes to mind as they strive to help families avoid the porn trap, which is ensnaring too many kids. Please treat them like partners, not foes.
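To make the “obvious signs” point concrete, here’s a minimal sketch of that three-attribute check. The thresholds and the crude gibberish test are my illustrative assumptions, not Instagram’s actual logic; a real system would tune them against labeled data and route flagged accounts to human reviewers.

```python
# Illustrative sketch of the rule-based account flagging described above.
# Thresholds and the gibberish heuristic are assumptions for demonstration only.
from dataclasses import dataclass

@dataclass
class Account:
    username: str
    post_count: int
    follower_count: int

def looks_like_gibberish(username: str) -> bool:
    """Crude stand-in for a real gibberish detector: long, vowel-poor strings
    (like "sglknoiwer1092k") score as suspicious."""
    letters = [c for c in username.lower() if c.isalpha()]
    if len(username) < 10 or not letters:
        return False
    vowel_ratio = sum(c in "aeiou" for c in letters) / len(letters)
    return vowel_ratio < 0.3

def should_review(account: Account) -> bool:
    """Flag the exact pattern called out above: a gibberish username with
    almost no posts but a large follower count."""
    return (
        looks_like_gibberish(account.username)
        and account.post_count <= 3
        and account.follower_count >= 500
    )

if __name__ == "__main__":
    suspect = Account("sglknoiwer1092k", post_count=2, follower_count=500)
    print(should_review(suspect))  # True: queue for human review
```

Even a heuristic this crude would hand reviewers a prioritized queue instead of an undifferentiated flood of user reports.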

As I’ve said in other posts, I’m not out to censor what adults want to do. I just want major technology organizations, which know that millions of children use their apps, to build higher walls around adult content and embrace the “duty of care” idea that is gaining steam in the UK.

Closing Thoughts from a Dad Who Loves Technology.

This post is somewhat of a dissertation based on things that I’ve observed over the past four years. I’ll be the first to admit that more parents need to step up and do more to protect and guide their kids. This isn’t all on Apple. Trust me – I’m pouring my most productive professional years into helping as many families as possible figure this out. But, I need help.

I believe that, done properly, the three layers above would give more parents the information they need to make informed decisions about the appropriateness of the digital places where their kids spend time, and would create safer digital environments.

To Apple, Google, Snapchat, Instagram, and all of your friends, you have immense power. And great responsibility. The next move is yours.

Sincerely,

Chris McKenna

Founder, Protect Young Eyes


*There are affiliate links throughout this post because we’ve tested and trust a small list of parental control solutions. Our work saves you time! If you decide you agree with us and purchase through one of these links, we may earn a small commission, which doesn’t change your price. Enjoy!
