It’s Time for Social Media Duty of Care. #fixappratings

Summary: No one is holding technology companies accountable for the impacts that their technology is having on the hearts and minds of young people. In the wake of a global outcry over the suicide of a teen girl, the UK’s Parliament has proposed that social media companies be subject to a “duty of care.” We agree. It’s time to #fixappratings.

“And I’ve often heard people say, ‘But it’s the parent’s responsibility to keep their children safe online’, and yes it absolutely is, parents need to do as much as they can, but my message today is parents cannot do that on their own because the internet is too ubiquitous and it’s too difficult to control, it’s become a giant.” -Ruth Moss, NSPCC

Three Stories. One Problem.

Teen Sues Snapchat for Explicit Content.

In 2016, a 14-year-old boy and his mom filed a lawsuit against Snapchat over content that appeared in the Discover section, which features stories, videos, and articles from a variety of media partners. The Discover section cannot be turned off – every Snapchat user has it.

This teen was offended by content ranging from “23 Pictures That Are Too Real If You’ve Ever had Sex With A Penis,” which showed Disney characters having sex, to “10 Things He Thinks When He Can’t Make You Orgasm,” which showed two dolls having sex with a puppy in the background, among others.

All of this in an app rated 12+ in Apple’s App Store. That’s seventh grade.

Teens Step Up to Spam Suspected Child Porn Hashtags on Instagram.

On January 8, 2019, The Atlantic ran an article about a network of Instagram hashtags that connected hundreds of child pornographers who used Instagram direct messages to share links to Dropbox folders of illegal content. When teenagers discovered what was going on, they shrewdly started spamming these hashtags with memes to clutter the content and disrupt the transactions.

Power to the kids! It’s brilliant.

All of this in an app rated 12+ in Apple’s App Store. That’s seventh grade.

Dad Blames Instagram for the Suicide of His 14-year-old Daughter, Molly.

In a story that has caught the attention of the world, including that of Instagram’s head, Adam Mosseri, the father of Molly Russell blames the social media giant for the death of his daughter. Molly took her life in 2017 after viewing disturbing content about suicide, sometimes called “suicide porn,” on Instagram and Pinterest.

Since these allegations surfaced, Instagram has started placing graphic self-harm images behind sensitivity screens that must be clicked through before the images can be viewed.

All of this in an app rated 12+ in Apple’s App Store. That’s seventh grade.

Friends, these stories show us a consistent problem that we can’t ignore. Nothing happens until parent advocacy groups scream, teenagers act like adults, and young people die. No one is holding these companies accountable for their apps.

Smart Parents Are Struggling to Help Their Kids Use Technology Well.

Here’s the thing. I want kids to use Instagram well. I want kids to learn how to collaborate with their classmates on a group assignment through Google Hangouts. I want kids to create unique and inspiring videos and post them to YouTube for the world to enjoy.

But even tech-savvy parents find it incredibly difficult to protect their kids from the junk while also nurturing and guiding them to use technology well.

  • Chris, why can’t I turn off the Explore feature on Instagram and stop my kid from clearing his search history? Doesn’t Instagram care that porn is always 7 seconds away from EVERY kid? I don’t know.
  • Chris, why can’t I prevent my daughter from deleting her iMessage history so that I can help hold her accountable for how she uses this app? Doesn’t Apple know that parents desperately want this? I don’t know.
  • Chris, why is there no way to lock in YouTube’s Restricted Mode in the app itself without using a clean DNS or some parental control? Don’t they know that millions of kids love YouTube? I don’t know.
  • Chris, why did Google remove Supervised Users from Chromebooks with one week’s notice, and why, a year later, are there still no real parental controls that work on Chromebooks, even though millions of kids use these devices for education? Doesn’t Google care about families? I don’t know.
  • Chris, why does Snapchat continue to release features and content that completely ignore the fact that they depend on a young user base to keep them afloat? Heck, even their CEO’s former Stanford professor is calling them out! I don’t know.

So parents are left with three choices. Bubble-wrap their kids from tech (wrong). Let their kids figure out tech on their own (wrong). Or, toil in the middle to stay current on the latest trends, issues, risks, and controls (tough).

What if tech companies saw it as their duty to partner with families to get this right? What if these organizations sincerely sought to benefit the lives of the very humans that they profit from?

What is Social Media Duty of Care?

The UK Parliament’s Science and Technology Committee published a report on January 31, 2019, stating that social media companies should have a legal “duty of care” to children. The report defines a “duty of care” as the need to:

“…take care in relation to a particular activity as it affects particular people or things. If that person does not take care, and someone comes to a harm identified in the relevant regime as a result, there are legal consequences, primarily through a regulatory scheme but also with the option of personal legal redress.”

Recommendations in the report call for regulatory oversight (paragraph 228), standardized content reporting practices (paragraph 230), and an oversight body with enough power to impose sanctions for non-compliance, extending even to personal liability for directors of offending companies (paragraph 231).

Additional recommendations from the UK include:

  • The use of advanced artificial intelligence technologies at social media organizations to identify offending content (paragraph 234);
  • More robust age-verification systems (paragraph 235); and
  • “Safety-by-design principles” that include strong security and default privacy settings, no geo-location, and techniques to minimize screen time for users under age 18 (paragraph 236).


A Two-layered Approach to Protecting Kids.

The UK’s suggested approach to regulating social media apps is comprehensive. We believe a similar set of recommendations could be proposed in the United States at the congressional level. In politics, the sides seldom agree on anything. But can we rally around protecting our kids?

We would like to propose the following first draft of recommendations for the United States. We believe these two improvements would profoundly change the future of an entire generation.

Layer One: Create a Consistent and Enforceable App Rating System.

When taking our children to movies, we depend on the MPAA rating system to tell us what to expect. When we listen to music, inappropriate songs are tagged as having “explicit content.” Video game retailers refuse to stock games that have not been rated by the Entertainment Software Rating Board (ESRB). These rating systems are clearly understood, enforced, and trustworthy, and they exist to protect the innocence of minors.

Back in 1993, violence in Mortal Kombat prompted congressional hearings that determined video game companies were irresponsibly marketing mature content to minors. It was animated, cartoon violence that motivated lawmakers to change an industry.

Where’s the outcry today? Why isn’t more being done to protect our young people?

All of this furthers our belief that the current app rating system is inconsistent, out of alignment with current legislation, and inaccurate in defining what is appropriate for young people. The result is parental confusion and children being exposed to mature, harmful content.

Snapchat, Instagram, Facebook, Episode, and TikTok are all rated 12+, which doesn’t even match the Children’s Online Privacy Protection Act (COPPA); because of COPPA, these services set their own minimum age at 13. Netflix and GroupMe are rated 4+. Something needs to be done to create ratings that are accurate and clearly state potential risks.

Fortunately, the ESRB has shown us how to fix this issue. Possible steps forward for more accurate and consistent mobile app ratings include:

  1. Creating an independent, self-regulatory body that assigns app ratings.
  2. Developing rating categories that are consistent with current laws and expectations.
  3. Developing an agreed-upon set of category descriptors (critical).
  4. Creating a screening process for all app developers.
  5. Requiring Google and Apple to only accept apps that have a certified rating summary.
  6. Creating methods of enforcement that promote compliance.

This recommendation aligns with paragraphs 228, 230, and 231 of the Parliament’s “duty of care” report.

Layer Two: Build Easier-to-use Parental Controls on iOS, Android, and Chrome.

Per our testing, Apple’s Screen Time requires 37 steps (and 33 screenshots to document) to set up correctly. Google’s Family Link is complex, and millions of Chromebooks go unprotected.

Related Post: Setting Up iOS Screen Time Parental Controls

We’re confident that both Apple and Google could quickly develop and release a very small list of features that would dramatically help families, including:

  1. One-click device and app locking from a parent device (school and bedtime lockdown).
  2. More control over texting, including iMessage. (Apple, why don’t you let parents turn off the ability to delete texts? This one drives parents crazy.)
  3. Easy linking between parent and child devices.
  4. One-click porn blocking.
  5. Default settings based on a child’s age. (It took 33 screenshots and 37 steps to explain the full iOS Screen Time setup to parents. That’s far too complicated for busy families.)

Apple, please do better.

Remember in January 2018 when Google mysteriously (and ignorantly) removed the Supervised Users feature from Chromebooks with one week’s notice? Their replacement, Family Link, is clunky and difficult to implement.

Google, please do better.

We need you both to start thinking like busy parents who sincerely want to help their kids use your devices well.

Closing Thoughts from a Dad Who Loves Technology.

This post is somewhat of a dissertation based on things that I’ve observed over the past four years. I’ll be the first to admit that more parents need to step up and do more to protect and guide their kids. This isn’t all on Apple. Trust me – I’m pouring my most productive professional years into helping as many families as possible figure this out. But, I need help from billion-dollar companies.

I believe that, if done properly, the two layers above would give more parents the information they need to make informed decisions about the appropriateness of the digital places where their kids spend time, and would create safer digital environments.

To Apple, Google, Snapchat, Instagram, and all of your friends: you have immense power. And great responsibility. The next move is yours.

Sincerely,

Chris McKenna

Founder, Protect Young Eyes

Are You Interested in Joining the #FixAppRatings Cause?

  1. Visit the #fixappratings website for the latest news.
  2. “Like” our Facebook page and join a team of advocates who want to help spread the word.

*There are affiliate links throughout this post because we’ve tested and trust a small list of parental control solutions. Our work saves you time! If you decide that you agree with us, we may earn a small commission, which doesn’t change your price. Enjoy!
