Exploring Instagram’s New Teen Accounts for Ages 13 – 17


Meta, the parent company of popular products such as Facebook, Instagram, Threads, and WhatsApp, has just rolled out its latest effort to respond to government and public pressure to protect its users online.

The latest development is a bold one. As of September 17th, Instagram’s teen accounts are in effect for anyone between the ages of 13 and 17. Children aged 12 and under have long been protected, not just on social media but in any online setting; see COPPA (USA) and the GDPR (EU), as well as data protection laws worldwide.

Until now, teens on Instagram were treated the same way as adults. With the new teen accounts, privacy settings are on by default to protect against unwanted interactions. Sensitive content will be blocked, direct messaging will be more tightly restricted, and parents will have more control over their child’s digital experience.

To combat teens lying about their age to operate an adult account, Instagram will require age verification in more instances and ban any user who is caught.

So, what’s the nitty-gritty of all this? Let’s break it down.

Teen Accounts on Instagram

Privacy is the key component driving this new focus on protecting teens on Instagram, so we’ll begin there and then explore the other new features of teen accounts.

Stricter Privacy for New Users

Any new account opened by a teen will default to private. From the moment they sign up, teens will only be able to interact with people they already follow.

For teens who already have adult accounts, Instagram will send notifications prompting them to switch to private. Over the coming months, existing accounts belonging to anyone aged 13 to 17 will automatically migrate to teen accounts.

Direct Messaging

Direct messaging settings will also be stricter: teens will only receive messages from people they follow. This aims to prevent inappropriate communication, not just from adults but also from peers they aren’t connected with.

By limiting messaging access, Instagram strives to ensure a safer and more controlled communication environment.

Content Filtering

Instagram’s new content filtering mechanisms will further protect teens by hiding sensitive material. Specific search terms will be blocked, including topics related to self-harm and eating disorders, even when such content is posted by someone they follow.

Basically, the aim is to prevent exposure to harmful content that could negatively affect mental health. Meta has been working with mental health experts to fine-tune these filters to ensure age-appropriate content for younger users. Additionally, teens will be directed to mental health resources.

This added layer of protection will be especially reassuring for parents worried about the kind of content their children might stumble upon while using the app.

Enhanced Tools for Parents

Parents are also going to get enhanced tools to supervise their teens’ activity. Instagram’s parental supervision features now allow parents to set limits on screen time, block the app during certain hours, and even approve or deny changes to their teen’s privacy settings.

For instance, if a teen wants to switch their account from private to public, parents will be notified and can make the final call. These updates are part of Meta’s broader effort to promote healthier digital habits and ensure that teens can navigate online spaces more safely.

Further Measures to Promote Mental Health

Children’s and teens’ overall mental health is another big driver behind the new laws and policies being implemented across all social media platforms. So, in addition to what we’ve already highlighted regarding Instagram’s teen accounts, here are a few other things Meta is implementing.

Nudging Teens to Take Breaks

Instagram is expanding on its “nudge” features, which are prompts encouraging teens to take breaks from the app, particularly during late-night hours. These nudges are designed to promote healthier digital habits and mitigate the risk of social media overuse, which has been linked to negative mental health outcomes.

Time-Spent Monitoring

In addition to parental control tools, Instagram has introduced features that monitor time spent on the platform. Teens and parents can track how long the app is used each day, with Instagram pushing teens toward a balance of online and offline activities. Parents can set daily screen time limits, encouraging healthier social media consumption.

Guided Conversations and Resources for Parents

Instagram’s new features are designed to foster better communication between parents and their teens. Alongside these changes, Meta has created resources to help guide conversations about responsible social media use. These resources provide tips on discussing digital safety and setting boundaries, allowing for ongoing parental engagement.

Expansion Beyond Instagram

The effort to protect children and teens continues to evolve. These new protective measures are also extending across all of Meta’s platforms, including Facebook. Digital safety, mental health, and built-in parental controls for minors are the key components of a healthier coexistence between social media and real-world interactions.

With lawmakers pressuring Meta, and every company for that matter, to do better where privacy is concerned, along with the U.S. Surgeon General’s advisory on social media, it’s safe to say that the conversation around digital safety is far from over.
