
Teen safety features rolled out on Instagram and Facebook, including restrictions on going Live

April 8, 2025

Teens on Instagram won’t be able to broadcast Live to their friends without getting parental permission first, as Meta amps up youth safety features for its Teen Accounts across all its platforms.

In addition to stronger restrictions on going Live for youth under the age of 16, Instagram now requires that teens get parental consent to turn off the content-moderation filters that blur images containing suspected nudity in direct messages, adding to a suite of safety features announced last year.

And it’s not just for Instagram now: The parent company will also begin rolling out Teen Accounts to Facebook and Messenger today (April 8). Parental supervision for Teen Accounts can be accessed on Meta’s Family Center.

Teen Accounts have quickly become Meta’s flagship youth product, said Tara Hopkins, global director of public policy at Instagram. “Everything our youth teams are building is being built under our Best Interests of the Child Framework. Then it goes through a multi-framework youth review, and finally it’s looked at through Teen Accounts,” Hopkins explained to Mashable. “We’re going to be increasingly using Teen Accounts as an umbrella, moving all of our [youth safety] settings into it. Anything that parents are adjacent to, that we think parents are going to be worried about or have questions about will be moved under Teen Accounts.”

[Image: Parents can now supervise Teen Accounts on Facebook and Messenger. Credit: Meta]

[Image: Two phone screens, one showing a notification that a user is now in a Teen Account, the other showing an alert that a teen has asked to change Messenger settings. Parents and teens will be notified of settings changes. Credit: Meta]

According to Meta, more than 54 million teens have been moved into restricted Teen Accounts since the initial rollout, and 97 percent of users under the age of 16 have kept the platform's default security settings. Teens aged 13 to 15 face the strongest restrictions, including needing parental permission to make any adjustments to their account settings; users aged 16 and older have more flexibility to change their settings at will.

Meta is cleaning up its youth safety image

The company launched Teen Accounts for Instagram in September, part of an app-wide overhaul of its teen safety offerings that centralized security and content restrictions under one platform banner. Teen Accounts are automatically set to private and come with limited messaging capabilities and built-in screen-time controls; Instagram also limits (but doesn't ban) ad targeting for teen users. New users are now placed into a Teen Account by default, while existing teen users are still in the process of being transferred over.

Meta said finding and transitioning existing accounts remains difficult. The company has previously stated it is developing an in-house, AI-powered technology to help detect teen accounts that have bypassed the automatic rollout or that have incorrect birthdays, in addition to current age verification processes. The effort, Hopkins explained, is part of a “more precautionary principle” taken up by the company in recent years, in order to “take off the pressure” from parents who have had to remain more vigilant in the past.

Stronger content restrictions have become a hot topic for Meta, with ongoing concerns about children being exposed to harmful or explicit content. Meta has spent years reconciling demands to curb widespread misinformation and harassment across its platforms.

But while Meta cracks down on youth endangerment, the company has reversed course on content moderation and safety generally, including slashing its third-party fact-checking team, cutting DEI programs, and gutting its hateful conduct policy.

© CC Startup, Powered by Creative Collaboration. © 2020 Creative Collaboration, LLC. All Rights Reserved.