Meta expands teen accounts to Facebook and Messenger to boost online safety
Photo by Dima Solomin / Unsplash


Meta’s move addresses growing concerns over teen privacy and online safety by providing stricter default settings and parental oversight

by Ogbonda Chivumnovu

It’s no secret teens spend a lot of time online. A 2022 Pew Research Center study found that 52% of teens aged 15 to 17 and 36% of those aged 13 to 14 say they’re online “almost constantly.” With that kind of screen time, concerns over who they’re interacting with—and what content they’re seeing—have grown louder among parents and regulators alike.

In response, Meta is expanding its Teen Accounts to Facebook and Messenger. Already active on Instagram, Teen Accounts are a version of Meta’s platforms designed specifically for users under 16. These accounts automatically apply stricter privacy and safety settings, limiting visibility, communication, and exposure to potentially harmful content.


The rollout begins in the U.S., U.K., Canada, and Australia, with plans to reach more countries over time.

This update comes as Meta faces increasing scrutiny over the impact of its platforms on teen mental health. Alongside TikTok and YouTube, the company is facing lawsuits from dozens of U.S. states alleging that social media harms young users and fuels addictive behavior.

With Teen Accounts, users under 16 will only receive messages from people they already follow or have previously messaged. Stories, comments, tags, and mentions are restricted to friends.

Additional controls now require parental permission before teens can go live or disable blurred-image filters in DMs. The apps also encourage healthier usage patterns, nudging teens to log off after an hour and automatically enabling “Quiet mode” overnight.

According to Meta, Instagram has already moved 54 million teens into Teen Accounts globally. Of those aged 13 to 15, 97% have kept the protections turned on. A Meta-commissioned Ipsos study found that 94% of surveyed parents believe Teen Accounts help manage their children’s experiences online, while 85% feel they make digital parenting easier.

Meanwhile, other platforms such as TikTok, YouTube, and Snapchat have introduced their own safeguards in response to growing scrutiny, through parental tools like Family Center, dedicated apps like YouTube Kids, and account restrictions for children under 16.

As Meta extends these protections to more of its platforms, the broader question remains: How effectively—and consistently—will these safety measures be applied across different regions and user groups?
