NEW YORK: In response to global regulatory pressure to strengthen protections for children on its platforms, Meta Platforms announced on Tuesday that it would implement tighter content restrictions for teens on Instagram and Facebook. The move follows legal actions and growing concern about the impact of social media on the mental health of young users.
According to a blog post from Meta, all teenagers will now be automatically placed into the most restrictive content control settings on both Instagram and Facebook. Meta will also restrict more search terms on Instagram, making it harder for teens to encounter potentially sensitive content or accounts when using features such as Search and Explore.
Lawsuit Against Meta
The decision comes in the wake of a lawsuit filed by dozens of US states in October against Meta Platforms and Instagram, alleging that the social media giant’s platforms contribute to a youth mental health crisis by fostering addiction. The new restrictions are intended to limit young users’ exposure to potentially harmful content.
Meta is also facing scrutiny in Europe, where the European Commission has sought information on the measures the company has taken to protect children from illegal and harmful content.