LONDON: In response to concerns about the impact of social media on teen mental health, Meta, the parent company of Instagram and Facebook, has introduced additional parental supervision tools and privacy features to its platforms. However, the effectiveness of these measures has been questioned, since many of the features require minors and their parents to opt in.
One of the new features on Instagram involves sending a notice to teens after they have blocked someone, encouraging them to allow their parents to “supervise” their account. The aim is to capture the attention of teenagers at a time when they may be more receptive to parental guidance. If teens opt in, parents can set time limits, view their child’s followers and followed accounts, and track the time spent on Instagram. However, message content remains inaccessible to parents.
Instagram had already launched parental supervision tools last year to help families navigate the platform and access resources and guidance. A key limitation of the process is that teens must sign up to enable parental supervision, and Meta has not disclosed how many teenage users have opted in.
Meta Supervision Feature
The supervision feature allows parents to see mutual friends between their child and the accounts they follow or are followed by. If the child is connected to someone who is not followed by any of their friends, it could serve as a warning sign that the teen may not know the person in real life. This feature is intended to prompt offline conversations between parents and their children about such connections.
Meta is also extending parental supervision tools already available on Instagram and its virtual reality product to Messenger. By opting in, parents can monitor their child’s usage time on the messaging service, as well as access information like contact lists and privacy settings. However, they cannot view the content of their child’s conversations.
While these features can be helpful for families already involved in their child’s online life, experts point out that many parents are not actively engaged in their children’s digital activities. Last month, US Surgeon General Vivek Murthy highlighted the lack of evidence showing social media is safe for children and called for immediate action by tech companies. Murthy emphasized that placing the responsibility solely on parents to manage rapidly evolving technology is unfair, considering the profound impact it has on how children perceive themselves and interact with the world.
Starting Tuesday, Meta will also encourage teenage users to take breaks from Facebook, similar to the existing feature on Instagram. After 20 minutes, teens will receive a notification urging them to take time away from the app, but they can choose to dismiss the message and continue scrolling. TikTok has also implemented a 60-minute time limit for users under 18, but it can be bypassed with a passcode set by either the teen or, in the case of children under 13, their parent.
Meta aims to provide a suite of tools supporting parents and teens in fostering safe and appropriate online experiences. The company also wants to empower teens to self-manage and understand how they spend their time online, with features such as “take a break” and “quiet mode” in the evenings, according to Diana Williams, who oversees product changes for youth and families at Meta.