Instagram will notify parents who have activated parental supervision on their teenager’s account if repeated searches related to suicide or self-harm are detected, the platform has announced.
The alerts will be sent by email or text message, depending on the communication method selected by parents during registration.
How the notification system works
Once parents are notified, the app will also provide a telephone number for emergency support, along with guidance, developed in cooperation with health professionals, on how to approach a conversation with their child.
The system will be rolled out next week in the United States, the United Kingdom, Australia and Canada, with plans to expand to additional countries at a later stage.
The Meta-owned platform acknowledged that notifications may occasionally be triggered without an actual risk being present. However, it stated that experts consider the measure an appropriate starting point for safeguarding young users.
Broader safety measures
The initiative forms part of a wider series of steps taken in recent years by social media platforms aimed at strengthening protections for minors.
At the same time, a trial is under way in Los Angeles involving Instagram and Google-owned YouTube. A young woman alleges that the platforms contributed to her depression, anxiety and body dysmorphic disorder.
The claimant, identified in court documents as Kaylee J.M., began using Instagram at the age of nine and YouTube at the age of six. Her legal representatives argue that the companies sought to increase profits by fostering dependency among children, despite being aware of potential risks to mental health.
The platforms deny the allegations and state that the evidence in the case does not substantiate the claims.
Source: AFP