Khaberni - Instagram has announced that it will begin sending alerts to parents if teenagers repeatedly search for content related to suicide or self-harm over a short period.
The new feature will be available to parents enrolled in the app's parental supervision program and, according to the company, aims to let them intervene early and support their children.
When is the alert sent?
The Meta-owned platform says it already blocks direct searches for content that encourages suicide or self-harm, according to a TechCrunch report viewed by Al Arabiya Business.
However, the new system will be triggered by repeated searches for:
- Words that encourage suicide or self-harm.
- Phrases that may indicate the teenager is at risk.
- General terms such as "suicide" or "self-harm".
The alert will reach parents via email, text message, or WhatsApp, alongside an in-app notification, and will include guidance resources to help them open a supportive conversation with their children.
A step amid legal pressure
The announcement comes at a time when Meta and several other major tech companies face lawsuits accusing them of failing to protect teenagers from the mental harm associated with social media use.
During hearings this week in federal court in Northern California, Instagram head Adam Mosseri was questioned about delays in launching basic safety features, including a nudity filter for teenagers' private messages.
Testimony in a separate case before the Los Angeles County Superior Court revealed that an internal Meta study found parental control tools had no significant effect on reducing compulsive app use among children, especially those facing serious life pressures.
A balance between protection and privacy
Instagram emphasized that it aims to avoid sending excessive alerts, as too many could reduce their effectiveness.
The company explained that, in consultation with experts in suicide prevention and self-harm, it set a threshold requiring repeated searches within a short period before a notification is sent.
The feature will begin rolling out to users in the United States, the United Kingdom, Australia, and Canada next week, and will expand to other regions later this year.
In a future step, the platform plans to extend the alerts to cases where a teenager engages with the app's artificial intelligence tools on topics related to suicide or self-harm.
With this update, Instagram is trying to strengthen its proactive protection tools at a time of growing calls to hold digital platforms accountable for their impact on teenagers' mental health.