Instagram to notify parents on teen suicide searches

CALIFORNIA (Kashmir English): Instagram said it would alert parents if their teenagers repeatedly search for terms related to suicide or self-harm within a short period.

The decision comes amid growing pressure on governments to follow Australia’s ban on social media use for under-16s.

In January, Britain said it was considering restrictions to protect children online, following Australia’s move in December. Greece, Spain and Slovenia have said in recent weeks that they are also looking at limiting social media access.

Instagram, owned by Meta Platforms Inc, said it would start alerting parents who are signed up to its optional supervision setting if their children try to access suicide or self-harm content.

“These alerts build on our existing work to help protect teens from potentially harmful content on Instagram,” the platform said in a statement. “We have strict policies against content that promotes or glorifies suicide or self-harm.”

Instagram’s existing policy

Instagram said its existing policy is to block such searches and redirect people to support resources, adding that it would begin the alerts from next week for those signed up in the US, Britain, Australia and Canada.

Governments across the globe are increasingly seeking to protect children from harm online, particularly after concerns over the AI chatbot Grok, which has generated non-consensual sexualised images.

Instagram’s “teen accounts” for under-16s require a parent’s permission to change settings, and parents can add an extra layer of monitoring with their teenager’s consent.
