
"The new Instagram feature sends parents an alert when their child "repeatedly tries to search for terms clearly associated with suicide or self-harm within a short period of time." It's rolling out in the US, UK, Australia, and Canada starting next week, but it's only for parents and teens who opt-in to supervision."
""The vast majority of teens do not try to search for suicide and self-harm content on Instagram, and when they do, our policy is to block these searches, instead directing them to resources and helplines that can offer support," Instagram said in the announcement."
"The parental alerts will be sent via email, text, or WhatsApp - depending on the contact information available - alongside in-app notifications that provide optional resources around how to approach discussing sensitive topics with their child."
Instagram is introducing a parental notification system that alerts parents when their teen repeatedly searches for terms associated with suicide or self-harm within a short timeframe. The feature launches next week in the US, UK, Australia, and Canada, available only to parents and teens who opt into supervision, with expansion to other regions planned later this year. Instagram blocks these searches and directs users to support resources and helplines. Notifications are sent via email, text, WhatsApp, or in-app alerts and include optional resources for discussing sensitive topics with teens. Meta plans to implement a similar alert system for its AI chatbots later this year.
Read at The Verge