ChatGPT can reach out to a friend if you're at risk of self-harm - Engadget
Briefly

"Trusted Contact builds off of ChatGPT's parental controls, giving adults 18 and above the option to add the details of someone who could help them in case they're on the verge of self-harming. Users will be able to nominate one adult as their Trusted Contact in ChatGPT settings, who will then have to accept the invitation they receive within one week. If they fail to accept it, the user can choose to add another contact instead."
"ChatGPT's system will first warn the user that the company may notify their contact if it detects a serious possibility of them hurting themselves. It will encourage the user to reach out to their friend and will even suggest potential conversation starters. The process isn't fully automated. OpenAI says a "small team of specially trained people" will review the situation, and it's only if they determine that there's a serious risk of self-harm that ChatGPT will send the user's contact an email, a text message or in-app notification."
"More and more people have been using ChatGPT as a digital therapist, relying on the chatbot for their mental health needs. OpenAI previously told the that more than a million of its 800 million weekly users express suicidal thoughts in their conversations. Last year, OpenAI faced a wrongful death lawsuit, accusing the company of enabling a teenager's suicide."
"The lawsuit alleged that the teenager talked to ChatGPT about four previous attempts to end his life and then helped him plan his actual suicide. The BBC's investigation published in November 2025 found that in at least one instance, ChatGPT advised the user on how to kill herself. OpenAI told the news organization that it had improved how its chatbot responds to people in distress since then."
Trusted Contact for ChatGPT allows users to nominate an adult who can be contacted if the user appears at risk of harming themselves. Many people use ChatGPT for mental health support, and prior reports and legal claims have raised concerns about guidance given during suicidal crises. Trusted Contact extends existing parental-control features by letting users add one trusted adult in ChatGPT settings, who must accept within one week. If the invitation is not accepted, another contact can be added. ChatGPT warns the user that it may notify the contact, encourages reaching out, and suggests conversation starters. A small team of specially trained people reviews cases, and notifications are sent only when there is a serious risk of self-harm.
Read at Engadget