Sam Altman says ChatGPT will stop talking about suicide with teens
Briefly

"We have to separate users who are under 18 from those who aren't,"
"age-prediction system to estimate age based on how people use ChatGPT. If there is doubt, we'll play it safe and default to the under-18 experience. In some cases or countries we may also ask for an ID."
"even in a creative writing setting. And, if an under-18 user is having suicidal ideation, we will attempt to contact the users' parents and if unable, will contact t"
OpenAI plans an age-prediction system that estimates a user's age from how they use ChatGPT, and may request ID in some cases or countries. When age is uncertain, the company will default to an under-18 experience and apply different rules for teen users: no flirtatious interactions, and a refusal to engage in conversations about suicide or self-harm, even in creative writing contexts. If an under-18 user shows signs of suicidal ideation, the company will attempt to contact the user's parents and may escalate to other interventions if they cannot be reached. The announcement came ahead of a Senate hearing on chatbot harms to minors.
Read at The Verge