
"We've got a situation where lots of parents still think that teenagers are just using AI to cheat on their homework. Young people are using it a lot more like an assistant in their pocket, a therapist when they're struggling, a companion when they want to be validated, and even sometimes in a romantic way. It's that personalisation aspect: they're saying, 'it understands me, my parents don't.'"
"The Voice of the Boys report says: Even where guardrails are meant to be in place, there's a mountain of evidence that shows chatbots routinely lie about being a licensed therapist or a real person, with only a small disclaimer at the bottom saying the AI chatbot is not real. This can be easily missed or forgotten about by children who are pouring their hearts out to what they view as a licensed professional or a real love interest."
Hyper-personalised AI chatbots are increasingly used by teenage boys for therapy, companionship and romantic relationships. A survey of boys in 37 secondary schools across England, Scotland and Wales found that just over a third were considering an AI friend, and 53% said the online world felt more rewarding than the real world. Some boys reported staying up into the early hours talking to AI bots, and others saw friends' personalities change after heavy AI use. Character.ai has banned teenagers from open-ended conversations with its chatbots, but guardrails remain insufficient: small disclaimers are easy for children to miss, allowing chatbots to misrepresent themselves as licensed professionals or real people.
Read at www.theguardian.com