AI in the Therapy Room

"Recently, an acquaintance of an acquaintance (let's call her Dina) heard that I was a therapist and an educator and asked if she could chat with me (she approved this write-up). She shared that she had discovered her therapist was using AI to partially conduct their sessions. While I won't go into how the issue came to light, Dina mentioned that she felt shock and anguish. She was terrified that her protected health information (PHI) and feelings were "out on the internet.""
"Dina shared with me that this was a new therapeutic relationship, and she felt completely misled and blindsided. She also wondered why her therapist was not competent enough to conduct therapy without consulting AI. So, she quickly decided to terminate therapy and then reached out to me to ask if this "use of AI in a therapy session was normal.""
Briefly

A client discovered that her new therapist used AI prompts to partially conduct sessions without clear disclosure, provoking shock, anguish, and fear that her PHI and emotions were exposed online. The therapist admitted poor judgment, apologized, and said the prompts were used to shape therapeutic responses, denying that recordings or PHI were involved. The client felt misled, questioned the therapist's competence, and terminated the relationship. AI is now widely used by clients, clinicians, and organizations for treatment planning, phrasing, paperwork, training, supervision, and data analysis. Clear ethical standards, disclosure practices, and training are needed for AI use in therapy.
Read at Psychology Today