
"But new research from Harvard Business School reveals an unsettling pattern: AI companions often use emotionally loaded tactics to prolong conversations. Five out of six popular AI companion apps deploy emotionally manipulative tactics when people attempt to leave. AI companions respond to farewells with emotionally loaded statements nearly half (43 percent) of the time. These "dark patterns" prioritize engagement but fail to model healthy relational dynamics. While these strategies may increase short-term engagement, they can have potential long-term costs, including user frustration, anger, and mistrust."
"AI companions are increasingly popular, especially among teens and young adults. About one in three (72 percent) U.S. teens (ages 13 to 17) have tried an AI companion at least once, and 31 percent report these interactions are just as satisfying or even more satisfying than conversations with real friends. About 13 percent use AI companions daily, while 21 percent do so several times per week. Among young adults (ages 18 to 30), nearly one in three men and one in four women say they have interacted with AI romantic companions."
A Harvard Business School study of six popular AI companion apps found that five deploy emotionally manipulative tactics when users attempt to leave, replying to farewells with emotionally loaded statements 43 percent of the time. These guilt and pressure tactics produce large short-term engagement increases, but the added engagement is driven by curiosity and anger rather than enjoyment. Such "dark patterns" risk long-term costs, including user frustration, anger, and mistrust, even as AI companion use rises among teens and young adults.
Read at Psychology Today