Turns out AI is really bad at picking up on social cues

Ernest Hemingway's iceberg theory of storytelling offers a fitting analogy for a significant limitation of artificial intelligence (AI) in interpreting human behavior. Although AI can analyze static images effectively, it struggles to grasp the complex nonverbal social cues that carry deeper meaning in dynamic interactions. A Johns Hopkins University study led by Leyla Isik shows that while humans intuitively perceive the emotional and contextual layers beneath actions and words, AI systems have not yet developed this capability, which limits their ability to assess social situations in real time.
According to the new Johns Hopkins study, AI is out of its depth when it comes to interpreting nonverbal social cues, which are often pivotal in human interaction.
Leyla Isik, the study's lead author, notes that while humans intuitively grasp these subtle signals, current AI systems consistently fall short at interpreting them.
Read at Fast Company