As a law professor who studies the impact of new technologies on individuals, relationships and social institutions, I can understand the appeal of a manufactured spouse. They can be kinder, prettier, more comforting and smarter than the human version. They are available whenever you want them and never fight for control of the remote. During COVID-19, we talked to loved ones on a screen, so the switch to a chatbot is not so dramatic.
There is much anxiety these days about the dangers of human-AI relationships. Reports of suicide and self-harm attributable to interactions with chatbots have understandably made headlines. The phrase "AI psychosis" has been used to describe the plight of people experiencing delusions, paranoia or dissociation after talking to large language models (LLMs). Our collective anxiety has been compounded by studies showing that young people are increasingly embracing the idea of AI relationships; half of teens chat with an AI companion at least a few times a month.
You may have run across the new "slur" making the rounds online, and in middle school lunchrooms: clanker. Borrowed from Star Wars (where battle droids get called "clankers"), the word is supposed to be a knockout insult to robots and AI. That would make some sense if machines could actually take offense at anything. Since they can't, clanker is an insult that punches at nothing, perhaps the least effective slur in history.
"If my goal was to understand people who fall in love with AI boyfriends and girlfriends, why not rent a vacation house and gather a group of human-AI couples together for a romantic getaway?"