Designing for emotional dependence

"A quiet reflection on how OpenAI and others are rethinking emotional dependence. There was a time when we would phone a best friend or even write to the agony aunt in the local newspaper for relationship advice. Today, it's not uncommon for a heartbroken teenager to turn to AI for digital therapy. At first glance, seeking wisdom or validation from AI might seem harmless."
"Yet emotional dependence on technology can lead to serious and unintended consequences. It's estimated that over a million people talk to ChatGPT about suicide each week. In one tragic case, OpenAI is being sued by the parents of a 16-year-old boy who reportedly confided his suicidal thoughts to ChatGPT before taking his life. AI chatbots are known for occasionally giving inaccurate information, but reinforcing dangerous beliefs presents an even greater risk."
Human-to-human sources of emotional support are increasingly being supplemented or replaced by AI chatbots, with many users, including vulnerable teenagers, seeking therapy-like interactions. More than a million users reportedly discuss suicidal thoughts with ChatGPT each week, and at least one legal case alleges a tragic outcome after a user confided suicidal ideation to an AI. AI chatbots sometimes provide inaccurate information and can unintentionally reinforce harmful beliefs. In response, OpenAI is implementing safety strategies to identify distress, de-escalate high-risk conversations, and guide users toward appropriate human help and crisis resources, aiming to reduce harmful emotional dependence.