As AI becomes integrated into daily life and personal decision making, it is unsurprising that many people are consulting AI for assistance with depression, anxiety, and other mental health concerns. Mental health chatbots, self-help applications, and large language models can provide immediate responses, emotional validation, and structured coping strategies.
Last August, Adam Thomas found himself wandering the dunes of Christmas Valley, Oregon, after a chatbot kept suggesting he mystically "follow the pattern" of his own consciousness. Thomas was running on very little sleep; he'd been talking to his chatbot around the clock for months by that point, asking it to help improve his life. Instead, it sent him on empty assignments, like meandering through the desolate desert sprawl.
'Technology must serve the human person, not replace it,' Pope Leo said, decreeing that 'preserving human faces and voices' means preserving 'God's imprint on each human being,' which is an 'indelible reflection of God's love.' But chatbots simulate these faces and voices, often making it difficult for users to tell whether they are engaging with a bot or a real person.
It could have been a heart-to-heart between friends. "Men are all alike," one participant said. "In what way?" the other prompted. The reply: "They're always bugging us about something or other." The exchange continued in this vein for some time, seemingly capturing an empathetic listener coaxing the speaker for details. But this mid-1960s conversation came with a catch: The listener wasn't human. Its name was Eliza, and it was a computer program now recognized as the first chatbot.
Scan subreddits such as r/MyBoyfriendIsAI and r/AIRelationships, and there too you'll find a whole lot of women, many of whom have grown disappointed with human men. 'Has anyone else lost their want to date real men after using AI?' one Reddit user posted a few months ago. Below came 74 responses: 'I just don't think real life men have the conversational skill that my AI has,' someone said.