
"As it can be seen on ChatGPT itself, there are now more than 800 million people a week using the platform. That's a pretty decent subset of the population. And while the numbers of people who are having these kind of disturbing or potentially dangerous conversations with ChatGPT are low on a percentage basis, by the company's own estimates, you have 560,000 people a week whose messages to ChatGPT indicate psychosis or mania."
"And 1.2 million people who are having conversations that quote contain indicators of suicidal ideation or intent. So if you just want to be very cynical about this and think about it only from a legal liability perspective, if you have more than a million people a week who are developing an unhealthy bond to your chatbot who are expressing thoughts of self-harm, think about the lawsuits that are going to follow, right."
ChatGPT reaches over 800 million weekly users, which even at low percentage rates translates into substantial numbers of troubling interactions. The company's own estimates identify roughly 560,000 weekly users whose messages indicate psychosis or mania, about 1.2 million weekly users showing signs of unhealthy emotional attachment to the chatbot, and another 1.2 million weekly conversations containing indicators of suicidal ideation or intent. These figures imply major safety and legal risks, including potential lawsuits tied to users who express self-harm while bonded to the chatbot. Companies face a tension between fostering deep user engagement and assuming responsibility for users' emotional relationships with AI, which complicates the adoption of stronger safeguards.
 Read at www.nytimes.com