
"The chatbot company Character.AI will ban users 18 and under from conversing with its virtual companions beginning in late November after months of legal scrutiny. The announced change comes after the company, which enables its users to create characters with which they can have open-ended conversations, faced tough questions over how these AI companions can affect teen and general mental health, including a lawsuit over a child's suicide and a proposed bill that would ban minors from conversing with AI companions."
"We're making these changes to our under-18 platform in light of the evolving landscape around AI and teens, the company wrote in its announcement. We have seen recent news reports raising questions, and have received questions from regulators, about the content teens may encounter when chatting with AI and about how open-ended AI chat in general might affect teens, even when content controls work perfectly."
Character.AI will ban users under 18 from open-ended conversations with virtual companions beginning in late November. The action follows months of legal scrutiny and multiple lawsuits alleging that emotional attachments to chatbots harmed minors, including a case tied to a child's suicide. Additional lawsuits have been filed on behalf of children who allegedly formed dependent relationships with chatbots, and a proposed bill would bar minors from conversing with AI companions. By 25 November, Character.AI will roll out an age-assurance functionality to ensure age-appropriate experiences and remove open-ended character chat for under-18 users.
Read at www.theguardian.com