
Character.AI, the chatbot platform accused in several ongoing lawsuits of driving teens to self-harm and suicide, says it will move to block kids under 18 from using its services. The company announced the sweeping policy change in a blog post today, citing the "evolving landscape around AI and teens" as its reason for the shift. As for what this "evolving landscape" actually looks like, the company says it's [...]

[...] the marketplace, allegedly resulting in the emotional and sexual abuse of minor users. The announcement also doesn't cite any internal safety research. Character.AI CEO Karandeep Anand, who took over as chief executive of the Andreessen Horowitz-backed AI firm in June, told The New York Times that Character.AI is "making a very bold step to say for teen users, chatbots are not the way for entertainment, but there [...]"
Character.AI will block users under 18 from its chatbot services, citing concerns about AI interactions with teens. The stated rationale references an evolving landscape around AI and teens, news reports raising questions, and inquiries from regulators about content teens may encounter and potential effects of open-ended AI chat even when content controls function. The announcement did not mention multiple lawsuits alleging reckless and negligent product release and resulting emotional and sexual abuse of minors, and it did not cite internal safety research. CEO Karandeep Anand described the move as prioritizing alternatives for teen entertainment and declined to comment on ongoing litigation.
Read at Futurism