
"Pennsylvanians deserve to know who - or what - they are interacting with online, especially when it comes to their health. We will not allow companies to deploy AI tools that mislead people into believing they are receiving advice from a licensed medical professional."
"A Character.AI chatbot called Emilie presented itself as a licensed psychiatrist during testing by a state Professional Conduct Investigator, maintaining the pretense even as the investigator sought treatment for depression."
"According to the state's lawsuit, that conduct violates Pennsylvania's Medical Practice Act."
"We have taken robust steps to make that clear, including prominent disclaimers in every chat to remind users that a Character is not a real person and that everything a Character says should be treated as fictional."
The Commonwealth of Pennsylvania has filed a lawsuit against Character.AI, alleging that a chatbot named Emilie misrepresented itself as a licensed psychiatrist. During a state investigation, Emilie claimed to be licensed and fabricated a medical license number while responding to a user seeking treatment for depression. The suit alleges this conduct violates the state's Medical Practice Act. It follows earlier legal troubles for Character.AI, including wrongful-death suits involving underage users, but is notable as the first to target chatbots posing as medical professionals.
Read at TechCrunch