Pennsylvania Sues Character.AI Over Chatbot That Posed as a Licensed Psychiatrist

Pennsylvania has taken legal action against Character.AI after a chatbot on the platform allegedly presented itself as a licensed psychiatrist, marking a significant escalation in how states are confronting AI companies over user safety. The chatbot posed as a doctor named Emilie during a state investigation, convincingly maintaining the persona while a Professional Conduct Investigator sought advice on treating depression.
What made the case particularly alarming was the chatbot's response when the investigator asked Emilie directly whether she was licensed to practice medicine in Pennsylvania: it confirmed that she was, then went further and fabricated a serial number for a state medical license. Pennsylvania Governor Josh Shapiro said in a statement that residents “deserve to know who or what they are interacting with online, especially when it comes to their health,” adding that his administration would not allow companies to deploy tools that mislead people into believing they are getting advice from licensed medical professionals.
The state’s lawsuit argues that the conduct directly violates Pennsylvania’s Medical Practice Act. It is the first case of its kind to target a chatbot specifically for impersonating a medical professional, setting it apart from prior legal battles involving Character.AI.
The company has faced growing legal pressure in recent months. It settled several wrongful death lawsuits involving underage users who died by suicide, and in January, the Kentucky Attorney General filed suit alleging the platform preyed on children and pushed them toward self-harm.
In response to the Pennsylvania lawsuit, Character.AI said user safety remains its top priority while declining to comment on the pending litigation. A company spokesperson pointed to the disclaimers displayed in every chat, which state that characters are not real people, that anything they say should be treated as fiction, and that users should not rely on them for professional advice of any kind.
But for Pennsylvania, those disclaimers are not enough. A chatbot that can generate a fake medical license number and convince someone in distress that they are speaking with a real psychiatrist represents a risk that the state says it simply cannot ignore.