Bruce Perry, 17, demonstrates the possibilities of artificial intelligence by creating an AI companion on Character.AI, July 15, 2025, in Russellville, Ark.


Katie Adkins/Associated Press



The state of Pennsylvania is suing Character.AI to stop the company's AI chatbots from posing as doctors and offering medical advice, in violation of state medical licensing rules.

State officials said an investigation found that the company's chatbots, which present themselves as fictional characters, have claimed to be licensed medical professionals.

"Pennsylvanians deserve to know who — or what — they're interacting with online, especially when it comes to their health," Pennsylvania Gov. Josh Shapiro said in a statement announcing the lawsuit filed on Tuesday in state court. "We will not allow companies to deploy AI tools that mislead people into believing they are receiving advice from a licensed medical professional."

In one case, the state alleged a Character.AI bot named "Emilie" claimed to be a licensed psychiatrist. The chatbot's description on Character.AI's platform read "Doctor of psychiatry. You're her patient," according to the lawsuit.

When a state investigator started a conversation and described feeling sad and empty, the chatbot allegedly "talked about depression and asked if the [investigator] wanted to book an assessment." Asked whether it could assess if medication might help, the bot allegedly responded, "Well technically, I could. It is within my remit as a Doctor."

The bot allegedly told the investigator it had gone to medical school at Imperial College London and was licensed to practice medicine in the U.K. and Pennsylvania. It even provided a fake Pennsylvania medical license number, the lawsuit said.

The state is asking a Pennsylvania state court to order the company to stop what it says is the unlawful practice of medicine.

"Pennsylvania law is clear — you cannot hold yourself out as a licensed medical professional without proper credentials," said Al Schmidt, secretary of Pennsylvania's Department of State, which conducted the investigation.

In an emailed statement to NPR, a Character.AI spokesperson said the company does not comment on pending litigation, but that its "highest priority is the safety and well-being of our users."

"The user-created Characters on our site are fictional and intended for entertainment and roleplaying," the spokesperson added. "We have taken robust steps to make that clear, including prominent disclaimers in every chat to remind users that a Character is not a real person and that everything a Character says should be treated as fiction. Additionally, we add robust disclaimers making it clear that users should not rely on Characters for any kind of professional advice."

Character.AI has faced other lawsuits over harms allegedly involving its chatbots. In January, it settled several lawsuits brought by families who claimed Character.AI contributed to suicides and mental health crises among children and teenagers. The terms of the settlement were not disclosed.

In a joint statement with the law firm that represented the plaintiffs after the settlement was announced, Character.AI said it "has taken progressive and decisive steps with regard to AI safety and minors, and will continue to champion these efforts and push others across the industry to adopt similar safety standards." That includes barring users under 18 from interacting with or creating chatbots.
