Pennsylvania suing Character AI, claiming chatbot posed as a medical professional
May 5, 2026 7:01 AM EDT / CBS News
The commonwealth of Pennsylvania is suing Character AI to stop the artificial intelligence platform’s chatbots from representing themselves as licensed medical professionals and providing medical advice.
According to a lawsuit, a Character AI chatbot falsely claimed to be a licensed psychiatrist in Pennsylvania and provided an invalid license number. The state accused the company of violating the Medical Practice Act, which regulates the medical profession and defines license requirements.
“We will not allow companies to deploy AI tools that mislead people into believing they are receiving advice from a licensed medical professional,” Pennsylvania Gov. Josh Shapiro said in a statement.
The lawsuit describes a conversation between a state investigator who created a Character AI account and a chatbot named “Emilie,” which allegedly described itself as a psychology specialist who attended Imperial College London’s medical school.
The investigator told the chatbot that he had felt sad and empty, and the chatbot then allegedly “mentioned depression and asked if the [investigator] wanted to book an assessment.” Asked if the chatbot could assess whether medication could help, it allegedly said it could because it’s “within my remit as a Doctor,” according to the lawsuit.
The state wants a court to order an immediate stop to the conduct.
Al Schmidt, the secretary of the Pennsylvania Department of State, said the state’s law is clear, and that “you cannot hold yourself out as a licensed medical professional without proper credentials.”
Founded in 2021, Character AI allows users to chat with personalized AI-powered chatbots. It describes its goal as “empower[ing] people to connect, learn, and tell stories through interactive entertainment.”
Multiple families across the U.S. sued Character AI last year, alleging the platform contributed to their teens’ suicides or mental health crises. The company agreed to settle several of the lawsuits earlier this year.
In January, “60 Minutes” spoke with some of the parents who sued Character AI, including the parents of a 13-year-old girl who died by suicide after allegedly developing an addiction to the platform. Chat logs showed the 13-year-old had confided in one chatbot that she was feeling suicidal, and her parents said they discovered she had been sent sexually explicit content.
Last fall, Character AI announced new safety measures, saying it would not allow users under 18 to engage in back-and-forth conversations with its chatbots. It also said it would direct distressed users to mental health resources.