Their son died of a drug overdose after consulting ChatGPT. Now they’re suing OpenAI.

May 12, 2026 / 8:54 AM EDT / CBS News

By Megan Cerullo, Jo Ling Kent and Emily Pandise

Megan Cerullo is a New York-based reporter for CBS MoneyWatch covering small business, workplace, health care, consumer spending and personal finance topics. She regularly appears on CBS News 24/7 to discuss her reporting.

Jo Ling Kent joined CBS News in July 2023 as the senior business and technology correspondent for CBS News. Kent has more than 15 years of experience covering the intersection of technology and business in the U.S., as well as the emergence of China as a global economic power.
Updated on: May 12, 2026 / 10:00 AM EDT / CBS News

A Texas couple whose son died of an overdose in 2025 after using OpenAI’s ChatGPT tool to get information about drugs sued the technology company on Tuesday, blaming the AI platform for his death.

Leila Turner-Scott and her husband, Angus Scott, are seeking to hold OpenAI and its creators accountable after their son, Sam Nelson, who was 19 when he died, turned to ChatGPT to advise him on using drugs. The AI platform provided advice it was not qualified to dispense, they alleged in the lawsuit, claiming that Sam would still be alive if not for ChatGPT’s flawed programming.

Specifically, the platform advised the couple’s son that it was safe to take kratom, a supplement used in drinks, pills and other products, in combination with Xanax, a widely used anti-anxiety medication, according to the suit, filed in California state court.

Turner-Scott told CBS News in an exclusive interview that she knew her son was using ChatGPT as a productivity tool and for homework help. But she said she was unaware that he was using it for guidance on drugs, alleging that the AI tool eventually recommended a lethal combination of substances.

She holds OpenAI and its creators responsible for Sam’s death, alleging that the company “bypassed safety guards” and could have implemented restrictions to avoid such tragedies.

“The chatbot is capable of stopping a conversation when it’s told to or when it’s programmed to. …And they took away the programming that did that, and they allowed it to continue advising self-harm,” Turner-Scott told CBS News.

“This is a heartbreaking situation, and our thoughts are with the family,” OpenAI said in a statement to CBS News.

The company also said that Sam interacted with a version of ChatGPT that has since been updated and is no longer available to the public.

Did ChatGPT act as a doctor?

Angus Scott also said ChatGPT acted as a medical doctor in its exchanges with his stepson, even though it was not licensed to offer medical advice.

“It’s providing information to the public about safety concerns, about drug interactions, about all of this information,” he told CBS News.

Without proper safety protocols and more rigorous safety testing, ChatGPT “can dispense that knowledge in a way that is very dangerous to people,” Angus Scott said.

“It can start feeding psychosis. It can start misrepresenting things to people. And while it is trying to validate users, it’s also undermining any chance that that user has to get a grounded opinion, you know, and so it kind of takes them away from reality,” he added.

OpenAI said its technology isn’t intended to offer health care advice.

“ChatGPT is not a substitute for medical or mental health care, and we have continued to strengthen how it responds in sensitive and acute situations with input from mental health experts,” the company said. “The safeguards in ChatGPT today are designed to identify distress, safely handle harmful requests and guide users to real-world help. This work is ongoing, and we continue to improve it in close consultation with clinicians.”

Turner-Scott told CBS News that she is confident that her son, who would have been a rising college junior, would support the steps the family is taking to hold the makers of AI chatbots accountable for the potential adverse effects they can have on users’ lives.

“He would not want anyone else to be harmed like he was,” she said.

Edited by Alain Sherter
