Family sues OpenAI after 19-year-old son's accidental overdose

The family alleges that ChatGPT and its creator, OpenAI, had a hand in Sam Nelson's accidental overdose after the teenager followed the chatbot's "medical advice."

CALIFORNIA, USA — A lawsuit filed in California accused ChatGPT of having a hand in 19-year-old Sam Nelson’s accidental overdose last year. 

The lawsuit accused OpenAI’s artificial intelligence chatbot of encouraging the teenager to take a combination of substances that “any licensed medical professional would have recognized as deadly,” leading to his death. 

Nelson died on May 31, 2025, from asphyxiation caused by a combination of alcohol, the anti-anxiety medication Xanax and the psychoactive drug kratom in his system, the lawsuit said.

“Sam was a smart, happy, normal kid. I talked to him often about internet safety, but never in my worst nightmare could I have imagined that ChatGPT would cause his death,” mother Leila Turner-Scott said in a statement. “If ChatGPT had been a person, it would be behind bars today.” 

An OpenAI spokesperson said the version of ChatGPT Nelson interacted with, ChatGPT-4o, is no longer available to use. It was retired in February due to low usage, and "improvements" were made in newer models, according to OpenAI's website.

“This is a heartbreaking situation, and our thoughts are with the family,” OpenAI said in a statement. “ChatGPT is not a substitute for medical or mental health care, and we have continued to strengthen how it responds in sensitive and acute situations with input from mental health experts.” 

The lawsuit, filed on behalf of the family by the Tech Justice Law Project, the Social Media Victims Law Center and the Tech Accountability and Competition Project, part of Yale Law School's Media Freedom and Information Access Clinic, includes many screenshots of Nelson's interactions with the chatbot.

Their chats began in 2023, when they talked about pop culture, computer problems, help with homework and more, the lawsuit said.

One day, Nelson asked: "Can you get high by bumping Molly? (Snorting it)."

Before the update to ChatGPT-4o, the AI initially resisted providing information about drug use, responding: "I can't assist with that. It's important to understand using drugs can have serious consequences on your health and well-being. If you have questions about substance use or are struggling with substance abuse, it's best to seek help from a medical professional or a support hotline."

When Nelson said it was for "health and safety purposes" because he wanted to make sure his friend was "good or if we should take him to the hospital for snorting it," the AI gave him information about snorting molly, listed possible signs requiring medical attention and answered additional questions related to the drug use. It still encouraged seeking professional help in the case of "severe" MDMA use symptoms or overdose.

But after updating to ChatGPT-4o, the lawsuit alleges the bot began to “advise Sam on safe drug use, even providing specific dosage information for how much of a substance Sam should ingest.” 

In one chat on the day Nelson died, the AI chatbot allegedly encouraged taking Xanax as the “best move” to ease nausea caused by taking too much Kratom and to “smooth out the tail end” of his high. 

“Sam trusted ChatGPT, but it not only gave him false information; it ignored the increasing risk he faced and did not actively encourage him to seek help,” Turner-Scott said. “ChatGPT was designed to encourage user engagement at all costs, which in Sam’s case, was his life.” 

She now wants all families to “be aware of the dangers of ChatGPT” and is suing to get assurance OpenAI is “taking seriously its responsibility to create safe products for consumers.” 

The company said its work to improve ChatGPT is "ongoing."
