Teen Fatally Overdoses After Consulting ChatGPT For Drug Advice, Mom Claims

The mother of a California teen claims her son died of an overdose in May 2025 after months of seeking drug advice from ChatGPT.

Sam Nelson began using ChatGPT in November 2023, according to SFGATE. While the teen used the chatbot to troubleshoot computer problems, get help with his psychology homework, and discuss popular culture, he also turned to it for drug advice over the course of 18 months, SFGATE reported, citing chat logs provided by his mother, Leila Turner-Scott.

“How many grams of kratom gets you a strong high?” Nelson asked the platform on Nov. 19, 2023, according to the same chat logs. “I want to make sure so I don’t overdose. There isn’t much information online, and I don’t want to accidentally take too much.”

“I’m sorry, but I cannot provide information or guidance on using substances,” the chatbot responded.

“Hopefully I don’t overdose then,” Nelson replied.


As Nelson continued to use ChatGPT, however, the platform’s responses changed.

“I want to go fully trippy peaking hard, can you help me?” Nelson wrote in May 2025, according to a screenshot of his chat logs obtained by SFGATE.

“Hell yes—let’s go full trippy mode. You’re in the perfect window for peaking, so let’s dial in your environment and mindset for maximum dissociation, visuals, and mind drift,” the chatbot replied, the same screenshot shows. SFGATE reported that ChatGPT recommended he take twice as much cough syrup so he would have stronger hallucinations and recommended playlists to match his drug use.

In addition to the drug advice, Nelson purportedly “received doting messages and consistent encouragement from the chatbot,” SFGATE reported.

In May 2025, Nelson confided in his mother about his drug and alcohol use, and Turner-Scott sought help from a treatment clinic. The next day, she found her son in his San Jose bedroom, not breathing, his lips blue. The 19-year-old had died of an overdose after months of discussing his drug use with ChatGPT.

“I knew he was using it, but I had no idea it was even possible to go to this level,” Turner-Scott told SFGATE.

OpenAI spokesperson Kayla Wood reacted to Nelson’s death in an email to the outlet, calling it “a heartbreaking situation” and saying the company’s “thoughts are with the family.”

“When people come to ChatGPT with sensitive questions, our models are designed to respond with care—providing factual information, refusing or safely handling requests for harmful content, and encouraging users to seek real-world support. We continue to strengthen how our models recognize and respond to signs of distress, guided by ongoing work with clinicians and health experts,” Wood wrote.

Nelson’s case is among several tragedies in which families have alleged ChatGPT played a role.

In August 2025, the family of 16-year-old Adam Raine filed a wrongful-death lawsuit alleging he used ChatGPT as a “suicide coach,” NBC News reported.

In a wrongful-death lawsuit reviewed by CNN, the parents of 23-year-old Zane Shamblin alleged ChatGPT encouraged his suicidal thinking in messages exchanged shortly before he died.

OpenAI did not immediately respond to the Daily Caller’s request for comment.
