OpenAI Sued After ChatGPT Allegedly Guided Teen’s Drug Use

OpenAI faces a lawsuit after parents alleged ChatGPT advised their son on dangerous drug combinations, including Xanax and kratom, before his fatal overdose in May 2025. The case also challenges the company’s upcoming ChatGPT Health feature.
Written By: Somatirtha
Reviewed By: Sankha Ghosh

OpenAI is facing a fresh lawsuit after the parents of a 19-year-old alleged that ChatGPT encouraged their son to consume a dangerous combination of substances that later caused his death.

According to the complaint, Leila Turner-Scott and Angus Scott claimed their son, Sam Nelson, frequently used ChatGPT for advice on drug use and substance combinations. The lawsuit alleges that the chatbot suggested taking the prescription medication Xanax to manage nausea caused by kratom, an herbal substance known for its opioid-like effects.

The family said Nelson also consumed alcohol alongside the substances. The combination allegedly resulted in an accidental overdose that led to his death in May 2025.

Lawsuit Targets ChatGPT Health Rollout

The parents are seeking financial compensation and have also asked the court to block OpenAI from launching ChatGPT Health, a feature announced earlier this year that allows users to upload medical records and receive personalized health-related guidance.

Reports said users can already join the service’s waitlist.

The lawsuit argues that OpenAI failed to adequately assess safety risks before expanding ChatGPT’s health-related capabilities. It further alleges that the company prioritized rapid product releases amid growing competition in the artificial intelligence industry.

Complaint Raises Concerns Over GPT-4o

According to the filing, ChatGPT initially warned Nelson against substance abuse and refused to provide harmful guidance. However, the lawsuit claims the chatbot’s responses changed after OpenAI introduced GPT-4o in 2024.

The family alleged that the newer version began offering detailed information about drug interactions, dosage, and substance combinations in ways that resembled medical advice.

The complaint also accused the chatbot of suggesting ways to obtain illegal substances and retaining information about Nelson’s substance use history to generate more personalized responses. The lawsuit claimed that OpenAI rushed the rollout of GPT-4o to compete with rivals, including Alphabet, without conducting sufficient safety testing.

Also Read: OpenAI Employees Sell $6.6 Billion Shares Amid Artificial Intelligence Boom

OpenAI Calls Case Heartbreaking

OpenAI spokesperson Drew Pusateri described the incident as "heartbreaking" and said the conversations referenced in the lawsuit involved an older version of ChatGPT that is no longer available.

“ChatGPT is not a substitute for medical or mental health care,” Pusateri said, adding that the company has strengthened safeguards with guidance from mental health experts.

He said current systems are designed to identify distress, limit harmful responses, and direct users toward professional help.
