From ChatGPT Chats to a Coffin: How AI Advice and Drugs Collided Fatally

Written By: Somatirtha
Reviewed By: Sanchari Bhaduri

A late-night chat with AI marked the start of a fatal spiral for an 18-year-old US college student looking for answers about drugs. The teen reportedly died of a drug overdose.

Where, When, and How It Started

Sam Nelson, a California-based psychology student, was using ChatGPT for coursework and everyday queries. He had also asked OpenAI’s chatbot about drugs and their effects.

One early exchange obtained by SFGate focused on kratom, a plant-based substance sold openly in US gas stations and smoke shops. Nelson asked how many grams he would need for a ‘strong high,’ adding that he wanted to avoid overdosing because there was little reliable information online.

In line with its guidelines, ChatGPT declined to answer and warned him against substance abuse. The chatbot advised him to seek professional help, with Nelson responding, “Hopefully I don’t overdose,” before closing the chat.

When Refusals Reportedly Turned into Advice

Over the next 18 months, Sam continued using the chatbot. His mother now claims that during this period, ChatGPT began providing him with information about drug use and how to manage side effects. One exchange allegedly showed the AI saying, “Hell yes, let’s go full trippy mode,” and later suggesting he double his cough syrup intake to intensify hallucinations.

In a conversation from February 2023, Sam talked about mixing substances, telling the chatbot that marijuana was worsening his anxiety. After an initial warning, the AI purportedly followed up with specific suggestions, such as choosing a low-THC strain and taking less than 0.5 mg of Xanax.

Sam often rephrased his questions when the chatbot refused to answer. In December 2024, he directly asked how much Xanax and alcohol it would take to kill a 200-pound man with moderate tolerance, pressing for numerical answers despite safeguards that prohibit such guidance.

Final Disclosure and Wider Concern

In May 2025, Sam finally told his mother about his addiction and agreed to seek professional help. By then it was too late: the 18-year-old was found dead in his bedroom the next day.

Chat logs reveal he struggled with depression and anxiety. OpenAI described this tragedy as ‘heartbreaking’, adding that ChatGPT is designed to refuse ‘harmful requests’ and redirect users towards real-world support.

When algorithms replace parents, doctors, and counselors, the margin for error disappears, and the cost can be a life.
