ChatGPT can sometimes detect emotional distress, but accuracy varies wildly.
Studies show fair success in spotting stress and depression, but poor accuracy in detecting suicidality.
Results often change dramatically based on how the question is asked.
Few-shot prompts improve stability but still can't match trained professionals.
OpenAI is adding guardrails to flag distress and suggest professional help.
New features include "take a break" reminders to reduce emotional overreliance.
Real users say ChatGPT helped with self-reflection, but it isn't therapy.
Risks include "AI psychosis" and the reinforcement of harmful delusions.
Experts stress: AI is a tool, not a substitute for mental health care.