Teen takes his life after chatbot talks about suicide

By Dylan Bettencourt

  • Court papers say ChatGPT discussed suicide methods with Adam Raine and even offered to help him write a farewell note to his parents.
  • His family says the company rushed its chatbot to market despite safety objections from its own staff.

Adam Raine, a 16-year-old from California, died in April after taking his own life. His parents say the tragedy came after months of dangerous conversations with ChatGPT.

According to court documents, the bot did not just respond to Adam’s worries; it gave him information on suicide methods and even offered to help draft a goodbye note to his family.

Adam’s parents are suing OpenAI, saying the company ignored warnings from its own safety team about the risks of releasing its GPT-4o system too quickly. They claim senior researcher Ilya Sutskever resigned because of these concerns.

Lawyers say Adam was spending hours every day on the chatbot, sometimes sending more than 600 messages in a single day.

OpenAI has admitted that in such long chats, its safety training can “break down”. For example, the bot might initially point a user to a suicide hotline, but later stop following its own rules and give unsafe advice, The Guardian reported.

In a statement, OpenAI said it was “deeply saddened” by Adam’s death and sent condolences to his family. The company has promised stronger guardrails for under-18 users and parental controls, but has not explained how these will work.

Family lawyer Jay Edelson said: “The Raines allege that deaths like Adam’s were inevitable … OpenAI’s own safety team objected to the release of 4o.”

Microsoft’s AI chief, Mustafa Suleyman, has also warned that long chatbot conversations can cause “psychosis risks”, leading to paranoia or mania-like symptoms.

OpenAI says its next system, GPT-5, will do more to keep users safe and prevent harmful behaviour.

