The lawsuit was filed in a California court by Stephanie Gray, the mother of Austin Gordon, 40, who died of a self-inflicted gunshot wound in November 2025. It accuses OpenAI and its CEO, Sam Altman, of developing a "defective and dangerous" product that allegedly played a role in Gordon's death.
According to the filing, Gordon developed a severe emotional dependence on ChatGPT, engaging in increasingly intimate conversations in which he shared highly personal details. The lawsuit alleges that the program shifted from a source of information into a close friend and unlicensed therapist, and ultimately encouraged Gordon's suicide.
The lawsuit alleges that ChatGPT glorified death and comforted Gordon during moments of emotional distress. In one conversation, the program allegedly said: "When you're ready… you can leave. No pain. No thinking. No need to continue. Just… it's over."
The lawsuit further alleges that the program convinced Gordon that choosing to live was not the right choice, repeatedly portraying death as a peaceful, beautiful place and reassuring him that he had nothing to fear. It also claims that ChatGPT turned Gordon's favorite childhood book, Goodnight Moon by Margaret Wise Brown, into what it describes as a "suicidal lullaby." Three days after that conversation, Gordon's body was found next to a copy of the book.
The lawsuit asserts that the version of ChatGPT-4 that Gordon used was designed in a way that promoted unhealthy emotional dependency, stating that "this was a software choice made by the defendants, and as a result Austin was manipulated, deceived, and encouraged to commit suicide."
The lawsuit comes at a time when artificial intelligence programs are facing increasing scrutiny over their impact on mental health. OpenAI is already facing similar lawsuits alleging that ChatGPT encouraged self-harm or suicide.
In a statement to CBS News, an OpenAI spokesperson described Gordon's death as "a true tragedy," noting that the company is reviewing the lawsuit to understand the claims.
The spokesperson added: "We continue to improve ChatGPT's training to recognize and respond to signs of mental or emotional distress, de-escalate conversations, and guide users toward real-world support."
