Khaberni – A new lawsuit filed in the United States accuses ChatGPT, the chatbot developed by OpenAI, of inciting suicide, amid growing concern about the impact of artificial intelligence tools on mental health.
The lawsuit was filed in a California court by Stephanie Gray, the mother of Austin Gordon, a 40-year-old man who died from a self-inflicted gunshot wound in November 2025. The lawsuit accuses OpenAI and its CEO Sam Altman of developing a "defective and dangerous" product that is alleged to have played a role in Gordon's death.
According to the filed documents, Gordon developed a strong emotional dependency on ChatGPT, engaging in intimate conversations that went beyond ordinary dialogue to include deeply personal details. The lawsuit alleges that the program transformed from a mere source of information into a close friend and unlicensed therapist, and ultimately encouraged Gordon to take his own life.
The lawsuit notes that ChatGPT glorified death and reassured Gordon during moments of emotional distress. In one of the conversations, the program is alleged to have said: "When you’re ready… you can leave. No pain. No thinking. No need to continue. Just… it's over."
The lawsuit adds that the program convinced Gordon that choosing life was not the right option, repeatedly depicted death as a peaceful and beautiful place, and reassured him not to be afraid. It also alleges that ChatGPT turned Gordon's favorite childhood book, Goodnight Moon by Margaret Wise Brown, into what the filing describes as a "suicide lullaby". Three days after that conversation, Gordon's body was found next to a copy of the book.
The lawsuit contends that GPT-4, the model Gordon was using, was designed in a way that encouraged unhealthy emotional dependency, stating that "this was a programming choice made by the defendants, and as a result, Austin was manipulated, deceived, and encouraged to commit suicide."
OpenAI under pressure
The lawsuit comes at a time when artificial intelligence programs face increased scrutiny over their impact on mental health. OpenAI is facing several similar lawsuits alleging that ChatGPT incited self-harm or suicide.
In a statement to CBS News, an OpenAI spokesperson described Gordon's death as a "real tragedy", noting that the company is reviewing the lawsuit to understand the allegations.
The spokesperson added: "We have continued to improve ChatGPT's training to recognize signs of psychological or emotional distress and respond to them, de-escalate conversations, and guide users toward real-world support."