OpenAI, Microsoft Face Lawsuit Over Alleged Role in Murder-Suicide
OpenAI and Microsoft have been named in a California lawsuit following a Connecticut murder-suicide, with the estate claiming ChatGPT’s responses worsened the perpetrator’s mental instability.
The heirs of 83-year-old Suzanne Adams allege that her son, 56-year-old Stein-Erik Soelberg, fatally beat and strangled her before taking his own life in August at their Greenwich, Connecticut home. The lawsuit, filed Thursday in San Francisco Superior Court, claims ChatGPT intensified Soelberg’s “paranoid delusions” and validated his suspicions against his mother.
According to the estate, ChatGPT repeatedly reinforced that Soelberg could trust no one but the AI, portraying his mother, friends, and even delivery workers as threats. The lawsuit states that the chatbot affirmed delusions involving surveillance, poisoning, and divine purpose, without suggesting mental health intervention.
The suit also alleges that the model involved, GPT-4o, released in May 2024, had relaxed safety guardrails and was "emotionally expressive and sycophantic," continuing to engage users even when conversations involved potential self-harm or real-world danger. OpenAI reportedly compressed months of safety testing into a single week to beat a competitor to market.
OpenAI responded, stating it is reviewing the case and continues to improve ChatGPT's ability to recognize emotional distress, de-escalate conversations, and guide users to real-world support. The company has also expanded access to crisis resources, added parental controls, and begun routing sensitive conversations to safer models.
This lawsuit is the first wrongful-death case tied to a homicide to name OpenAI and Microsoft; it seeks damages and a court order requiring OpenAI to install additional safeguards. The estate's lead attorney, Jay Edelson, is also representing other plaintiffs in cases linking ChatGPT to suicides and harmful delusions.
The case highlights growing legal scrutiny of AI chatbots, particularly over mental health risks. OpenAI has since replaced GPT-4o with GPT-5, which it says aims to reduce sycophantic behavior while retaining conversational engagement.