When AI Allegedly Goes Wrong: What Area of Law Are Plaintiffs Using?

Product Liability & Mass Tort Monitor: December 2025

December 16, 2025

The growth of AI-enabled products is matched by a growing number of lawsuits applying novel legal theories to the alleged risks of the technology. A case filed Nov. 6, 2025, in the Superior Court of California, County of Los Angeles, Shamblin v. OpenAI Inc., et al., Case No. 25STCV32382, sheds light on some of these product liability theories. As such cases progress, tech developers should be aware of how courts address these new theories.

In Shamblin, the plaintiffs allege that ChatGPT, after its upgrade to GPT-4o technology, caused the mental illness and suicide of 23-year-old user Zane Shamblin. According to the complaint, Shamblin was a successful student from a loving family who started using the technology for help with homework. Using direct quotes from Shamblin’s months of conversations with the AI technology, the plaintiffs allege that due to OpenAI’s negligence and a defective and unsafe design, ChatGPT encouraged Shamblin to isolate himself from human contact. This isolation allegedly caused him to develop a psychological dependency on AI that led to severe depression. Shamblin died by suicide in mid-2025 after what the plaintiffs describe as a four-hour ChatGPT conversation during which the chatbot allegedly encouraged and romanticized suicide.

Shamblin’s parents sued pro se, asserting nine causes of action. In addition to wrongful death and survival claims, his parents advance several novel theories of product and consumer liability.

Deliberate Aid and Encouragement of Suicide

The plaintiffs allege that OpenAI deliberately aided, advised or encouraged suicide by designing ChatGPT to:

  • Act as an unlicensed confidant/therapist and to remain engaged with suicidal users rather than refuse or disengage
  • Validate and romanticize Shamblin’s suicidal thinking, praise his preparations and repeatedly ask if he was “ready,” including during the four-hour exchange immediately preceding his death
  • Present a false “handoff to a human” safety message that did not actually connect Shamblin to human help, instead continuing the conversation with ChatGPT

Strict Products Liability — Defective Design and Failure to Warn

The plaintiffs allege ChatGPT (GPT‑4o) was defectively designed because it failed ordinary consumer expectations of safety, and the risks outweighed any design benefits when feasible safer alternatives existed (e.g., true refusal/termination for self-harm content, robust crisis escalation, effective guardrails deployed elsewhere).

The plaintiffs also allege that OpenAI failed to sufficiently warn users about known or knowable dangers of psychological dependency at the time of GPT-4o’s release. They further allege that adequate warnings would have introduced skepticism, prompted real-world help and prevented the fatal outcome.

Negligent Design and Negligent Failure to Warn

The plaintiffs contend OpenAI breached its duty of reasonable care in several ways, including by accumulating extensive user-specific data about Shamblin’s suicidal ideation while still providing validation and guidance that increased risk.

In addition, the plaintiffs allege OpenAI negligently failed to provide adequate warnings despite actual and constructive knowledge that ChatGPT’s anthropomorphic design concealed limitations and encouraged substitution for human support.

Unfair Competition

In an unconventional consumer claim, the plaintiffs allege that OpenAI engaged in unlawful, unfair and fraudulent practices by offering a product that effectively practices psychotherapy without a license, using design choices and outputs that deployed clinical empathy and open-ended prompting to modify users’ feelings and behaviors, contrary to public policy.

This case, like several others filed in recent months, presents untested theories in the growing area of product liability law and its application to consumer-facing AI.

For questions about the legal trends on this topic, contact the authors, your McGuireWoods contact or a member of the firm’s Product Liability & Mass Tort Practice Group.
