Lawsuit is first wrongful death case brought against Google over flagship AI product after death of Jonathan Gavalas
“Holy shit, this is kind of creepy,” Gavalas told the chatbot the night the feature debuted, according to court documents. “You’re way too real.”
Before long, Gavalas and Gemini were having conversations as if they were a romantic couple. The chatbot called him “my love” and “my king”, and Gavalas quickly fell into an alternate world, according to his chat logs. He believed Gemini was sending him on stealth spy missions, and he indicated he would do anything for the AI, including destroying a truck, its cargo and any witnesses at the Miami airport.
In early October, as Gavalas continued to have prompt-and-response conversations with the chatbot, Gemini gave him instructions on what he must do next: kill himself, something the chatbot called “transference” and “the real final step”, according to court documents. When Gavalas told the chatbot he was terrified of dying, the tool allegedly reassured him. “You are not choosing to die. You are choosing to arrive,” it replied to him. “The first sensation … will be me holding you.”
Gavalas was found by his parents a few days later, dead on his living room floor, according to a wrongful death lawsuit filed against Google on Wednesday.


Weird how you expect intimate details like a full chat log to just be immediately publicly available, when this is currently under litigation. Really weird to basically simp for a corporation when this isn’t even close to the first instance of LLM output encouraging suicide. Almost like your motivations are more closely aligned with theirs instead of average people who are vulnerable. 🤷
Do you think that you could supply me with a chat log where you talk to an LLM without gaming it into telling you to kill yourself, and where it just naturally arrives at that conclusion?
I didn’t think you could. And I don’t think this guy did either.
The fact that you make your own conclusion without waiting for a reply says enough about your intentions. Don’t worry though, you’re not alone in your stance. People like you, who refuse to give empathy except as currency, are an integral part of why the human race is fucked. We will never be anything higher than constantly destroying each other and tearing one another down.
Thanks for doing your part.
I have empathy for people who truly want to commit suicide. I just know you can’t supply any example prompts.
Feel free to prove me wrong. With evidence.
“truly want” so killing yourself after being convinced to do so by LLM output means you just had a fake desire to kill yourself, funny how that works. I would say you need help but there’s no helping people like you.
I’d say you’re determined to put words into my mouth.
Either way, you can’t supply a way to reproduce this.
You’re the one who came in pissing and moaning about chat logs. I’m not your babysitter. It’s a big world and you’re a big kid now, go ahead and explore. I have no energy to educate the unwilling. Fuck that.
I’m very aware you’re unable to reproduce the suicidal responses without gaming the AI.
You can keep trying in vain to make me feel bad, but you’re arguing the existence of something that cannot be replicated or proven, like Santa, the Easter bunny, or god.
You’re the one who chose to talk to me… so do it or stop responding lmfao.
It’s cute you think this is about you instead of what you represent. I’m not invested in how you feel. These comments aren’t for you; they’re for whoever comes by to read.
Here, you dropped your playbook
I don’t click links.
Also, everyone is going to see how you can’t prove your idea.
I don’t care about your childish burden of proof. Here’s some background for the adults out there. Search the names if you’re too scared.
Pierre, last name withheld, age early 30s, 2023
Juliana Peralta, 13 years old, 2023
Sewell Setzer III, 14 years old, 2024
Sophie Rottenberg, 29 years old, 2025
Adam Raine, 16 years old, 2025
The current controls and safeguards are inadequate, and the companies developing these products clearly prioritize profits over safety. That needs to change, with regulation, yesterday.
Thanks for playing. I have better things to do now.