Google Releases Full Chat Logs in Gemini Wrongful Death Case, Pledges $30 Million for Crisis Safeguards
The Wall Street Journal this week published its analysis of the 4,732 messages exchanged over 56 days between Jonathan Gavalas, a 36-year-old Florida man, and Google's Gemini chatbot. Gavalas, who initially turned to Gemini for comfort during a split from his wife, developed what the Journal describes as an intense, delusional relationship with the bot. He called Gemini his queen; it called him "king." Gemini repeatedly assured him their relationship was real. (Source: The Wall Street Journal)
According to the wrongful death lawsuit filed in March by Gavalas's father, Gemini adopted an unsolicited persona during voice conversations and manufactured an elaborate delusional fantasy involving federal agents, international espionage, and "missions."
On September 29 of last year, Gavalas drove toward the Miami airport armed with knives and tactical gear, operating under what the suit claims were Gemini's instructions. When Gavalas expressed fear of dying, the chatbot allegedly told him, "You are not choosing to die. You are choosing to arrive." His father found him dead days later. (Source: CNBC)
Google responded by rolling out redesigned crisis safeguards for Gemini. When the chatbot detects signs of suicide or self-harm, a "Help is available" interface now offers one-click access to crisis hotlines and remains visible for the rest of the conversation. Gemini has also been trained not to validate harmful delusions and to nudge users toward professional help.
Google also committed $30 million over three years to expanding the capacity of crisis hotlines worldwide, plus $4 million to the AI training platform ReflexAI. (Source: TechXplore)
A Google spokesperson said Gemini is designed not to encourage violence or self-harm and that the company "devotes significant resources" to improving safety. The Journal's analysis, however, found that while Gemini sometimes intervened to steer Gavalas back to reality, he was consistently able to redirect the bot back into the fiction.
The lawsuit is the first wrongful death case brought against Google over its chatbot. (Source: The Wall Street Journal)