The lawsuit comes after state authorities disclosed that ChatGPT gave the shooter information about the time and location that would maximize victims on campus, as well as the type of gun and ammunition to use. Authorities say he was also told that an attack could get more media attention if children were involved.
“OpenAI knew this would happen. It’s happened before and it was only a matter of time before it happened again,” Vandana Joshi, whose husband Tiru Chabba was one of two people killed, said in a statement Monday.
OpenAI denied any wrongdoing in “this terrible crime.”
“In this case, ChatGPT provided factual responses to questions with information that could be found broadly across public sources on the internet, and it did not encourage or promote illegal or harmful activity,” Drew Pusateri, a spokesman for the company, said in an email to The Associated Press.
Six people were also wounded in the April 2025 shooting in Tallahassee, when the alleged gunman, Phoenix Ikner, walked in and out of campus buildings and green spaces while firing a handgun. It took place on a weekday just before lunchtime near the school's Student Union, which has food and shops. The lawsuit says Ikner, a Florida State student, asked ChatGPT about the busiest times there.
The suit, filed Sunday in federal court, says OpenAI should have built ChatGPT with guardrails to alert someone that police may need to investigate “to prevent a specific plan for imminent harm to the public.”