TALLAHASSEE, Fla. (WCTV) – Editor’s note: This article contains mentions of suicide and school shootings. Readers are advised to continue with care.
Newly released messages from accused FSU shooter Phoenix Ikner to ChatGPT are providing insight into last year’s shooting.
For nearly a year, WCTV has heard from grieving students, faculty members and families who want to know why someone would open fire on campus.
While we expect to learn more about the allegations at trial, the announcement of a planned lawsuit from a shooting victim against ChatGPT is shedding more light onto the suspect’s mindset in the months and hours leading up to the shooting.
After multiple editorial discussions, our team felt that sharing these allegations was an important step in answering the painful questions that so many in our community have.
WCTV has obtained the chat logs between accused shooter Phoenix Ikner and ChatGPT from the State Attorney’s Office.
Our team is still sorting through hundreds of messages, but at first look, they are largely typical for a college student: many show him asking for homework help or relationship advice.
In the hours and days before the shooting, things take a dark turn.
The chat log shows that on the morning of the shooting, Ikner asked questions about self-worth and not feeling respected, and expressed suicidal thoughts. The conversation then turns to practical questions about firearms and how mass shootings are covered in the media.
Just a few hours before the shooting on April 17, 2025, Ikner asks ChatGPT what happened to other mass shooters and if Florida has a maximum security prison.
He also asks when the FSU student union is the busiest, and if most “school shooters” are convicted.
ChatGPT appears to give factual answers, including telling Ikner that the union is busiest during the lunch hour, specifically between 11:30 a.m. and 1:30 p.m.
Police say the shooting occurred in that window, just before noon on April 17.
Chat logs indicate Ikner asked the bot how to take the safety off of a shotgun three minutes before he began firing. The chat bot answered, giving a detailed description of how to make the shotgun operable.
“Let me know if you’ve got a different model and I’ll tailor the answer,” the chatbot wrote.
After that, the chat goes silent. Comparing the chat logs to the official police timeline, less than three minutes passed between ChatGPT telling the shooter how to arm the weapon and the first victim being shot.
Within three minutes of the shooting starting, police shot Ikner in the jaw. He’s remained in jail ever since and faces the death penalty. His trial is scheduled for October, but that date could slide after the original trial judge was promoted to an appellate position.
The pending lawsuit against OpenAI, the creator of ChatGPT, and these messages raise questions about how people interact with AI.
“Our hearts go out to everyone affected by this devastating tragedy. After learning of the incident in late April 2025, we identified a ChatGPT account believed to be associated with the suspect, proactively shared this information with law enforcement and cooperated with authorities,” a company spokesperson said. “We build ChatGPT to understand people’s intent and respond in a safe and appropriate way, and we continue improving our technology.”
At least once in the year-long chat, the bot does tell the shooter about 988, the suicide prevention hotline. But the logs give no indication that the bot confronted Ikner about his suicidal thoughts, his questions about the campus, or his questions about how to use both a handgun and a shotgun.
“Yes — Florida definitely has maximum security prisons, and quite a few of them,” the bot wrote in response to a question the shooter asked just under three hours before the shooting.
Florida Congressman Jimmy Patronis is pushing a bill he says will hold Big Tech accountable, called the PROTECT Act. The bill’s text was made public before these ChatGPT logs came to light.
Patronis said the bill would strip tech companies of immunity conferred to them decades ago, meant to allow technology to grow. After learning that Ikner used ChatGPT, Patronis is now seeking co-sponsors for the bill.
He said the bill would allow states to regulate AI, including the algorithms, if they so chose.
“They create these apps on purpose to be sticky. It’s like a digital fentanyl. The kids can’t let go of it. And the push notifications come out at two o’clock in the morning. And what do they do? They stay up for another two hours glued to these devices because it generates ad revenue,” he said.
If you or someone you know is struggling with mental health, call 988 to reach the Suicide and Crisis Lifeline. You can also text or chat with counselors on their website. For more local resources, contact 211. For those in the Big Bend, 211 Big Bend has resources to find a mental health professional near you.
Copyright 2026 WCTV. All rights reserved.