The Daily BS • Bo Snerdley Cuts Through It!



Killer chatbot? Florida targets ChatGPT as potential accomplice in FSU massacre

Florida is taking a hard look at whether a chatbot should be treated less like a tool — and more like a potential accomplice.

James Uthmeier, the Sunshine State’s top lawman, has launched a criminal investigation into OpenAI over its wildly popular chatbot, ChatGPT, and over whether it played any role in last year’s horrific mass shooting at Florida State University.

The April 17, 2025, attack left two people dead and six wounded, another grim entry in America’s long list of campus tragedies. But what’s making this case explosive is what allegedly happened before the first shot was fired.

State investigators have been combing through chat logs between the suspect, Phoenix Ikner, and the AI program. According to Uthmeier, those conversations may have gone far beyond harmless Q&A.

The allegation? That the chatbot advised the gunman on weapons, ammunition — even timing and location to maximize the number of victims.

“If this were a person on the other end of the screen, we would be charging them with murder,” Uthmeier said flatly. “Just because this is a chatbot, an AI, does not mean that there is not criminal culpability. So, we’re going to look at who knew what, designed what or should have done more.” That’s not just tough talk — it’s a warning shot aimed straight at Silicon Valley.

Florida law allows prosecutors to charge anyone who “aids, abets or counsels” a crime as if they pulled the trigger themselves. Now the question is whether lines of code — written by humans — can cross that same legal threshold.

To get answers, the state’s Office of Statewide Prosecution has subpoenaed OpenAI for a trove of internal documents: safety protocols, training data, and records detailing how the company handles threats and works with law enforcement. Florida wants to know exactly what Big Tech knew — and when.

Predictably, OpenAI is pushing back hard.

“Last year’s mass shooting at Florida State University was a tragedy, but ChatGPT is not responsible for this terrible crime,” spokesperson Kate Waters said, adding that the company proactively flagged the suspect’s account to authorities after the fact.

Waters insists the bot merely provided “factual responses” readily available across the internet and “did not encourage or promote illegal or harmful activity.” She also emphasized that ChatGPT is a “general-purpose tool used by hundreds of millions of people every day for legitimate purposes.”

That’s the standard Silicon Valley defense: blame the user, not the tool. But critics, and increasingly voices in law enforcement, aren’t buying the shrug. Mark Glass put it bluntly: “Artificial intelligence is built by man. Man is fallible. Man makes mistakes.” If humans build it, humans are responsible for it.

And here’s where things get uncomfortable for the tech giants. Across the country — and the world — lawmakers are scrambling to figure out how to regulate AI before it outruns accountability. From deepfakes to cybercrime, the risks aren’t theoretical anymore.

Meanwhile, Ikner, 20, has been charged with two counts of first-degree murder and seven counts of attempted murder after allegedly opening fire with weapons stolen from his parents’ home. He was shot and wounded by police at the scene. The criminal case against him is straightforward. The case against AI? Not so much.

Still, Florida is clearly signaling it won’t sit back and let tech companies hide behind algorithms while society deals with the fallout. The state has already moved to stiffen penalties around AI-driven crimes, including disturbing new laws targeting AI-generated child exploitation material.

Now it’s asking a question that would’ve sounded like science fiction just a few years ago: When a machine gives deadly advice, who pays the price?

Silicon Valley might want to come up with a better answer — fast.
