Students at Lawton Chiles Middle School in Florida were sent scrambling into lockdown last week after an AI surveillance system issued an alert claiming a student was carrying a gun.
There’s just one issue: the “gun” was actually a band student’s clarinet.
The whole thing unfolded quickly, according to local reporting. When the alert went out, it triggered an automatic “code red,” giving administrators no choice but to act on the AI system’s decision.
Luckily nobody was hurt, and local police soon declared the lockdown over. “The code red was a precaution and the children were never in any danger,” local police wrote in a Facebook post.
In a message to parents, school Principal Dr. Melissa Laudani said the district has “multiple layers of school safety, including an automated system that detects potential threats. A student was walking in the hallway, holding a musical instrument as if it were a weapon, which triggered the code red to activate.” (It’s not known what exactly constitutes a “code red,” as it isn’t mentioned in the school’s latest Parent Student handbook.)
Rather than blame the faulty AI system for the commotion — without which the fiasco never would have happened in the first place — the school blamed the young clarinetist.
“While there was no threat to campus, I’d like to ask you to speak with your student about the dangers of pretending to have a weapon on a school campus,” Laudani wrote.
It’s not known which particular system Lawton Chiles has keeping watch, but the fact that it couldn’t differentiate between a clarinet with 17 keys and a rifle with none is concerning.
The Lawton Chiles incident comes soon after a similar case in Baltimore, in which AI led to the violent detention of a 16-year-old by at least eight officers with guns drawn. In that case, the school’s AI had somehow mistaken a small bag of Doritos for a handgun, prompting a heavily armed response from city police.
Like the story from Lawton Chiles, the Baltimore false positive could easily have been prevented had a human been in the loop. Instead, it seems both systems allowed the AI to make the call without any human review.