If you’re eating a bag of chips in an area where “AI” software is being used to monitor people’s behaviour, you might want to reconsider. Some high school kid in the US was hanging out with his friends, when all of a sudden, he was swarmed by police officers with guns drawn. Held at gunpoint, he was told to lie down, after which he was detained. Obviously, this is a rather unpleasant experience, to say the least, especially considering the kid in question is a person of colour. In the US.
Anyway, the “AI” software used by the police department to monitor citizens’ behaviour mistook an empty chips bag in his pocket for a gun. US police officers, who receive only a few weeks of training, didn’t question what the computer told them and pointed guns at a teenager.
In a statement, Omnilert expressed regret over the incident, acknowledging that the image “closely resembled a gun being held.” The company called it a “false positive,” but defended the system’s response, stating it “functioned as intended: to prioritize safety and awareness through rapid human verification.”
↫ Alexa Dikos and Rebecca Pryor at FOX45 News
I’ve been warning that the implementation of “AI” was going to lead to people dying, and while this poor kid got lucky this time, you know it’s only a matter of time before people start getting shot by US police because they’re too stupid to question their computer overlords. Add in the fact that “AI” is well-known to be deeply racist, and we have a very deadly cocktail of failures.

“rapid human verification” is a fun little euphemism for racial profiling and frivolous arrest.
Unfortunately, without seeing the evidence, none of us are in a position to judge it, or the AI for that matter. The humans who reviewed it say it did look like a gun. I don’t want to take humans with a conflict of interest at their word; however, nothing reported seems to indicate the AI didn’t do its job correctly. It alerted humans, who looked at the same images and then called police.
I get that this is newsworthy because AI is attached; however, humans make these same mistakes all of the time.
“Paranoid Cop Mistakes Gas Pump For a Pistol”
https://www.youtube.com/watch?v=O5JwWf_Q5qA
Without seeing the data, we need to be open to the possibility that the AI actually performs better than humans. In hyperbole speak: more AI could save lives. I have to concede that I’m not privy to the data and don’t necessarily trust insiders behind closed doors at face value. It’s not right to ask the public to blindly trust technology. In principle, though, I think it’s important to approach these things scientifically and not to use animus against AI to assert a predetermined conclusion.
I also think the story lacks a lot of context about the state of affairs with guns in US schools. Parents are desperate to protect their kids at school, and we know for a fact that, averaged over a 180-day school year, there are 1–2 shootings per day. This is the reality.
https://echomovement.org/school-shootings/
Even though I am a parent and this incident is scary, it’s really hard to agree that we shouldn’t use technology to look for guns in schools. I think training police to come out with guns blazing is often not appropriate, but it can be hard to judge what’s right to do at any given moment without the benefit of hindsight. These are all serious problems, and I don’t think anyone has a complete solution.
The police in this country (that’s the freedom-loving USA) don’t need much of a reason to point a gun at kids. This happens all the time; those kids are just usually brown, so it doesn’t make the news. ‘merica!
CaptainN-,
I think the AI is right to alert for a possible gun. It’s humans who have discretion over what to do next. Sometimes the presence of police in and of itself endangers people more than anything likely to happen in their absence. Clearly, in this scenario it would have been a better outcome for a school resource officer to investigate the situation before bringing the police in. However, the crux of the problem is that we don’t know which scenario we are in until after the fact. That’s what makes it so hard to prescribe a formulaic fix.
I think it could make sense for the police to take a strictly “backup” role assisting the school resource officer, rather than committing the situation to police tactics right away. This could be a better fit for most scenarios, but I realize things are messy in reality and that the sensible course of action doesn’t always prevail.