Teenager detained at gunpoint by US cops because “AI” mistook a chips bag for a gun

If you’re eating a bag of chips in an area where “AI” software is being used to monitor people’s behaviour, you might want to reconsider. A high school kid in the US was hanging out with his friends when, all of a sudden, he was swarmed by police officers with guns drawn. Held at gunpoint, he was told to lie down, after which he was detained. Obviously, this is a rather unpleasant experience, to say the least, especially considering the kid in question is a person of colour. In the US.

Anyway, the “AI” software used by the police department to monitor citizens’ behaviour mistook an empty chips bag in his pocket for a gun. US police officers, who receive only a few weeks of training, didn’t question what the computer told them and pointed their guns at a teenager.

In a statement, Omnilert expressed regret over the incident, acknowledging that the image “closely resembled a gun being held.” The company called it a “false positive,” but defended the system’s response, stating it “functioned as intended: to prioritize safety and awareness through rapid human verification.”

↫ Alexa Dikos and Rebecca Pryor at FOX45 News

I’ve been warning that the implementation of “AI” was going to lead to people dying, and while this poor kid got lucky this time, you know it’s only a matter of time before people start getting shot by US police because they’re too stupid to question their computer overlords. Add in the fact that “AI” is well-known to be deeply racist, and we have a very deadly cocktail of failures.
