Innocent woman jailed after being misidentified using AI facial recognition
rectang
524 points
279 comments
March 12, 2026
Related Discussions
Found 5 related stories in 54.0ms across 3,471 title embeddings via pgvector HNSW
- Tennessee grandmother jailed after AI face recognition error links her to fraud danso · 96 pts · March 13, 2026 · 75% similar
- Police used AI facial recognition to wrongly arrest TN woman for crimes in ND ourmandave · 383 pts · March 29, 2026 · 73% similar
- AI Gets Wrong Woman Jailed for Six Months, Life Ruined vaxman · 72 pts · March 14, 2026 · 54% similar
- Elon Musk's xAI sued for turning three girls' real photos into AI CSAM nobody9999 · 19 pts · March 16, 2026 · 43% similar
- AI Error May Have Contributed to Girl's School Bombing in Iran apolloartemis · 61 pts · March 07, 2026 · 43% similar
Discussion Highlights (20 comments)
rectang
> facial recognition showed she was the main suspect in what Fargo police called an organized bank fraud case.
> Her bank records showed she was more than 1,200 miles away, at home in Tennessee at the same time police claimed she was in Fargo committing fraud.
> Unable to pay her bills from jail, she lost her home, her car and even her dog
Jtsummers
https://archive.is/yCaVV - Archive link to get around the paywall.
https://www.theguardian.com/us-news/2026/mar/12/tennessee-gr... - Another article on this without a paywall.
It's annoying that both articles are calling this an AI error. This was human error: the police did the wrong thing, and the people of Fargo will end up paying for this fuckup.
mitchbob
https://archive.ph/2026.03.12-183903/https://www.grandforksh...
jauer
AI or not, it's unconscionable that victims of compulsory legal processes by way of mistaken identity are not made whole.
neaden
I hate this headline (not blaming submitter). Police incompetence and negligence jailed her for months and left her stranded in a North Dakota winter. The AI is no more responsible than the cars and airplanes they used. Edit: this is in reference to the original headline "AI error jails innocent grandmother for months in North Dakota fraud case" not the revised title that it was changed to.
tony_cannistra
Completely infuriating, but more of a commentary on the sad state of incompetent power-hungry law enforcement with tools they don't know how to use than the tools themselves. Though, the question remains: are the tools built in such a way as to deceive the user into a false sense of trust or certainty? _Some_ of the blame lies on the UX here. It must.
rpcope1
There's no way this isn't a slam dunk case to sue the piss out of the Fargo Police, probably the US Marshals and maybe other orgs. The woman in the surveillance photo clearly looks way younger, among the many other obvious signs this woman didn't do it. I hope she wrings at least several million dollars out of the government.
quickthrowman
It’s obvious from the one photo they posted of the actual suspect that the lady they arrested is about 20-30 years older than the woman in the bank photo. The woman in the photo is maybe 25-30 years old, this grandma looks like she’s 65-70 (actual age of 50). Absolutely ridiculous, I hope she wins her civil case.
Aardwolf
This reminds me of the British Post Office Scandal: https://en.wikipedia.org/wiki/British_Post_Office_scandal
anigbrowl
It is an AI error, but also an error on the part of the cops, the prosecutors, the judge, and the county sheriff (who is responsible for the jail inmates). I hope everyone involved in this travesty is sued into oblivion and unable to hide behind their immunity defenses. Facial recognition should never be the sole basis for a warrant.
jchama
The movie "Brazil" was right!
hsbauauvhabzb
Why the fuck does a newspaper need a ‘notifications’ icon in the top right hand corner?
holman
Me: Whoa, cool, my hometown is atop Hacker News! Also me, reading further: Uh-oh. The chief of police also resigned today; wouldn't be shocked if this was part of the reasoning.
causal
Wait - what was the AI tool and how did it have her face to begin with? If small-town police are doing face-matching searches across national databases then nobody is safe because the number of false positives is going to be MASSIVE by sheer number of people being searched every day. Pretend the tool is 99.999999% specific. If it searches every face in the USA you're still getting about 3 false positives PER SEARCH. You will never have a criminal AI tool safe enough to apply at a national scale.
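The base-rate arithmetic in this comment can be sketched quickly (assuming the commenter's hypothetical 99.999999% specificity and a US population of roughly 330 million):

```python
# Back-of-envelope false-positive math for a nationwide face search.
# Both numbers are assumptions from the comment above, not real figures.

specificity = 0.99999999            # hypothetical: 8 nines of specificity
false_positive_rate = 1 - specificity
population = 330_000_000            # approximate US population

expected_false_positives = false_positive_rate * population
print(f"Expected false positives per search: {expected_false_positives:.1f}")
# About 3.3 wrong matches for every single search, even at this
# implausibly high accuracy.
```

Even one false positive per search would be catastrophic at the scale of thousands of searches per day, which is the commenter's point.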
whack
> According to the court documents, the Fargo detective working the case then looked at Lipps' social media accounts and Tennessee driver's license photo. In his charging document, the detective wrote that Lipps appeared to be the suspect based on facial features, body type and hairstyle and color.
> Once they were in hand, Fargo police met with him and Lipps at the Cass County jail on Dec. 19. She had already been in jail for more than five months. It was the first time police interviewed her.
How is this the fault of AI? It flagged a possible match. A live human detective confirmed it. And the criminal justice system, for reasons that have nothing to do with AI, let this woman sit in jail for five months before even interviewing her or doing any due diligence. There's a reason why we don't let AI autonomously jail people. Instead of scapegoating an AI bogeyman, maybe we should look at the professional human-in-the-loop who shirked all responsibility, and at a criminal justice system that thinks it is okay to jail people for five months before even starting to assess their guilt.
RobRivera
> Unable to pay her bills from jail, she lost her home, her car and even her dog. Fargo police say the bank fraud case is still under investigation and no arrests have been made.
I smell a lawsuit
api
It's not an AI error. It's a human error in misusing AI this way. Saying it's an AI error is like saying a hole in your drywall is a hammer error. Unfortunately we'll probably see a trend of people using AI and then blaming AI in cases where they misused it in roles it's not good for, or failed to review or monitor it.
zingar
“Computers don’t argue” seemed charmingly wrong about how computers work until a few short years ago. https://nob.cs.ucdavis.edu/classes/ecs153-2019-04/readings/c...
bethekidyouwant
I read the article and I don't really understand: she was held in a jail in Tennessee, but the article states they flew her to North Dakota? And somehow she's a fugitive, so that's why she doesn't get bail? But she's a fugitive held in her own state in a holding facility? And then when they release her, she's in North Dakota? So if some state says you're a fugitive, your home state will just hold you in jail until they come and put you on an airplane? Is that correct?
jmyeet
We are rapidly becoming a world where every person is one inscrutable LLM decision from having their life ruined with no recourse. This type of incident isn't new and is only going to get worse. The problem is our governments are doing absolutely nothing about it. I'll give two examples:
1. Hertz implemented a system that falsely reported cars as stolen. People were arrested and went to jail over rental cars that were sitting in the Hertz lot. Hertz ultimately had to pay $168 million in a settlement [1]. That's insufficient. If I, as an ordinary citizen, make a false police report that somebody stole my car, I can be criminally charged. And rightly so. People should go to jail for this, and it will continue until they do. These fines and settlements are just the cost of doing business; and
2. The UK government contracted Fujitsu to produce a new system for its post offices. That system was allowed to produce criminal charges for fraud that were completely false. People committed suicide over this. It went on for what, a decade or more? It eventually resulted in a parliamentary inquiry and settlements, and is known as the British Post Office scandal [2]. Again, people should go to jail for this.
The choice we as a society face is whether to have automation improve all of our lives by raising everyone's standard of living and allowing us to do less work and less menial work, or to allow automation to further suppress wages so the Epstein class can be slightly more wealthy.
[1]: https://www.npr.org/2022/12/06/1140998674/hertz-false-accusa...
[2]: https://en.wikipedia.org/wiki/British_Post_Office_scandal