
Tennessee grandmother wrongly jailed for six months, latest victim of AI-driven misidentification — facial recognition is jailing the wrong people, but police keep using it anyway

Why This Matters

This case highlights the critical risks of relying on facial recognition technology in law enforcement, especially its potential to cause wrongful arrests and harm innocent individuals. It underscores the urgent need for stricter regulations, better oversight, and more accurate investigative methods to protect civil liberties and prevent similar injustices in the future.

Key Takeaways

A Tennessee grandmother spent nearly six months in jail after police in Fargo, North Dakota, used facial recognition software to identify her as the primary suspect in a bank fraud case, according to reporting by WDAY News.

The problem was that Angela Lipps, 50, had never been to North Dakota, and bank records confirmed she was more than 1,200 miles away at the time of the alleged crimes. Her case is the latest in a documented pattern of wrongful arrests driven by facial recognition technology deployed without adequate investigative follow-up.

Fargo police were investigating a series of bank fraud incidents in April and May last year, in which a woman used a fake U.S. Army ID to withdraw tens of thousands of dollars. Detectives ran surveillance footage through facial recognition software, which returned a match to Lipps. A detective then compared her Tennessee driver's license and social media images to the suspect and concluded that she was the perpetrator based on facial features, body type, and hair. Nobody from the department contacted Lipps before U.S. Marshals arrested her at gunpoint on July 14 while she was babysitting four children.


Lipps sat in a Tennessee county jail for 108 days before North Dakota officers collected her. Her attorney, Jay Greenwood, immediately requested her bank records, and when Fargo police finally met with Greenwood and Lipps on December 19, five months after her arrest, the records showed she had been buying cigarettes and depositing Social Security checks in Tennessee at the time police placed her in Fargo. The case was dismissed on Christmas Eve, but the damage had already been done; she had no money, no coat, and no way home, and subsequently lost her house, her car, and her dog.

It's not unusual

Shockingly, this is just the latest in a series of structural failures that have led to innocent people being prosecuted for crimes they didn't commit. A January 2025 Washington Post investigation documented at least eight instances of Americans wrongfully arrested after police found a possible facial recognition (FRT) match. In every case, investigators skipped fundamental steps, such as checking alibis and comparing physical descriptions, that would have cleared the suspect before arrest.

The facial recognition vendors themselves, such as Clearview AI, even attach explicit caveats to their systems. Clearview requires agencies to acknowledge that results "are indicative and not definitive" and that officers must conduct further research before acting on them. According to an April 2024 ACLU submission to the U.S. Commission on Civil Rights, in at least five of seven wrongful arrest cases, police had received explicit warnings that FRT results don’t constitute probable cause but made arrests anyway.

Robert Williams, whose 2020 wrongful arrest in Detroit was the first publicly reported FRT false-positive case, reached a landmark settlement with the city in June 2024 that now requires independent corroborating evidence before any FRT match can be used to seek an arrest warrant. However, only 15 states had enacted any FRT legislation covering law enforcement at the start of 2025, and North Dakota is not among them.

