Tennessee Grandmother Jailed After AI Facial Recognition Mistake in Fraud Case

Angela Lipps, a 50-year-old grandmother from Tennessee, endured nearly six months of incarceration after an artificial intelligence facial recognition system incorrectly identified her as a suspect in a North Dakota bank fraud investigation. According to reports from InForum, a news outlet in southeastern North Dakota, Fargo police used the technology to link Lipps to organized bank fraud, despite her insistence that she had never visited the state or committed the crimes.

Arrest and Extradition Process

In July, U.S. marshals arrested Lipps at her Tennessee home while she was babysitting four children. She described being taken away at gunpoint and booked into a county jail as a fugitive from justice from North Dakota. Lipps, a mother of three and grandmother of five who has lived most of her life in north central Tennessee, had never been on an airplane until authorities flew her to North Dakota last year to face charges.

She remained in a Tennessee jail for nearly four months without bail while awaiting extradition, charged with four counts of unauthorized use of personal identifying information and four counts of theft. Fargo police records obtained by WDAY News indicate that detectives reviewed surveillance video from April and May 2025 showing a woman using a fake U.S. Army military ID to withdraw tens of thousands of dollars. Officers allegedly used facial recognition software to match Lipps based on facial features, body type, and hairstyle.

Release and Aftermath

Authorities did not transport Lipps from Tennessee until the end of October, 108 days after her arrest. She appeared in a North Dakota courtroom the next day. Her attorney, Jay Greenwood, emphasized the need for deeper investigation, stating, "If the only thing you have is facial recognition, I might want to dig a little deeper." Lipps was released on Christmas Eve after Greenwood presented bank records proving she was over 1,200 miles away in Tennessee at the time of the alleged fraud in Fargo.

However, Fargo police did not cover her trip home, leaving her stranded. Local defense attorneys and the nonprofit F5 Project assisted with hotel accommodations, food, and her return to Tennessee. Lipps is now back home but reports lasting consequences, including the loss of her home, car, and dog because of bills that went unpaid while she was jailed. She also noted that no one from the Fargo Police Department has apologized for the ordeal.

Broader Context of AI Errors

This incident is not isolated. In October, an AI system in Baltimore mistakenly identified a high school student's bag of Doritos as a firearm, prompting a police response in which the student was handcuffed and searched. Earlier this year, a man in the UK was arrested for a burglary in a city he had never visited after facial recognition software confused him with another person of South Asian heritage, matching him to footage of a suspect 100 miles away.

These cases underscore ongoing concerns about the reliability and ethical implications of AI technologies in law enforcement, prompting calls for stricter oversight and verification processes to prevent similar wrongful arrests in the future.