Angela Lipps was babysitting her grandchildren on July 14, 2025, when U.S. Marshals burst in with guns drawn. The 50-year-old grandmother spent the next six months behind bars—first in Tennessee, then North Dakota—charged with bank fraud she didn’t commit. Her crime? Looking similar enough to the actual perpetrator to fool Fargo Police Department’s facial recognition software.
The Algorithm’s Fatal Flaw
Police trusted computer matching over basic investigative work.
Fargo police were hunting someone who had used fake Army credentials to steal tens of thousands of dollars from local banks in spring 2025. They fed surveillance footage through facial recognition software, which flagged Lipps. Detectives confirmed the match against her social media photos and driver’s license, comparing facial features, body type, and hairstyle. But they never called Tennessee to verify her whereabouts—a simple step that would have revealed she was 1,200 miles away during the crimes.
The Evidence They Ignored
Bank records told a completely different story than the algorithm.
Defense attorney Jay Greenwood obtained Lipps’ financial records showing Social Security deposits, gas station purchases, and Uber Eats orders—all timestamped in Tennessee on the dates of the Fargo fraud. “If the only thing you have is facial recognition, I might want to dig a little deeper,” Greenwood said. “That investigative process could’ve been done long before I was involved.” Instead, Lipps wasn’t even interviewed until December 19, more than five months after her arrest.
The Human Cost of Digital Shortcuts
Lost everything while proving her innocence from jail.
Charges were finally dismissed on Christmas Eve 2025, but the damage was done. Lipps lost her home, car, and dog to unpaid bills during her incarceration. Local attorneys funded her hotel stay after release—she’d been arrested in summer clothes and freed into North Dakota snow. The F5 Project, a Fargo reentry organization, drove her to Chicago for family pickup. Fargo Police Chief David Zibolski declined interviews before his March 2026 retirement.
This marks the eighth documented wrongful arrest via facial recognition in the U.S. Your bank’s security cameras are watching, and algorithms are deciding who looks suspicious. Without human verification protecting the process, any of us could be next.