Your Friday night stroll through the French Quarter just became data for a facial recognition algorithm. For two years, New Orleans transformed into America’s most surveilled city without telling you about it.
Project NOLA, a nonprofit you’ve probably never heard of, quietly installed over 200 AI-powered cameras that scan your face every time you walk past. The system compares your features against a database of 30,000 individuals, sending instant alerts to police when it thinks it recognizes someone. Unlike your iPhone’s Face ID, which keeps your face data locked on your own device, these cameras treat your biometric data like public property.
The Tech That Turns Every Street Into a Police Station
The cameras aren’t your typical security setup. These sophisticated systems can identify you from 700 feet away—even in poor lighting conditions that would stump your phone’s camera.
When the AI spots a match, officers receive a mobile alert faster than your DoorDash notification arrives. Your location, identity, and a confidence score appear on their screens within minutes.
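If you’re curious what that matching step actually looks like in software, here is a minimal, hypothetical sketch: a camera frame gets boiled down to a numeric “embedding” of the face, compared against stored embeddings for everyone on the watchlist, and an alert fires when the similarity clears a confidence threshold. This is not Project NOLA’s actual code; the names, the 0.90 cutoff, and the random data are illustrative assumptions.

```python
# Hypothetical sketch of watchlist matching -- not Project NOLA's actual system.
# Assumes faces have already been converted to fixed-length embeddings by a separate model.
import numpy as np

WATCHLIST = {                       # illustrative stand-in for ~30,000 stored profiles
    "person_0001": np.random.rand(128),
    "person_0002": np.random.rand(128),
}
ALERT_THRESHOLD = 0.90              # assumed confidence cutoff; real systems tune this carefully

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Similarity between two face embeddings; 1.0 means identical direction."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def check_frame(face_embedding: np.ndarray) -> None:
    """Compare one detected face against every watchlist entry and 'alert' on a match."""
    for person_id, stored in WATCHLIST.items():
        score = cosine_similarity(face_embedding, stored)
        if score >= ALERT_THRESHOLD:
            # A real deployment would push this to officers' phones along with location data.
            print(f"ALERT: possible match {person_id}, confidence {score:.2f}")

check_frame(np.random.rand(128))    # simulate one face captured by a street camera
```

The unsettling part isn’t the math, which is ordinary, but the scale: run that loop on every face, on every frame, on 200 cameras, forever.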
Project NOLA’s founder Bryan Lagarde, a former cop turned tech entrepreneur, assembled the watchlist from police mugshots and arrest records. The system essentially turns every camera-equipped business into an extension of the police surveillance network.
The technology goes beyond simple identification. Upload any photo, and Project NOLA can trace that person’s movements across the entire city for the past 30 days. Your coffee shop visits, restaurant stops, and evening walks become a digital breadcrumb trail that makes Google’s location tracking look amateur; the next logical step, it seems, is something like China’s autonomous spherical police robot rolling down Bourbon Street.
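Conceptually, that 30-day trail is just a query over stored camera matches. The sketch below is purely illustrative and assumes a log format, camera names, and retention window that I’ve invented for the example; it is not how Project NOLA’s database is actually built.

```python
# Hypothetical sketch of a 30-day "movement trail" query -- illustrative only.
from datetime import datetime, timedelta

now = datetime.now()

# Assumed log format: one row per watchlist match a camera records, kept for 30 days.
detections = [
    {"person_id": "person_0001", "camera": "Bourbon & St. Peter",  "time": now - timedelta(days=12, hours=3)},
    {"person_id": "person_0001", "camera": "Canal & Royal",        "time": now - timedelta(days=2, hours=9)},
    {"person_id": "person_0002", "camera": "Frenchmen & Chartres", "time": now - timedelta(days=1)},
]

def movement_trail(person_id: str, days: int = 30) -> list[dict]:
    """Return every camera sighting of one person within the last `days`, oldest first."""
    cutoff = now - timedelta(days=days)
    hits = [d for d in detections if d["person_id"] == person_id and d["time"] >= cutoff]
    return sorted(hits, key=lambda d: d["time"])

for sighting in movement_trail("person_0001"):
    print(sighting["time"], sighting["camera"])
```

A few dozen lines like these, pointed at a citywide camera network, are all it takes to turn “where has this person been?” into a one-click answer.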
When Private Surveillance Meets Public Policing
Your rights got lost in a loophole designed by lawyers who watch too much CSI. New Orleans banned facial recognition in 2020, then relaxed the ban in 2022 to allow its use only in violent crime investigations.
The catch? Police were supposed to get human oversight and report every use to the city council.
Instead, they received automated alerts from Project NOLA’s private network. No oversight. No reporting. No accountability.
Police reports from the 34 arrests tied to the system frequently omitted any mention of facial recognition technology. Defendants never knew their arrests stemmed from AI surveillance, denying them the chance to challenge the evidence in court. “By adopting this system – in secret, without safeguards, and at tremendous threat to our privacy and security – the City of New Orleans has crossed a thick red line,” says Nathan Freed Wessler of the American Civil Liberties Union.
The Reality Check That Privacy Advocates Feared
Facial recognition errors aren’t theoretical—they’re documented disasters waiting to happen again. Randall Reid spent six days in jail after Louisiana deputies used Clearview AI to wrongly identify him in surveillance footage from a state he’d never visited.
Robert Williams was wrongfully arrested in Detroit after facial recognition falsely matched him to a shoplifting he didn’t commit.
This unchecked expansion of surveillance technology represents a growing threat to privacy and democracy. Your digital anonymity in public spaces just evaporated without your consent. New Orleans Police Chief Anne Kirkpatrick finally suspended the system in April after media scrutiny intensified, but Project NOLA staff still receive alerts and can relay information to officers by phone or text.
The question isn’t whether this technology works—it’s whether you want to live in a city where every face becomes a potential suspect.