Essex Police Pauses Facial Recognition Over Racial Bias Concerns

Cambridge study finds bias in facial recognition system that scanned 1.3 million faces for Essex Police

By C. da Costa
Image: PA

Key Takeaways

  • Cambridge study finds Essex Police facial recognition system identifies Black people and men more reliably than other groups
  • National Physical Laboratory contradicts Cambridge findings using different testing methodology and standards
  • Essex Police pauses program despite 48 arrests from 1.3 million face scans

While Cambridge researchers flagged algorithmic bias, government labs found none—leaving Essex Police’s facial recognition program stuck between contradictory studies and public scrutiny.

The numbers tell a stark story. From August 2024 to February 2025, Essex Police scanned roughly 1.3 million faces using Corsight’s Apollo 4 software, leading to 48 arrests, roughly one arrest per 27,000 faces scanned. Then Cambridge University dropped a bombshell: its study found the system was statistically more likely to correctly identify Black people and men than other demographic groups, raising fairness questions that forced a pause.
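The arrest-rate figure above is simple division; a quick sketch (assuming the 1.3 million count is total scans, as reported) confirms it:

```python
# Back-of-the-envelope check of the arrest rate reported above.
# Assumes 1.3 million is the total number of faces scanned.
faces_scanned = 1_300_000
arrests = 48

scans_per_arrest = faces_scanned / arrests
print(f"One arrest per {scans_per_arrest:,.0f} faces scanned")
# → One arrest per 27,083 faces scanned
```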

Technical Accuracy vs. Algorithmic Equity

Same technology, different conclusions expose how bias detection in AI systems remains frustratingly complex.

Here’s where it gets messy. The National Physical Laboratory conducted its own evaluation using ISO standards and found no statistically significant gender or ethnicity bias. The system achieved an 89% true positive identification rate with only a 0.017% false positive rate against an 18,000-image watchlist. Same technology, two very different verdicts.
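To put NPL’s percentages in context, the sketch below converts the reported rates into an expected count over the deployment. It assumes, purely for illustration, that the 0.017% false positive rate applies per face scanned against the watchlist; the article does not specify the exact denominator NPL used.

```python
# Illustrative only: convert NPL's reported rates into expected counts,
# assuming the false positive rate applies per face scanned (the article
# does not state the exact denominator NPL used).
true_positive_rate = 0.89        # 89% true positive identification rate
false_positive_rate = 0.00017    # 0.017%
faces_scanned = 1_300_000        # scans over the Aug 2024 - Feb 2025 deployment

expected_false_alerts = faces_scanned * false_positive_rate
print(f"Expected false alerts: ~{expected_false_alerts:,.0f}")
# → Expected false alerts: ~221
```

Even a tiny percentage adds up at this scale, which is why the ICO’s call for routine bias testing carries weight.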

Essex Police commissioned both studies, and the conflicting results expose the challenge facing law enforcement nationwide. You can’t simply flip a switch and declare an algorithm “fair.” The Cambridge study tested real deployments with nearly 200 participants, while NPL used standardized laboratory conditions. Both methods have merit, but they’re measuring different aspects of the same problem.

Surveillance Expansion Marches Forward

Despite the pause, UK facial recognition expansion continues with enhanced monitoring requirements.

The Information Commissioner’s Office conducted its own audit and found Essex Police provides “reasonable data protection assurance” for live facial recognition. However, the ICO stressed that “all forces should conduct routine testing for bias… Without this, there is a real risk of unfairness.” Translation: proceed with caution and constant monitoring.

Thirteen forces already use the technology, and the Home Office plans to deploy 50 vans nationwide. London alone recorded over 1,300 arrests using facial recognition in 2024-2025 for serious crimes including rape and grievous bodily harm. The arrests matter, but so does getting the fairness equation right.

Essex Police plans to resume deployments after algorithm updates and revised policies, with enhanced community bias monitoring. Their experience becomes a crucial test case as facial recognition transforms from experimental policing tool to standard equipment. The question isn’t whether this technology works—it’s whether it works fairly for everyone walking past those cameras.


At Gadget Review, our guides, reviews, and news are driven by thorough human expertise and use our Trust Rating system and the True Score. AI assists in refining our editorial process, ensuring that every article is engaging, clear and succinct.