40,000 AI Workers Hit by Huge Voice Data Theft – Are You One of Them?

Lapsus$ hackers steal 4TB of studio-quality voice recordings from 40,000 contractors, enabling sophisticated fraud

By Alex Barrientos

Key Takeaways

  • Lapsus$ stole studio-grade voice recordings and government IDs from 40,000 contractors
  • Voice authentication in banking apps becomes vulnerable to high-fidelity AI impersonations
  • Establish verification protocols using personal context questions to expose AI-generated calls

Studio-quality voice recordings paired with government IDs create the perfect cloning formula—and 40,000 contractors just lost both to cybercriminals. The Mercor data breach exposes a nightmare scenario where your voice becomes an un-rotatable password in the wrong hands. If you use voice authentication for banking apps or smart assistants, this breach changes everything about how you verify suspicious calls.

The Perfect Storm for Voice Fraud

Criminals now possess studio-grade audio samples linked to verified identities, enabling high-fidelity impersonations.

Lapsus$, the extortion group behind high-profile attacks, claimed responsibility for stealing 4TB of data from AI talent platform Mercor through a supply chain attack, according to The Register. The breach exposed two- to five-minute studio recordings from more than 40,000 contractors, along with their government-issued IDs and passport photos.

Unlike typical voice leaks, these samples far exceed the 15-second threshold that off-the-shelf cloning tools need. Wiz researchers confirmed that “high-profile extortion groups like Lapsus$ were now working with TeamPCP,” the hackers who backdoored the LiteLLM open-source project in late March.

Your Voice Authentication Just Became Vulnerable

Banking apps and smart devices can’t distinguish between your real voice and sophisticated AI clones.

Voice cloning scams have already cost victims $5 million in 2025, according to Trend Micro, typically using clips as short as 3 seconds pulled from social media. The Mercor breach hands criminals professional-grade source material. Your Google Voice Match or Alexa voice profile becomes security theater when fraudsters possess studio recordings linked to verified identities.

Meta has paused contracts with Mercor while OpenAI reviews its partnerships; even the corporate giants recognize the gravity of the breach.

Emergency Call Verification Strategy

Simple verification questions can expose AI-generated voices targeting your family.

Racing to respond when someone claiming to be your relative calls with an emergency? Establish verification protocols now. Ask specific questions only family members would know—recent conversations, inside jokes, or shared experiences that weren’t posted online.

AI voices excel at mimicking speech patterns but struggle with personal context. Set up codewords with elderly relatives who are prime targets. The FBI reports rising voice impersonation incidents, particularly targeting grandparents with fake emergency calls.

Immediate Protection Steps

Audit your voice-enabled accounts and update security settings before criminals strike.

  • Check which financial accounts use voice authentication and consider switching to multi-factor alternatives
  • Review smart home devices for voice-controlled purchasing or security disabling
  • Update emergency contact verification with banks and investment accounts

While Mercor contained the breach promptly and hired forensic experts, your stolen voice samples remain compromised forever; unlike a password, a voice can't be rotated. The company faces a class action lawsuit over inadequate biometric disclosure, but legal remedies won't unring this bell.

The next family emergency call you receive might be the first test of these new realities. Your voice is now someone else’s skeleton key.

