Eufy Paid Users to Fake Thefts for AI Training

Anker subsidiary collected 20,000 staged crime videos through February 2025 to improve security camera algorithms

By Annemarije de Boer


Image credit: Wikimedia

Key Takeaways

  • Eufy paid users $2 each to stage fake package thefts for AI training
  • Campaign collected 20,000 clips from 120+ users between December 2024 and February 2025
  • Users transitioned to unpaid “Video Donation Program” earning digital badges instead

Your smart home camera company just asked you to pretend to steal packages for $2 a pop. Between December 18, 2024, and February 25, 2025, Anker’s Eufy division ran exactly this campaign—literally telling camera owners to “create events by pretending to be a thief” to train their AI detection algorithms.

The mechanics were elegantly dystopian. Upload theft videos via Google Form, submit PayPal details, collect your two bucks. Eufy aimed for 20,000 clips each of package thefts and car door-pulling scenarios. More than 120 users publicly participated, contributing to what became hundreds of thousands of training videos.

Privacy Theater Meets Data Hunger

This wasn’t Eufy’s first privacy stumble. According to security researchers, in 2023 their supposedly end-to-end encrypted cameras were caught streaming unencrypted footage to the web. Now they’re collecting massive video datasets while dodging basic transparency questions about storage, retention, or deletion policies. When pressed on total participation numbers and data handling, Eufy declined to answer.

The Gamification Continues

After the paid campaign ended, Eufy seamlessly transitioned users into their “Video Donation Program.” The app now features an “Honor Wall” showcasing top contributors—some submitting over 200,000 clips for digital badges and small gifts. It’s like Duolingo, but for training surveillance AI. They’ve expanded beyond doorbell footage to baby monitors, building proprietary datasets across your entire home.

Welcome to Surveillance Labor

This represents something new in the gig economy: turning homeowners into data laborers for surveillance capitalism. You get pocket change while companies build billion-dollar AI models from your daily life. The staged theft angle makes it feel like content creation, but you’re actually training systems that will monitor millions of homes.

The real question isn’t whether better theft detection is worth $2—it’s whether you understand what surveillance companies are building with your footage, and why they’re so reluctant to explain their data practices when the cameras stop rolling.

