Neon Paid You Pennies to Record Calls – Then Leaked Them to Strangers

Neon app exposed 75,000 users’ phone calls and personal data through flawed security before shutting down

By Al Landes

Image credit: Wikimedia

Key Takeaways

  • Neon exposed every user’s private call recordings through a backend with no access controls
  • The app hit 75,000 downloads in a single day while the flaw left all user data open to any logged-in account
  • The founder quietly shut the app down without notifying users of the data exposure

Recording calls for quick cash seemed harmless until strangers started listening to your private conversations. The Neon app, which rocketed to viral success by paying users to record phone calls for AI training data, exposed every user’s personal information through a security flaw that made your most intimate conversations accessible to anyone.

The Easy Money Trap

Users earned pennies per call while unknowingly creating a data goldmine for AI companies.

Neon’s premise was seductive in its simplicity: record your phone calls, get paid, help train AI models. The app incentivized lengthy conversations, with some users making extended calls specifically to maximize earnings. Within days of launch, Neon climbed to the top-five free iPhone apps with over 75,000 downloads in a single day, according to TechCrunch. Users saw dollar signs. AI companies saw training data. Nobody saw the security disaster brewing in the backend.

A Master Key for Everyone

Any logged-in user could access every other user’s calls, transcripts, and phone numbers.

TechCrunch security researchers discovered Neon’s server configuration had zero access controls. Any authenticated user could manipulate basic API calls to access the entire database—phone numbers, raw audio files, full transcripts, call durations, and earnings data. The backend essentially handed every user a master key to everyone else’s private information. Your grandmother’s medical appointment, your job interview, your relationship drama—all available to any stranger with basic technical knowledge.
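The flaw researchers described is a textbook case of broken object-level authorization: the server returns whatever record the request asks for, without checking that it belongs to the logged-in user. A minimal sketch of the broken pattern versus the fix, using made-up function names and data (this is illustrative, not Neon's actual API):

```python
# Hypothetical in-memory store standing in for the app's database.
RECORDINGS = {
    "user_a": [{"phone": "+1-555-0100", "transcript": "private call"}],
    "user_b": [{"phone": "+1-555-0199", "transcript": "another call"}],
}

def get_recordings_broken(requested_user: str, authenticated_user: str) -> list:
    # Flaw: the server trusts the user ID supplied in the request and
    # never compares it to the logged-in account, so any authenticated
    # user can pull any other user's recordings and transcripts.
    return RECORDINGS.get(requested_user, [])

def get_recordings_fixed(requested_user: str, authenticated_user: str) -> list:
    # Fix: an object-level authorization check ties the requested
    # record to the session's own identity before returning anything.
    if requested_user != authenticated_user:
        raise PermissionError("forbidden: not your data")
    return RECORDINGS.get(requested_user, [])
```

With the broken version, a request authenticated as `user_a` but asking for `user_b` succeeds; with the fixed version, the same request is rejected. That single missing comparison is the difference between a private account and an open database.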

The Silent Shutdown

Founder Alex Kiam pulled the app offline without disclosing the breach to users.

When confronted with the security failure, Stanford-educated founder Alex Kiam quietly took Neon offline. Users received vague emails about “extra layers of security” with no mention of the massive data exposure. No breach notification. No acknowledgment that private phone numbers and intimate conversations had been accessible to anyone who bothered to look. The transparency you’d expect from a platform handling your most sensitive data? Nowhere to be found.

The Bigger Picture

Neon joins a growing list of apps that prioritize rapid scaling over user protection.

This pattern keeps repeating across the industry. Apps like Tea, Bumble, and Hinge have all suffered exposure scandals, yet platforms keep prioritizing viral growth over basic security. The AI gold rush has created a new category of risk: apps that monetize your personal data while treating security as an afterthought. When making money from your conversations becomes the business model, protecting those conversations apparently becomes optional.

Your private calls shouldn’t be someone else’s payday, let alone open to exploitation.
