AI Chatbots Are Giving Out Your Real Phone Number

Google’s Gemini and other AI systems expose private phone numbers from scraped web data, routing strangers to unsuspecting users


By Rex Freiberger

Image: Deposit Photos

Key Takeaways

  • Gemini exposes real phone numbers from training data, routing strangers to personal cells
  • Scammers seed fake support numbers across web platforms to hijack AI recommendations
  • Privacy removal requests have spiked 400%, with minimal recourse under current laws

Someone starts calling your cell asking for a locksmith. Then a lawyer. Then car repair. Each caller says they “got your number from Google’s AI.” Your phone becomes customer service for businesses you’ve never heard of—because Gemini decided your digits make a perfect placeholder contact.

This nightmare scenario hit a Reddit user whose number now routes strangers seeking everything from legal advice to home repairs. They filed official privacy requests with Google. Months later, the calls continue while Google stays silent.

When AI Becomes Your Personal Directory

Chatbots are surfacing real contact details buried deep in training data.

Software engineer Daniel Abraham discovered this firsthand when strangers started messaging his WhatsApp for PayBox customer support. Gemini told them to contact “PayBox customer service” using Abraham’s personal number. PayBox confirmed it has no WhatsApp support line.

PhD student Meira Gilbert tested Gemini with a colleague’s name and instantly received that person’s private cell number: a contact detail buried in an obscure 2015 forum post that barely ranked in normal Google searches. When reporters later tried the same query on ChatGPT, it initially refused, then produced the number anyway.

Large language models memorize chunks of training data scraped from the public web and data brokers, then regurgitate phone numbers and addresses on demand.
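
To see how thin the barrier is, consider a minimal sketch of the PII-scrubbing step a scraping pipeline can skip. Everything here is hypothetical (the `scrub_pii` function, the regex, and “Dave” are illustrations, not any vendor’s actual pipeline), but it shows the idea: if nothing redacts numbers before training, raw digits flow straight into the corpus a model can memorize.

```python
import re

# Naive North American phone pattern -- real pipelines would need
# locale-aware matching, but this shows how easily digits slip through.
PHONE_RE = re.compile(r"(?:\+?1[\s.-]?)?\(?\d{3}\)?[\s.-]?\d{3}[\s.-]?\d{4}")

def scrub_pii(document: str) -> str:
    """Replace anything that looks like a phone number with a placeholder.

    A scraper that skips this step feeds raw numbers straight into the
    training corpus, where a model can memorize and later repeat them.
    """
    return PHONE_RE.sub("[PHONE REDACTED]", document)

scraped_post = "Great locksmith! Call Dave at (555) 867-5309 anytime."
print(scrub_pii(scraped_post))
# -> Great locksmith! Call Dave at [PHONE REDACTED] anytime.
```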

Scammers Game the System

Fraudsters are poisoning the web so AI recommends their fake support numbers.

Google’s AI Overviews now regularly surface scam numbers when you search for customer service. Users report nearly falling for fake Swiggy and Royal Caribbean support lines that appeared ahead of legitimate company contacts.

Security firm Aurascape discovered how this works: scammers seed fake customer service numbers across Yelp reviews and YouTube comments using phrases like “official [brand] reservations number.” These entries get scraped into AI training data, so chatbots faithfully recommend numbers that route to fraudsters.
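
A hypothetical sketch makes clear why those planted phrases work. This isn’t Aurascape’s methodology or any real system’s code; the `index_support_numbers` function and the “Acme” review are invented to illustrate how a naive extractor that pairs brand names with nearby phone numbers, with no verification step, will index whatever fraudsters seed.

```python
import re
from collections import defaultdict

# Simplistic patterns: a brand claim like "official Acme reservations
# number" and any nearby phone-shaped string.
PHONE_RE = re.compile(r"\(?\d{3}\)?[\s.-]?\d{3}[\s.-]?\d{4}")
CLAIM_RE = re.compile(r"official\s+(\w+)\s+(?:support|reservations)\s+number",
                      re.IGNORECASE)

def index_support_numbers(comments):
    """Build a brand -> numbers map from scraped text, with no check
    that the number actually belongs to the brand."""
    index = defaultdict(set)
    for text in comments:
        brand = CLAIM_RE.search(text)
        phone = PHONE_RE.search(text)
        if brand and phone:
            index[brand.group(1).lower()].add(phone.group())
    return index

scraped = [
    "Call the official Acme reservations number: (555) 010-2029!",  # planted
    "Loved my stay, staff were great.",
]
print(dict(index_support_numbers(scraped)))
# -> {'acme': {'(555) 010-2029'}} -- a fraudster-planted pairing
```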

“Attackers are quietly rewriting the web that AI systems read,” warns Qi Deng of Aura Labs. Virgin Media O2 found millions of UK users have encountered fake support numbers through AI tools.

No Recourse in Sight

Getting your number removed from AI outputs proves nearly impossible.

DeleteMe reports a 400% spike in AI-related privacy complaints over seven months. The cases follow two patterns:

  • People finding their own accurate details in chatbot responses
  • People discovering they’re fielding calls meant for businesses they’ve never worked for

Abraham waited months for Google to respond to his Gemini complaint, then received a request for documentation he’d already provided. Current privacy laws offer little protection since most scraped data was technically “public” when collected—even if users never intended it for AI training.

Your best defense? Skip AI entirely for support numbers. Go directly to company websites and apps when you need help.

Update: Yelp reached out to us with background on its work to combat this type of fraudulent activity. For more on Yelp’s Trust & Safety policies, visit trust.yelp.com.

