AI-Generated MAGA Girls Are Scamming Lonely Men Out of Thousands

Medical student in India creates fake conservative influencer using AI, earning thousands monthly from MAGA followers

By Alex Barrientos

Image: Jessica Foster / Instagram

Key Takeaways

  • Medical student creates fake MAGA influencer “Emily Hart” earning thousands monthly
  • AI-generated conservative personas accumulate millions of followers before platform bans
  • Romance scams using deepfakes surge 70% as victims lose $70,000-$81,000

A 22-year-old medical student in India just proved how terrifyingly easy it is to monetize political loyalty. Using nothing more than AI image generators and basic social engineering, “Sam” created Emily Hart—a blonde conservative nurse who doesn’t exist—and watched her rack up 10,000 followers and thousands in monthly revenue by posting ice fishing photos and Coors Light selfies.

Emily wasn’t just any fake influencer. She was specifically designed to exploit what Google Gemini helpfully identified as a “cheat code”: the MAGA demographic. According to the AI chatbot, conservative-leaning older men represent a loyal, higher-income audience that outperforms generic content. Sam took this advice and ran with it, crafting a Jennifer Lawrence lookalike who posted rifle range videos with pro-Christian captions and anti-abortion messaging.

When AI Meets Political Manipulation


The formula worked disgustingly well. Emily’s Instagram Reels garnered millions of views within a month, driving traffic to Fanvue (think OnlyFans for political content) where subscribers paid for exclusive access and MAGA merchandise. Sam wasn’t alone—similar AI personas like Jessica Foster, a fabricated Army soldier, accumulated over a million Instagram followers before getting banned.

These accounts posted altered images with Trump, Putin, and Zelensky, monetizing foot photos and tips for $200-300 per post. The sophistication is what makes this particularly unsettling. These aren’t obvious bot accounts with broken English and stock photos. AI tools now generate believable personalities complete with backstories, regional interests, and cultural touchstones that resonate with specific audiences.

Platform Blind Spots Enable the Scam Economy


Instagram and Fanvue’s enforcement remains laughably inadequate. While Emily Hart’s account was eventually banned for fraud in February, similar personas proliferate faster than platforms can identify them. The “rage bait” algorithms actually amplify this content—Sam noted that even liberals engage with these posts, boosting their virality through outraged comments.

This connects to a broader explosion in AI-powered romance scams, which have surged 70% as scammers deploy deepfakes and voice cloning. Recent victims lost $70,000 and $81,000 respectively to fake celebrity relationships that felt completely authentic until the money disappeared.

The scariest part isn’t the technology—it’s how desperately people wanted these connections to be real. As one observer put it about Jessica Foster: “The most dangerous thing isn’t that she’s fake; it’s how badly a million people needed her to be real.”


At Gadget Review, our guides, reviews, and news are driven by thorough human expertise and use our Trust Rating system and the True Score. AI assists in refining our editorial process, ensuring that every article is engaging, clear and succinct.