Palantir Gotham Predicts Your ‘Deportability’ From Bank Transactions

AI surveillance system processes banking data and social media to generate deportation risk scores for ICE raids

By Annemarije de Boer


Image: QuoteInspector.com

Key Takeaways

  • Palantir’s ELITE system assigns deportability scores using bank transactions and social media
  • ICE invested over $100 million in AI surveillance targeting 3,000 daily arrests
  • ACLU reveals 40% false positive rate in algorithmic raids hitting innocent families

Palantir’s ELITE system, the AI backbone powering ICE deportation raids, treats routine financial behavior like criminal evidence. The platform crunches bank transactions alongside social media posts, license plates, and biometrics to assign you a “deportability” confidence score out of 100.
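The article does not disclose how ELITE actually computes its score. Purely as an illustration of the general technique, a composite risk score over heterogeneous signals might look like the sketch below; every feature name and weight here is hypothetical and not drawn from Palantir's system:

```python
# Hypothetical sketch of a weighted composite "risk score" (0-100).
# None of these features or weights come from Palantir; they only
# illustrate how disparate signals could be collapsed into one number.

HYPOTHETICAL_WEIGHTS = {
    "cash_deposit_frequency": 0.3,
    "international_wire_count": 0.4,
    "social_media_flags": 0.2,
    "license_plate_sightings": 0.1,
}

def composite_score(features: dict) -> float:
    """Collapse normalized signals (each 0..1) into a 0-100 score."""
    total = sum(
        HYPOTHETICAL_WEIGHTS[name] * min(max(value, 0.0), 1.0)
        for name, value in features.items()
        if name in HYPOTHETICAL_WEIGHTS
    )
    return round(100 * total, 1)

print(composite_score({
    "cash_deposit_frequency": 0.8,   # e.g. weekly grocery-store deposits
    "international_wire_count": 0.5,  # e.g. monthly family remittances
    "social_media_flags": 0.0,
    "license_plate_sightings": 0.2,
}))  # prints 46.0
```

The sketch makes the article's core concern concrete: ordinary behaviors like cash deposits and remittances become numeric inputs, and the weighting is invisible to the person being scored.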

Welcome to surveillance capitalism’s most dystopian application—where algorithms decide who belongs in America based on how you use Venmo.

ICE’s AI Arsenal Costs More Than Most Tech Startups

Palantir’s government contracts exceed $100 million, with new tools designed to track “self-deportations” and enable warrantless mass arrests.

ICE has invested over $100 million in Palantir’s surveillance ecosystem, including a $29.9 million supplement for ELITE launching September 2025. Another $30 million funds “ImmigrationOS,” designed to track people considering leaving voluntarily.

The agency wants AI that scales to 3,000 daily arrests—a quota that would make Amazon warehouse managers blush. This isn’t experimental tech; it’s deployed infrastructure with the budget of a unicorn startup, except the product is human suffering instead of user engagement.

When Algorithms Get It Wrong, Innocent Families Pay

ACLU investigation reveals 40% of raids target wrong homes, while everyday activities like family remittances trigger “cartel smuggling” alerts.

ACLU FOIA documents expose a 40% false positive rate—meaning nearly half of all raids hit innocent homes. The Brennan Center found citizens routinely profiled alongside undocumented immigrants.
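The scale of that error rate is easy to quantify from the article's own figures. A back-of-the-envelope calculation, assuming the 40% rate applied uniformly at ICE's stated 3,000-arrest daily quota:

```python
# Back-of-the-envelope: apply the reported 40% false positive rate
# to ICE's stated quota of 3,000 arrests per day.
DAILY_QUOTA = 3000
FALSE_POSITIVE_RATE = 0.40

wrong_targets_per_day = int(DAILY_QUOTA * FALSE_POSITIVE_RATE)
wrong_targets_per_year = wrong_targets_per_day * 365

print(wrong_targets_per_day)   # 1200 wrongly targeted people per day
print(wrong_targets_per_year)  # 438000 per year
```

At full quota, a 40% error rate would mean roughly 1,200 wrongly targeted people every single day.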

Gotham’s “pattern recognition” treats grocery store cash deposits and frequent wire transfers as cartel indicators, destroying lives through algorithmic guilt-by-association. Operation Black Rose exemplifies this damage: 80% of Oregon arrests targeted non-criminals, including asylum seekers like “MJMA” whose only crime was existing in Palantir’s dataset.

Tech Workers and Civil Rights Groups Challenge the Surveillance Machine

Employee protests and legal challenges target Palantir’s role in mass deportation infrastructure, but contracts keep expanding.

Palantir employees have staged protests against ICE contracts, while civil rights organizations file FOIA requests exposing the system’s flaws. Yet CEO Alex Karp defends the technology as supporting “critical operations,” even as Stephen Miller’s stock ownership raises conflict-of-interest questions.

The surveillance apparatus expands regardless of protests—your banking data remains fair game for algorithmic enforcement. Until Congress acts, every financial transaction feeds a system designed to weaponize your digital footprint against you.
