Google’s Bold Plan to Train AI in Space: Project Suncatcher Explained

Google partners with Planet Labs to launch two prototype satellites by 2027, testing orbital data centers with custom chips

By Annemarije de Boer


Image credit: Wikimedia

Key Takeaways

  • Google’s Project Suncatcher aims to train AI on solar-powered satellites, with prototypes launching by early 2027
  • Orbital solar panels could be up to eight times more productive than equivalent Earth-based arrays
  • Space-based TPU clusters would link via free-space optics at tens of terabits per second

Picture this: your next ChatGPT query gets processed by a solar-powered computer floating 400 miles above Earth. Google’s Project Suncatcher isn’t science fiction—it’s the company’s latest moonshot to solve AI’s ravenous appetite for power by literally reaching for the stars.

The Orbital Office Park

Satellite constellations could harvest up to eight times more solar power than equivalent arrays feeding earthbound data centers.

Project Suncatcher envisions interconnected satellites packed with custom Tensor Processing Units, positioned just kilometers apart in space. The pitch sounds like something from a Tesla earnings call: orbital solar panels generating near-continuous power, potentially eight times more productive than terrestrial arrays.
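Where does that "eight times" figure come from? A back-of-envelope sketch makes it plausible. The numbers below are illustrative assumptions, not Google's published figures: a dawn-dusk sun-synchronous orbit keeps panels lit almost continuously at full, unfiltered sunlight, while a good terrestrial solar site averages a small fraction of its peak output once night, weather, and atmospheric losses are counted.

```python
# Rough check of the "eight times more productive" claim.
# All constants are illustrative assumptions, not Google's numbers.

SOLAR_CONSTANT_W_M2 = 1361      # sunlight intensity above the atmosphere
ORBIT_ILLUMINATION = 0.99       # fraction of a dawn-dusk orbit spent in sunlight
GROUND_PEAK_W_M2 = 1000         # standard test-condition irradiance at ground level
GROUND_CAPACITY_FACTOR = 0.17   # typical annual average for terrestrial solar PV

orbital_avg = SOLAR_CONSTANT_W_M2 * ORBIT_ILLUMINATION   # ~1347 W/m^2 year-round
ground_avg = GROUND_PEAK_W_M2 * GROUND_CAPACITY_FACTOR   # ~170 W/m^2 averaged

ratio = orbital_avg / ground_avg
print(f"Orbital panels deliver roughly {ratio:.1f}x the annual energy per m^2")
```

With these assumptions the ratio lands near 8, which is why near-continuous orbital sunlight is the headline economic argument for the whole project.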

Google plans to test this cosmic computing concept with Planet Labs, launching two prototype satellites by early 2027. The technical specs are genuinely impressive. These space-based clusters would achieve tens of terabits per second of data transfer using free-space optical links—essentially turning the void into a high-speed internet highway.

Your AI workloads would bounce between satellites faster than your morning coffee order processes through a drive-thru.
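The kilometer-scale spacing is not a detail, it is the enabler: laser beams spread with distance, so the closer the satellites fly, the less light is wasted. A toy geometric link budget shows the effect. The optics parameters here (1550 nm laser, 5 mm beam waist, 10 cm receiver aperture) are hypothetical, chosen only to illustrate the scaling, not taken from Project Suncatcher's design.

```python
import math

# Toy geometric link budget for a free-space optical crosslink.
# Parameters are illustrative assumptions, not real Suncatcher optics.
WAVELENGTH_M = 1550e-9   # common telecom laser wavelength
BEAM_WAIST_M = 0.005     # transmit beam waist radius
RX_RADIUS_M = 0.05       # receiver aperture radius

def geometric_loss_db(distance_m: float) -> float:
    """Fraction of transmitted light the receiver captures, in dB."""
    # Far-field divergence half-angle of a Gaussian beam: lambda / (pi * w0)
    divergence = WAVELENGTH_M / (math.pi * BEAM_WAIST_M)
    spot_radius = divergence * distance_m            # beam radius at the receiver
    captured = min(1.0, (RX_RADIUS_M / spot_radius) ** 2)  # aperture/spot area ratio
    return 10 * math.log10(captured)

for km in (1, 10, 100):
    print(f"{km:>3} km separation: {geometric_loss_db(km * 1000):6.1f} dB geometric loss")
```

Every tenfold increase in separation costs about 20 dB of signal, which is why a tightly flown formation a few kilometers across makes tens of terabits per second thinkable, while conventional satellite spacing would not.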

Houston, We Have Problems

Launch costs and cosmic radiation present real engineering challenges.

Before you start planning your orbital Zoom calls, Google faces substantial hurdles. Launch costs remain astronomical, though the company projects cost parity with Earth-based data centers by the mid-2030s.

Radiation exposure can corrupt memory and degrade chips over time, but Google’s testing suggests its Trillium TPU architecture can survive a five-year space mission. Orbital maintenance, space debris, and regulatory oversight make terrestrial data center management look like child’s play.

“In the future, space may be the best place to scale AI compute,” says Travis Beals, Google’s Senior Director overseeing the project. That’s a bold claim considering these cosmic complications.

The Bigger Picture

This moonshot reflects Big Tech’s push beyond earthbound limitations.

Suncatcher fits Google’s pattern of audacious infrastructure experiments—remember quantum computing and self-driving cars? Success would fundamentally reshape how you think about AI scaling, potentially reducing competition for terrestrial power resources while opening entirely new frontiers for cloud computing.

The regulatory landscape remains murky, requiring international coordination on orbital management and debris mitigation. Yet the cultural significance is clear: we’re witnessing Big Tech’s evolution from disrupting industries to transcending planetary boundaries entirely.

Whether Google’s orbital ambitions prove viable or join the graveyard of abandoned moonshots, Project Suncatcher signals that AI’s future might literally be written in the stars.
