Google Launches Ironwood TPU for Cloud as Reports Point to Wider Availability

The release targets high‑volume, low‑latency inference with superpod‑scale networking.

Overview

  • Google says Ironwood is its most powerful and energy‑efficient TPU to date and is now available to Google Cloud customers.
  • The chip is purpose‑built for inference and, according to Google, delivers more than 4× better per‑chip performance than the prior generation for both training and inference workloads.
  • Ironwood scales to 9,216 chips in a single superpod, linked by a 9.6 Tb/s Inter‑Chip Interconnect with access to 1.77 PB of shared HBM (a back‑of‑envelope check on these figures follows the list).
  • TPUs remain core to Google’s AI Hypercomputer architecture, which groups accelerators into pods to optimize compute, networking, storage and software.
  • Multiple outlets report that Google is preparing to offer TPUs beyond its own cloud, citing Meta as a lead design win and noting pitches for on‑prem and colocation deployments, though Google has not confirmed these plans.
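
As a rough sanity check on the superpod figures above, the shared‑HBM total is consistent with the chip count multiplied by a per‑chip HBM capacity of 192 GB; that per‑chip figure is not stated in this summary and is assumed here from Google's published Ironwood specifications.

\[
9{,}216\ \text{chips} \times 192\ \text{GB/chip} = 1{,}769{,}472\ \text{GB} \approx 1.77\ \text{PB}
\]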