Overview
- Google said its seventh‑generation Ironwood TPUs will become generally available in the coming weeks, following an earlier preview period.
- Ironwood targets both large‑model training and real‑time AI agents, with pods that link up to 9,216 chips to reduce data bottlenecks.
- Google’s AI Hypercomputer architecture clusters Ironwood pods at massive scale and uses Optical Circuit Switching to route around hardware interruptions.
- Alongside Ironwood, Google introduced new instance families built on its Armv9‑based Axion server CPUs: C4A is now generally available, while N4A and C4A Metal are in preview.
- Customer uptake includes Anthropic, which plans to use up to 1 million TPUs, and Lightricks, which is deploying Ironwood; Google has raised its 2025 capex outlook to $93 billion to meet AI demand.