Overview
- IBM CEO Arvind Krishna estimates a one‑gigawatt AI data center costs about $80 billion, implying roughly $8 trillion if industry plans reach around 100 gigawatts.
- He argues the returns won’t materialize, saying about $800 billion in profit would be needed just to cover interest (see the back‑of‑envelope check after this list), and he puts current technologies’ odds of reaching AGI at roughly 0–1%.
- Krishna warns accelerator chips will age out fast, forcing operators to refresh data‑center hardware roughly every five years.
- Despite the warnings, spending is still rising as Alphabet lifts its 2025 capex outlook to $91–93 billion and Amazon to about $125 billion.
- Analysts flag mounting power constraints, with Goldman Sachs reporting data‑center load around 55 gigawatts today (about 14% AI) and projecting roughly 84 gigawatts by 2027.
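As a rough sanity check on Krishna’s figures, the sketch below reproduces the arithmetic in the first two bullets. The ~10% financing rate is an inference from “$800 billion to cover interest on ~$8 trillion,” not a number stated in the source.

```python
# Back-of-envelope check of the cost and interest figures above.
# ASSUMED_INTEREST_RATE is inferred, not quoted in the source.

COST_PER_GW_USD = 80e9        # Krishna's estimate: ~$80B per gigawatt of AI data center
PLANNED_GW = 100              # rough industry build-out cited in the overview
ASSUMED_INTEREST_RATE = 0.10  # assumption implied by ~$800B interest on ~$8T

total_capex = COST_PER_GW_USD * PLANNED_GW          # ~$8 trillion
annual_interest = total_capex * ASSUMED_INTEREST_RATE  # ~$800 billion

print(f"Total build-out: ${total_capex / 1e12:.1f} trillion")
print(f"Interest to cover: ${annual_interest / 1e9:.0f} billion")
```

The numbers line up with the bullets: $80 billion per gigawatt across ~100 gigawatts gives ~$8 trillion, and a ~10% cost of capital on that sum is ~$800 billion a year before any return on the investment itself.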