Particle.news

AI Supercomputers Projected to Demand $200B and 9 GW by 2030

New research highlights the exponential growth in costs and power needs for AI infrastructure, raising environmental and geopolitical concerns.

Jensen Huang announced details about Nvidia's next-generation GPU platform, Rubin.

Overview

  • A study by Georgetown, Epoch AI, and Rand projects that by 2030, the leading AI supercomputer could cost $200 billion, house 2 million chips, and require 9 GW of power — equivalent to nine nuclear reactors.
  • Power demands of leading AI data centers have doubled annually since 2019, far outpacing energy-efficiency gains, which have improved by only about 1.34× per year.
  • xAI’s Colossus, currently the most powerful AI supercomputer, consumes 300 MW of power and cost $7 billion to build, illustrating the rapid escalation in scale and expense.
  • The United States controls 75% of global AI supercomputing power, but the strain on grids and reliance on non-renewable energy sources pose environmental and economic challenges.
  • Some hyperscalers, including AWS and Microsoft, have recently slowed expansion plans, signaling potential concerns about the sustainability of current growth trends.
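The growth figures above can be sanity-checked with a quick projection. Below is a minimal sketch that takes Colossus's 300 MW as a 2025 baseline (the baseline year is an assumption, not stated in the article) and applies the reported annual doubling of power demand and 1.34× yearly efficiency gain:

```python
# Project leading-supercomputer power demand to 2030, assuming a
# 300 MW baseline in 2025 (assumption) and the reported 2x annual
# growth in power demand.
baseline_mw = 300        # xAI Colossus, most powerful today
growth_per_year = 2.0    # power demand doubles annually
efficiency_gain = 1.34   # hardware efficiency improvement per year

power_mw = baseline_mw
for year in range(2026, 2031):
    power_mw *= growth_per_year
    print(f"{year}: {power_mw / 1000:.1f} GW")

# Efficiency still improves, but demand grows faster; the implied
# growth in delivered compute is the product of the two rates.
print(f"Implied compute growth: ~{growth_per_year * efficiency_gain:.2f}x per year")
```

Five doublings from 300 MW land at 9.6 GW in 2030, consistent with the study's roughly 9 GW projection for the leading system.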