Overview
- Five senators sent a letter to the Office of Science and Technology Policy and the Commerce Department seeking consumer safeguards and details on how the administration will manage AI data centers’ energy and water impacts and the associated cost burdens.
- Peer‑reviewed research from Cornell published in Nature Sustainability estimates that U.S. AI servers could add 24–44 million metric tons of CO2 annually and consume 731–1,125 million cubic meters of water by 2030; smarter siting, cleaner power, and more efficient cooling could cut those impacts by roughly 70–85%.
- Power constraints are already visible on the ground: completed data‑center shells in Santa Clara sit unenergized while Silicon Valley Power targets a major upgrade by 2028, and other regions report multi‑year grid connection delays.
- Utilities are leaning on short‑term fixes as demand rises: delaying some coal retirements, expanding natural‑gas generation (including a 10‑gigawatt request in Georgia), and even sourcing used turbines to bring capacity online faster.
- Analysts and officials warn that data centers could draw 7–12% of U.S. electricity by decade’s end, contributing to higher regional bills as tech firms accelerate buildouts and explore longer‑term options such as small modular reactors, nuclear restarts, and large solar‑plus‑battery projects.