Overview
- OpenAI chief Sam Altman revealed that each ChatGPT query consumes around 0.34 watt-hours of electricity and 0.000085 gallons of water
- Per-request efficiency gains are outpaced by the sheer volume of AI usage, driving a steep rise in data-center energy and cooling demands
- Growing concerns over carbon emissions and water use have prompted major firms to seek low-carbon power alternatives
- Microsoft has signed a 20-year agreement to restart Pennsylvania's Three Mile Island plant, Google plans to deploy small modular reactors by 2035, and Amazon is exploring similar options
- Industry leaders argue that sustainable energy strategies will be essential for harnessing AI’s projected economic benefits, including funding ideas like universal basic income