Overview
- Sam Altman’s blog post asserts that an average ChatGPT query consumes about 0.34 watt-hours of electricity and roughly one-fifteenth of a teaspoon (0.000085 gallons) of water.
- Scaled to a month of queries at estimated ChatGPT usage volumes, Altman's figures imply around 7.4 gigawatt-hours of electricity (enough to power about 8,200 U.S. homes) and roughly 1.8 million gallons of water; a back-of-envelope check of this arithmetic appears after this list.
- Studies have projected that AI could rival or exceed Bitcoin mining in electricity consumption by the end of 2025, intensifying concerns over data center sustainability.
- Critics and industry analysts argue that OpenAI has not disclosed the methodology behind its usage estimates and may be understating the true environmental footprint.
- Altman predicts that as data center operations become automated, the cost of AI intelligence will converge toward the price of electricity, potentially making both energy and intelligence abundant by the 2030s.
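
The monthly totals cited above follow from straightforward scaling of Altman's per-query figures. The sketch below is a back-of-envelope check, assuming a monthly volume of roughly 21.8 billion queries (about 725 million per day) and average U.S. household consumption of about 900 kWh per month; neither assumption comes from Altman's post, and both are chosen only because they roughly reproduce the totals in the article.

```python
# Back-of-envelope check of the monthly figures cited above.
# Per-query values are from Altman's post; the query volume and
# per-home consumption are assumptions made only for illustration.

WH_PER_QUERY = 0.34            # watt-hours per average ChatGPT query (Altman)
GALLONS_PER_QUERY = 0.000085   # gallons of water per query (Altman)

QUERIES_PER_MONTH = 21.8e9     # assumed: ~725 million queries per day
KWH_PER_HOME_PER_MONTH = 900   # assumed: average U.S. household usage

energy_gwh = WH_PER_QUERY * QUERIES_PER_MONTH / 1e9        # Wh -> GWh
water_million_gal = GALLONS_PER_QUERY * QUERIES_PER_MONTH / 1e6
homes_powered = energy_gwh * 1e6 / KWH_PER_HOME_PER_MONTH  # GWh -> kWh per home

print(f"Electricity: {energy_gwh:.1f} GWh per month")                  # ~7.4 GWh
print(f"Water: {water_million_gal:.1f} million gallons per month")     # ~1.8-1.9 million
print(f"Equivalent homes powered: {homes_powered:,.0f}")               # ~8,200
```

Because the totals scale linearly with the assumed query volume, reports that start from different daily-query estimates arrive at noticeably different monthly figures even when using the same 0.34 Wh per-query number.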