Overview
- Google reported that a typical Gemini text query uses about 0.24 watt‑hours of electricity and roughly 0.26 milliliters of water, but those figures are medians for text prompts only and exclude image generation, video generation, and model training.
- Google declined to disclose total Gemini query volumes, while OpenAI says ChatGPT handles about 2.5 billion prompts daily at roughly 0.34 watt‑hours per prompt, which MIT Technology Review notes works out to more than 300 GWh annually (see the back‑of‑envelope check after this list).
- MIT Technology Review highlights broader infrastructure pressures, citing plans for over 2 GW of new natural‑gas generation in Louisiana to power a single Meta data center and Google Cloud’s $25 billion AI investment in the PJM grid region.
- Forecasts point to steep growth in electricity demand, with the IEA projecting that global data‑center electricity consumption could surpass 945 TWh by 2030 as AI workloads expand.
- Analysts note efficiency gains in chips and cooling but warn that rising usage can outpace those savings, and they press for transparent, comprehensive reporting on energy and water use across all AI workloads.
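
As a rough illustration of how the 300 GWh figure follows from the two numbers OpenAI cites, here is a minimal back‑of‑envelope sketch. It assumes the reported 2.5 billion prompts per day and roughly 0.34 watt‑hours per prompt hold constant across a full year, and it ignores training, image, and video workloads:

```python
# Back-of-envelope check of the ChatGPT energy figure cited above.
# Assumes the reported averages hold steady year-round (an assumption,
# not a disclosed annual total).

PROMPTS_PER_DAY = 2.5e9   # OpenAI's reported daily prompt volume
WH_PER_PROMPT = 0.34      # reported average energy per prompt (watt-hours)

daily_wh = PROMPTS_PER_DAY * WH_PER_PROMPT   # ~8.5e8 Wh, i.e. ~850 MWh per day
annual_gwh = daily_wh * 365 / 1e9            # convert Wh/day to GWh/year

print(f"Estimated annual energy: {annual_gwh:.0f} GWh")  # ~310 GWh, above 300 GWh
```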