Overview
- OpenAI will begin using AWS compute immediately, with the full agreed capacity slated to be deployed before the end of 2026.
- AWS will run OpenAI’s training and ChatGPT workloads in U.S. data centers using hundreds of thousands of Nvidia GPUs rather than Amazon’s Trainium chips.
- The partnership follows OpenAI’s restructuring and its diversification away from Azure, with Microsoft now holding about 27% of the company.
- Amazon shares climbed roughly 4.6–5% after the news, and Nvidia shares also rose, signaling investor confidence in AI infrastructure demand.
- Sam Altman says OpenAI’s broader contracts total about $1.4 trillion and could require around 30 GW of power, prompting questions about energy supply and long-term returns.