Overview
- OpenAI begins using AWS capacity immediately, with the full agreed capacity slated to come online before the end of 2026 and an option to expand further in 2027 and beyond.
- AWS will furnish hundreds of thousands of Nvidia GPUs, including GB200 and GB300 accelerators, interconnected via Amazon EC2 UltraServers for training and inference.
- The capacity will support both the training of new frontier models and day-to-day operation of ChatGPT for its large global user base.
- The deal marks a break from OpenAI’s prior Azure exclusivity, though Microsoft remains a major investor with a roughly 27% stake following OpenAI’s recent restructuring.
- Investors welcomed the announcement, sending Amazon shares up about 4.9% and Nvidia up roughly 2.7%, even as analysts flagged OpenAI’s roughly $1.4 trillion in total infrastructure commitments and data-center power needs approaching 30 GW.