Overview
- Microsoft Azure is the first cloud platform to integrate NVIDIA's Blackwell GB200 AI servers, a move aimed at improving the performance of AI models running on its infrastructure.
- The NVIDIA Blackwell GB200 AI servers are built around B200 GPUs, each offering 192 GB of HBM3e memory and designed for deep-learning workloads and large-scale dataset processing.
- OpenAI has received one of the first engineering builds of NVIDIA's DGX B200 AI system, which packs eight B200 GPUs and delivers a substantial performance uplift over the previous DGX generation.
- OpenAI plans to use the DGX B200 platform to accelerate AI model training, drawing on the system's rated 72 petaFLOPS of training compute and 144 petaFLOPS of inference compute.
- Other major tech companies, including Amazon, Google, and Meta, have also expressed interest in NVIDIA's Blackwell architecture for their AI computing workloads.