Overview
- Industry analysts put Nvidia’s AI‑accelerator share near 80% to 90%, reinforcing its current lead.
- Nvidia CEO Jensen Huang said in Q3 that cloud GPUs are sold out, reflecting a large backlog of data‑center demand.
- AMD struck a deal to supply OpenAI with 6 gigawatts of compute capacity, compared with about 10 gigawatts tied to Nvidia.
- Google trained its latest model on in‑house TPUs, which are reported to cost roughly half as much as Nvidia’s chips; Alphabet is reportedly in talks to sell TPUs to Meta.
- Bespoke accelerators that hyperscalers are building with Broadcom may chip away at specific workloads next year. Nvidia’s CUDA software ecosystem remains a strong moat, but investors are weighing the risk of an AI‑spending slowdown in 2026.