Overview
- OpenAI has no active plans to deploy Google’s TPUs at scale, having run only early tests so far.
- The company continues to run its services predominantly on Nvidia GPUs and AMD AI chips.
- The testing of Google Cloud’s TPUs is aimed at lowering inference costs and reducing reliance on a single supplier.
- Its in-house AI processor design is on track for tape-out by the end of 2025.
- OpenAI is reassessing its revenue-sharing and exclusivity arrangements with Microsoft while broadening its cloud infrastructure.