Particle.news

Amazon Developing 'Olympus' AI Model with 2 Trillion Parameters, Twice That of Rival OpenAI's GPT-4

Amazon's Push for Self-Reliance in AI, Alongside Its Investment in Anthropic, Reflects Growing Competition in Large Language Model Development, with the 'Olympus' Model Possibly Launching as Early as December

  • Amazon is developing a new AI model, codenamed 'Olympus', with 2 trillion parameters, which is double that of OpenAI's GPT-4, marking a significant escalation in the AI development race between tech giants.
  • This move signals Amazon's intent to become self-reliant in AI technology rather than depending on models from providers like Anthropic, and points to a long-term strategic plan for its omni-channel cloud offerings.
  • The 'Olympus' effort is led by Rohit Prasad, former head of Alexa, who reports directly to Amazon CEO Andy Jassy, reflecting the significant internal resources committed to the project.
  • The 'Olympus' AI model could be unveiled as early as December. Integration with Amazon's online store and Alexa smart speakers is a prime use case, giving an additional edge to its e-commerce and IoT operations.
  • Despite the size and complexity of 'Olympus', experts caution that larger models do not necessarily outperform those with fewer parameters, so the model's success will depend on more than sheer scale.