
OpenAI Unveils First Open-Weight Models Since GPT-2, Built to Run on Common Hardware

Offered under an Apache 2.0 license on platforms such as Hugging Face, the gpt-oss models enable text-only reasoning on local hardware, though at lower accuracy than OpenAI’s closed o-series.
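For readers who want to try local inference, here is a minimal sketch using the Hugging Face transformers library. The Hub ID openai/gpt-oss-20b and the generation settings are assumptions based on the release announcement, not verified configuration:

```python
# Minimal local-inference sketch. Assumes the Hugging Face Hub ID
# "openai/gpt-oss-20b" and a machine with roughly 16 GB of memory;
# requires the transformers and accelerate packages.
from transformers import pipeline

generator = pipeline(
    "text-generation",
    model="openai/gpt-oss-20b",  # assumed Hub ID for the smaller model
    torch_dtype="auto",          # let transformers choose a suitable dtype
    device_map="auto",           # place weights on GPU/CPU as available
)

messages = [{"role": "user", "content": "Summarize mixture-of-experts models."}]
result = generator(messages, max_new_tokens=128)
print(result[0]["generated_text"])
```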

[Image: OpenAI logo on a phone]

Overview

  • This move marks OpenAI’s first open-weight release since 2019’s GPT-2 and comes days before the debut of its closed GPT-5 model.
  • gpt-oss-120b activates 5.1 billion of its 117 billion parameters per token, letting it run on a single Nvidia GPU; gpt-oss-20b activates 3.6 billion of its 21 billion parameters and runs on laptops with 16 GB of memory.
  • Both models pair a mixture-of-experts architecture with high-compute reinforcement-learning fine-tuning to boost reasoning efficiency, and can hand off complex or multimodal queries to OpenAI’s cloud-based closed models (see the sparse-routing sketch after this list).
  • In benchmarks such as Codeforces and Humanity’s Last Exam, they outperform open rivals like DeepSeek’s R1 but trail proprietary o3 and o4-mini and hallucinate in nearly half of PersonQA queries.
  • OpenAI has not released the training datasets, citing ongoing copyright lawsuits, and reports that internal and third-party safety evaluations found no evidence of high-capability cyber or biosecurity threats.
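To make the sparse-activation point concrete, here is a toy top-k mixture-of-experts router in Python. The expert count, dimensions, and k are illustrative only, not gpt-oss’s actual configuration:

```python
import numpy as np

def moe_forward(x, experts, router_w, k=2):
    """Toy mixture-of-experts layer: route a token to its top-k experts.

    Only k of len(experts) expert networks run per token, so per-token
    compute stays small even when total parameter count is large.
    """
    logits = x @ router_w                      # router score per expert
    top_k = np.argsort(logits)[-k:]            # indices of the k best experts
    weights = np.exp(logits[top_k])
    weights /= weights.sum()                   # softmax over chosen experts only
    return sum(w * experts[i](x) for w, i in zip(weights, top_k))

# Illustrative numbers only: 8 tiny experts, route each token to 2 of them.
rng = np.random.default_rng(0)
d = 16
experts = [(lambda W: (lambda x: np.tanh(x @ W)))(rng.normal(size=(d, d)))
           for _ in range(8)]
router_w = rng.normal(size=(d, 8))
token = rng.normal(size=d)
print(moe_forward(token, experts, router_w).shape)  # (16,)
```

Because only k experts run per token, total parameter count and per-token compute decouple; this is how a 117-billion-parameter model can activate just 5.1 billion parameters per token.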