OpenAI's First Open-Weight Model

GPT-OSS

Open-Weight Reasoning Models Under the Apache 2.0 License

gpt-oss-120b and gpt-oss-20b are open-weight language models released by OpenAI under the Apache 2.0 license, built on a Mixture-of-Experts (MoE) architecture. They excel at reasoning tasks, offer strong tool use and code generation, and are optimized for efficient deployment.
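
A minimal local-inference sketch, assuming the Hugging Face model id "openai/gpt-oss-20b" and a transformers version that supports the gpt-oss architecture; exact ids and hardware requirements may differ from your setup.

```python
# A minimal sketch, assuming the model id "openai/gpt-oss-20b" on Hugging Face
# and a transformers release with gpt-oss support.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "openai/gpt-oss-20b"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype="auto",   # keep the checkpoint's native precision
    device_map="auto",    # place layers on available devices automatically
)

messages = [{"role": "user", "content": "Summarize mixture-of-experts routing in two sentences."}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output = model.generate(inputs, max_new_tokens=256)
print(tokenizer.decode(output[0][inputs.shape[-1]:], skip_special_tokens=True))
```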

GPT-OSS-120B
Large-scale reasoning model, approaching o4-mini performance
  • 117B total parameters, 5.1B active parameters per token (see the routing sketch after this list)
  • Runs on a single 80 GB GPU
  • TauBench tool use: 68%
  • Codeforces competitive programming: 2622 Elo
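
The gap between total and active parameters comes from sparse expert routing: each token is processed by only a few experts per MoE layer. Below is a generic, hypothetical top-k router in PyTorch with illustrative layer sizes, not the actual gpt-oss configuration.

```python
# A generic, hypothetical top-k MoE layer; sizes are illustrative only and
# do not reflect the actual gpt-oss configuration.
import torch
import torch.nn as nn
import torch.nn.functional as F

class TopKMoE(nn.Module):
    def __init__(self, d_model=512, d_ff=2048, num_experts=32, top_k=4):
        super().__init__()
        self.top_k = top_k
        self.router = nn.Linear(d_model, num_experts)
        self.experts = nn.ModuleList([
            nn.Sequential(nn.Linear(d_model, d_ff), nn.GELU(), nn.Linear(d_ff, d_model))
            for _ in range(num_experts)
        ])

    def forward(self, x):                                # x: (num_tokens, d_model)
        scores = self.router(x)                          # (num_tokens, num_experts)
        weights, idx = scores.topk(self.top_k, dim=-1)   # each token picks k experts
        weights = F.softmax(weights, dim=-1)             # normalize over the chosen k
        out = torch.zeros_like(x)
        for slot in range(self.top_k):
            for e, expert in enumerate(self.experts):
                mask = idx[:, slot] == e                 # tokens routed to expert e
                if mask.any():
                    out[mask] += weights[mask, slot].unsqueeze(-1) * expert(x[mask])
        return out

# Only top_k of num_experts expert MLPs run per token, so the "active" parameter
# count per token is far smaller than the total parameter count.
moe = TopKMoE()
tokens = torch.randn(8, 512)
print(moe(tokens).shape)  # torch.Size([8, 512])
```
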
GPT-OSS-20B
Lightweight edge model for local deployment
  • 21B total parameters, 3.6B active parameters per token
  • Requires only 16 GB of memory (see the footprint estimate below)
  • Outperforms o3-mini on competition math
  • Well suited to edge devices and privacy-sensitive scenarios
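
A rough back-of-the-envelope estimate of why these footprints are plausible, assuming the bulk of the weights are stored at roughly 4 bits per parameter (MXFP4-style quantization); actual memory use also depends on KV cache size and runtime overhead.

```python
# Rough weight-memory estimate, assuming ~4-bit quantized parameters.
# Illustrative only; real footprints include KV cache and runtime overhead.
def weight_memory_gb(total_params: float, bits_per_param: float = 4.0) -> float:
    """Approximate weight storage in GB for a given parameter count."""
    return total_params * bits_per_param / 8 / 1e9

for name, params in [("gpt-oss-120b", 117e9), ("gpt-oss-20b", 21e9)]:
    print(f"{name}: ~{weight_memory_gb(params):.1f} GB of weights at 4 bits/param")
# gpt-oss-120b: ~58.5 GB of weights -> fits on a single 80 GB GPU
# gpt-oss-20b:  ~10.5 GB of weights -> fits in 16 GB with headroom
```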