GPT-5-Codex-Mini Released: 4x More Usage

GPT-5-Codex-Mini cuts costs while giving you 4x more usage. Codex suggests switching to it when you're running low. Available now in the CLI and IDE Extension.

TL;DR

  • New gpt-5-codex-mini model offers 4x more usage at lower cost
  • CLI and IDE Extension auto-suggest switching at 90% usage limit
  • No breaking changes — opt-in via CLI flag, slash command, or config

New

  • GPT-5-Codex-Mini model — Smaller, cheaper alternative to gpt-5-codex that quadruples your usage allowance within ChatGPT subscriptions.
  • Auto-switching at 90% usage — CLI and IDE Extension now prompt you to switch to Mini when approaching your 5-hour limit, keeping you productive without interruptions.

How to Use

  • CLI: Run codex --model gpt-5-codex-mini or use the /model slash command
  • IDE Extension: Select GPT-5-Codex-Mini from the dropdown menu
  • Persistent: Set model = "gpt-5-codex-mini" in your config.toml file
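
For reference, the persistent option is a single entry in the Codex CLI config file; a minimal sketch, assuming the default config location of ~/.codex/config.toml and no other settings:

```toml
# ~/.codex/config.toml — sets the default model for every Codex session
model = "gpt-5-codex-mini"
```

A per-session flag (codex --model gpt-5-codex-mini) or the /model slash command overrides this default for that session only.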

Update Codex CLI: npm install -g @openai/codex@latest

Source: Codex