GPT-5-Codex-Mini Released: 4x More Usage
GPT-5-Codex-Mini cuts costs while giving you 4x more usage. Auto-switches when you're running low. Available now in CLI and IDE Extension.
TL;DR
- New `gpt-5-codex-mini` model offers 4x more usage at lower cost
- CLI and IDE Extension auto-suggest switching at 90% usage limit
- No breaking changes — opt-in via CLI flag, slash command, or config
New
- GPT-5-Codex-Mini model — Smaller, cheaper alternative to `gpt-5-codex` that quadruples your usage allowance within ChatGPT subscriptions.
- Auto-switching at 90% usage — CLI and IDE Extension now prompt you to switch to Mini when approaching your 5-hour limit, keeping you productive without interruption.
How to Use
- CLI: Run `codex --model gpt-5-codex-mini` or use the `/model` slash command
- IDE Extension: Select GPT-5-Codex-Mini from the dropdown menu
- Persistent: Set `model = "gpt-5-codex-mini"` in your `config.toml` configuration file
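For the persistent option, a minimal sketch of what the `config.toml` entry looks like (the file path is assumed from Codex defaults and may differ on your setup):

```toml
# ~/.codex/config.toml
# Make gpt-5-codex-mini the default model for all sessions
model = "gpt-5-codex-mini"
```

A model passed via `--model` or the `/model` slash command applies only to that session; the config entry makes the choice stick across restarts.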
Update Codex CLI: `npm install -g @openai/codex-cli@latest`
Source: Codex