GPT-5.1-Codex-Max Released: Faster, Smarter Coding
OpenAI's new GPT-5.1-Codex-Max is now live as the default model for Codex CLI and IDE users. Faster, smarter, and more token-efficient. Extra High reasoning mode added for complex tasks.
TL;DR
- GPT-5.1-Codex-Max is now the default model for CLI and IDE Extension users
- Faster, more intelligent, and more token-efficient than previous versions
- New "Extra High" reasoning effort for non-latency-sensitive tasks
New
- GPT-5.1-Codex-Max — New frontier agentic coding model trained on software engineering, math, and research tasks. Now default for signed-in ChatGPT users in CLI and IDE Extension.
- Extra High reasoning effort — New `xhigh` reasoning level lets the model think longer for better answers on non-time-critical tasks. Medium remains recommended for daily work.
How to Use
- CLI: Run `codex --model gpt-5.1-codex-max` or use the `/model` slash command
- IDE Extension: Select GPT-5.1-Codex-Max from the dropdown menu
- Persistent: Update your `config.toml` with `model = "gpt-5.1-codex-max"`
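The persistent option above can be sketched as a small `config.toml` fragment. The file path and the `model_reasoning_effort` key name are assumptions based on common Codex CLI configuration; only the `model` value and the `xhigh` level come from this post:

```toml
# ~/.codex/config.toml — usual Codex CLI config location (assumption)
model = "gpt-5.1-codex-max"        # model name from the post

# Key name is an assumption; "xhigh" is the new Extra High level named in the post.
# The post recommends medium for daily work, so set this only for non-time-critical tasks.
model_reasoning_effort = "xhigh"
```

With this in place, plain `codex` invocations use GPT-5.1-Codex-Max without passing `--model` each time.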
API access for GPT-5.1-Codex-Max coming soon. See GPT-5.1-Codex and GPT-5.1-Codex-Mini Now Available for context on the broader model lineup.
Source: Codex