What version of Codex CLI is running?
codex-cli 0.99.0-alpha.10
What subscription do you have?
ChatGPT Pro
Which model were you using?
gpt-5.3-codex
What platform is your computer?
Darwin 25.2.0 arm64 arm
What terminal emulator and version are you using (if applicable)?
Ghostty
What issue are you seeing?
Both config.toml and the TUI are set to gpt-5.3-codex, but the output and SSE captures show that the model name is actually gpt-5.2-2025-12-11.
What steps can reproduce the bug?
- Set both config.toml and the TUI to gpt-5.3-codex
- Run RUST_LOG='codex_tui::chatwidget=info,codex_api::sse::responses=trace' codex
- Send a prompt
- log/codex-tui.log shows response.model is gpt-5.2-2025-12-11 in the response.created event
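For reference, this is a minimal sketch of the config.toml used in the repro (assuming the standard top-level model key in ~/.codex/config.toml; other settings omitted):

```toml
# ~/.codex/config.toml — sketch, only the model selection shown
model = "gpt-5.3-codex"
```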
What is the expected behavior?
Codex should actually use gpt-5.3-codex, as configured.
Additional information
No response