Problem
After Responses API support was added in #5783, OpenAI provider routing is still inconsistent:
- `gpt-5.2-codex` can be routed via `chat/completions` instead of `responses`.
- `OPENAI_BASE_PATH` handling is inconsistent across request types (`chat/completions` vs `responses` vs `models`).
- Endpoint selection relies on brittle hardcoded checks and string-replacement assumptions.
This causes real failures for codex models and confusing behavior when users configure custom OpenAI-compatible base paths.
Reproduction
- Configure provider `openai` with model `gpt-5.2-codex` and default `OPENAI_BASE_PATH=v1/chat/completions`.
- Run a normal completion/streaming request.
- Observe the request routed incorrectly for some codex variants, or inconsistent endpoint handling depending on path config.
Additional repro for custom paths:
- Set `OPENAI_HOST` to a proxy host.
- Set `OPENAI_BASE_PATH=openai/v1/chat/completions`.
- Fetch models / run streaming / run non-streaming.
- Observe path-handling differences between the responses and models endpoints.
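To illustrate the consistency the repro is checking for, here is a minimal sketch of deriving all endpoint URLs from one configured base path. The function name `deriveUrl` and the URL shape are assumptions for illustration, not the provider's actual implementation.

```typescript
// Illustrative sketch (hypothetical helper): derive each endpoint URL from the
// same configured base path, so responses/models/chat paths cannot diverge.
function deriveUrl(host: string, basePath: string, endpoint: string): string {
  // Strip any trailing endpoint segment so the base is endpoint-agnostic.
  const base = basePath.replace(/\/(chat\/completions|responses|models)$/, "");
  return `https://${host}/${base}/${endpoint}`;
}

// With OPENAI_HOST=proxy.example.com and OPENAI_BASE_PATH=openai/v1/chat/completions,
// the models endpoint should resolve under the same openai/v1 prefix:
// deriveUrl("proxy.example.com", "openai/v1/chat/completions", "models")
//   → "https://proxy.example.com/openai/v1/models"
```

Under this scheme the models and responses paths always share the configured prefix, which is the behavior the custom-path repro expects.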
Expected Behavior
- `gpt-5.*-codex` should default to Responses API routing when no explicit custom override is provided.
- If `OPENAI_BASE_PATH` explicitly points to `.../chat/completions`, routing should remain chat-completions for compatibility.
- Responses and models paths should be derived consistently from the configured base path for OpenAI-compatible hosts.
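The expected routing rules above can be sketched as a small decision function. The name `resolveOpenAiEndpoint` and the exact model-matching regex are assumptions for illustration; the real provider may structure this differently.

```typescript
// Hypothetical sketch of the expected endpoint-selection order:
// 1. an explicit chat/completions base path wins (compatibility),
// 2. gpt-5.*-codex models default to the Responses API,
// 3. everything else stays on chat/completions.
type OpenAiEndpoint = "responses" | "chat/completions";

function resolveOpenAiEndpoint(model: string, basePath?: string): OpenAiEndpoint {
  if (basePath?.endsWith("/chat/completions")) return "chat/completions";
  if (/^gpt-5\.\d+-codex/.test(model)) return "responses";
  return "chat/completions";
}

// resolveOpenAiEndpoint("gpt-5.2-codex")                         → "responses"
// resolveOpenAiEndpoint("gpt-5.2-codex", "v1/chat/completions")  → "chat/completions"
```

Putting the explicit-override check first keeps existing custom-path configurations working while still fixing the default for codex models.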
Scope
- OpenAI provider endpoint selection and path mapping only.
- No UI schema change.
- No provider contract expansion.