
fix(openai): OPENAI_BASE_PATH routing is inconsistent and gpt-5.2-codex can use wrong API #7253

@bavadim

Description

Problem

After Responses API support was added in #5783, routing in the OpenAI provider is still inconsistent:

  1. gpt-5.2-codex can be routed via chat/completions instead of responses.
  2. OPENAI_BASE_PATH handling is inconsistent across request types (chat/completions vs responses vs models).
  3. Endpoint selection relies on brittle hardcoded model checks and string-replacement assumptions.

This causes real failures for codex models and confusing behavior when users configure custom OpenAI-compatible base paths.
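As an illustration of the brittle pattern in (3), here is a hypothetical sketch (not the provider's actual code; all names are assumptions) showing how an exact-match model check and a naive string replacement both fail:

```typescript
// Hypothetical illustration of the brittle pattern described above;
// not the provider's actual code.
const model: string = "gpt-5.2-codex";
const basePath: string = "v1"; // no "chat/completions" segment configured

// Exact-match check misses newer codex variants such as gpt-5.2-codex.
const useResponses = model === "gpt-5-codex"; // false

// String replacement silently no-ops when the segment is absent.
const responsesPath = basePath.replace("chat/completions", "responses"); // "v1"
```

Both results are wrong for a codex model: the request stays on chat/completions and the rewritten path is unchanged.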

Reproduction

  1. Configure the openai provider with model gpt-5.2-codex and the default OPENAI_BASE_PATH=v1/chat/completions.
  2. Run a normal completion/streaming request.
  3. Observe the request being routed incorrectly for some codex variants, or inconsistent endpoint handling depending on the path configuration.

Additional repro for custom paths:

  1. Set OPENAI_HOST to a proxy host.
  2. Set OPENAI_BASE_PATH=openai/v1/chat/completions.
  3. Fetch models / run streaming / run non-streaming.
  4. Observe path handling differences between responses and models endpoints.
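Consistent handling here means both the responses and models paths are derived from one shared root of the configured base path. A minimal sketch of that derivation, assuming the base path may or may not end in an endpoint segment (the function name is hypothetical):

```typescript
// Hypothetical sketch: derive the responses and models paths from one
// configured base path so OpenAI-compatible hosts behave consistently.
function derivePaths(basePath: string): { responses: string; models: string } {
  // Strip trailing slashes and a trailing endpoint segment, if any,
  // so both derived paths share the same root.
  const root = basePath
    .replace(/\/+$/, "")
    .replace(/\/?(chat\/completions|responses)$/, "");
  return {
    responses: `${root}/responses`,
    models: `${root}/models`,
  };
}

// With OPENAI_BASE_PATH=openai/v1/chat/completions:
// derivePaths("openai/v1/chat/completions")
//   → { responses: "openai/v1/responses", models: "openai/v1/models" }
```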

Expected Behavior

  • gpt-5.*-codex should default to Responses API routing when no explicit custom override is provided.
  • If OPENAI_BASE_PATH explicitly points to .../chat/completions, routing should remain chat-completions for compatibility.
  • Responses and models paths should be derived consistently from configured base path for OpenAI-compatible hosts.
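The expected rules above can be sketched as a pure selection function (a sketch under the stated assumptions; the function name and the model-version pattern are illustrative, not the provider's actual code):

```typescript
type Endpoint = "responses" | "chat/completions";

// Hypothetical sketch of the expected routing rules above;
// names and the version pattern are illustrative assumptions.
function selectEndpoint(model: string, basePath?: string): Endpoint {
  // Rule 2: an explicit .../chat/completions base path wins, for compatibility.
  if (basePath && basePath.replace(/\/+$/, "").endsWith("chat/completions")) {
    return "chat/completions";
  }
  // Rule 1: gpt-5.*-codex defaults to the Responses API.
  if (/^gpt-5(\.\d+)?-codex$/.test(model)) {
    return "responses";
  }
  return "chat/completions";
}
```

Keeping the explicit-override check first means existing configurations that pin chat/completions keep working, while codex variants no longer need to be enumerated one by one.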

Scope

  • OpenAI provider endpoint selection and path mapping only.
  • No UI schema change.
  • No provider contract expansion.
