
fix: custom API endpoints for Codex/Claude/Gemini engines #21018

Closed
Copilot wants to merge 2 commits into main from copilot/fix-custom-api-endpoints-issue

Conversation

Contributor

Copilot AI commented Mar 15, 2026

Summary

Fixes the "Custom API Endpoints doesn't work" issue where using OPENAI_BASE_URL in engine.env (e.g. to use openrouter.ai) caused Codex to silently fall back to the default gpt-5.3-codex model instead of the user's configured model.

Root cause

Codex's built-in openai provider validates model names against a known list. When a custom model name like openrouter/free is not recognised, it falls back to the default model rather than forwarding the name to the API. Additionally, the engines (Codex, Claude) did not explicitly declare their API base-URL env var, unlike Gemini which already declared GEMINI_API_BASE_URL.

Changes

codex_engine.go — Declare OPENAI_BASE_URL = http://host.docker.internal:<CodexPort>/v1 when the AWF firewall is enabled. A user-provided OPENAI_BASE_URL in engine.env still overrides this via maps.Copy; the AWF --openai-api-target flag (already set by extractAPITargetHost) routes those proxy calls to the custom endpoint.

claude_engine.go — Same pattern: declare ANTHROPIC_BASE_URL = http://host.docker.internal:<ClaudePort> when firewall is enabled.

awf_helpers.go — Add GEMINI_API_BASE_URL → --gemini-api-target extraction alongside the existing OPENAI_BASE_URL and ANTHROPIC_BASE_URL handlers, completing coverage for all three variable-endpoint engines.

codex_mcp.go — When OPENAI_BASE_URL is detected in engine.env, inject a model_provider = "openai-compat" block at the top of the generated config.toml (required by TOML: bare keys must precede section headers). The custom provider points its base_url at the local AWF LLM proxy (host.docker.internal:<CodexPort>/v1); AWF's --openai-api-target routes those requests onward to the user's endpoint. Because the provider is user-defined rather than built-in, it passes model names through as-is — enabling e.g. model: "openrouter/free".

How the user's example now works

engine:
  id: codex
  model: "openrouter/free"
  env:
    OPENAI_BASE_URL: "https://openrouter.ai/api/v1"
    OPENAI_API_KEY: ${{ secrets.OPENROUTER_API_KEY }}

1. Compiler detects OPENAI_BASE_URL → emits --openai-api-target openrouter.ai to AWF
2. config.toml gains model_provider = "openai-compat" with base_url = "http://host.docker.internal:10001/v1"
3. Codex calls the local AWF LLM proxy with model = "openrouter/free" (no validation, no fallback)
4. AWF routes the request to openrouter.ai, which accepts the model name
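The host extraction in step 1 can be sketched with the standard library (the function name apiTargetHost is hypothetical; the PR's extractAPITargetHost may differ):

```go
package main

import (
	"fmt"
	"net/url"
)

// apiTargetHost pulls the bare host out of a base URL so it can be
// handed to AWF as --openai-api-target (e.g. "openrouter.ai" from
// "https://openrouter.ai/api/v1").
func apiTargetHost(baseURL string) (string, error) {
	u, err := url.Parse(baseURL)
	if err != nil {
		return "", err
	}
	if u.Host == "" {
		return "", fmt.Errorf("no host in %q", baseURL)
	}
	return u.Hostname(), nil
}

func main() {
	host, _ := apiTargetHost("https://openrouter.ai/api/v1")
	fmt.Println(host)
}
```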

Security Summary

No new security vulnerabilities introduced. CodeQL scan: 0 alerts.



✨ PR Review Safe Output Test - Run 23102624714

💥 [THE END] — Illustrated by Smoke Claude ·



✨ PR Review Safe Output Test - Run 23110576257


Copilot AI and others added 2 commits March 15, 2026 02:19
- codex_engine: declare OPENAI_BASE_URL → AWF LLM proxy when firewall enabled
- claude_engine: declare ANTHROPIC_BASE_URL → AWF LLM proxy when firewall enabled
- awf_helpers: add GEMINI_API_BASE_URL → --gemini-api-target support
- codex_mcp: inject openai-compat model_provider in config.toml when
  OPENAI_BASE_URL is set in engine.env so any model name (e.g. openrouter/free)
  is accepted without the built-in openai provider falling back to gpt-5.3-codex

Co-authored-by: pelikhan <4175913+pelikhan@users.noreply.github.com>
@pelikhan
Contributor

@lpcox let's chat about making something like this work

Contributor

@github-actions github-actions bot left a comment


💥 Automated smoke test review - all systems nominal!

💥 [THE END] — Illustrated by Smoke Claude

GIT_COMMITTER_EMAIL: github-actions[bot]@users.noreply.github.com
GIT_COMMITTER_NAME: github-actions[bot]
OPENAI_API_KEY: ${{ secrets.CODEX_API_KEY || secrets.OPENAI_API_KEY }}
OPENAI_BASE_URL: http://host.docker.internal:10001/v1
Contributor


Consider documenting why host.docker.internal is used here — this is a Docker-specific hostname for accessing the host from inside a container. Worth a comment for clarity.

-- /bin/bash -c 'export PATH="$(find /opt/hostedtoolcache -maxdepth 4 -type d -name bin 2>/dev/null | tr '\''\n'\'' '\'':'\'')$PATH"; [ -n "$GOROOT" ] && export PATH="$GOROOT/bin:$PATH" || true && claude --print --disable-slash-commands --no-chrome --mcp-config /tmp/gh-aw/mcp-config/mcp-servers.json --allowed-tools '\''Bash,BashOutput,Edit,Edit(/tmp/gh-aw/cache-memory/*),ExitPlanMode,Glob,Grep,KillBash,LS,MultiEdit,MultiEdit(/tmp/gh-aw/cache-memory/*),NotebookEdit,NotebookRead,Read,Read(/tmp/gh-aw/cache-memory/*),Task,TodoWrite,Write,Write(/tmp/gh-aw/cache-memory/*),mcp__github__download_workflow_run_artifact,mcp__github__get_code_scanning_alert,mcp__github__get_commit,mcp__github__get_dependabot_alert,mcp__github__get_discussion,mcp__github__get_discussion_comments,mcp__github__get_file_contents,mcp__github__get_job_logs,mcp__github__get_label,mcp__github__get_latest_release,mcp__github__get_me,mcp__github__get_notification_details,mcp__github__get_pull_request,mcp__github__get_pull_request_comments,mcp__github__get_pull_request_diff,mcp__github__get_pull_request_files,mcp__github__get_pull_request_review_comments,mcp__github__get_pull_request_reviews,mcp__github__get_pull_request_status,mcp__github__get_release_by_tag,mcp__github__get_secret_scanning_alert,mcp__github__get_tag,mcp__github__get_workflow_run,mcp__github__get_workflow_run_logs,mcp__github__get_workflow_run_usage,mcp__github__issue_read,mcp__github__list_branches,mcp__github__list_code_scanning_alerts,mcp__github__list_commits,mcp__github__list_dependabot_alerts,mcp__github__list_discussion_categories,mcp__github__list_discussions,mcp__github__list_issue_types,mcp__github__list_issues,mcp__github__list_label,mcp__github__list_notifications,mcp__github__list_pull_requests,mcp__github__list_releases,mcp__github__list_secret_scanning_alerts,mcp__github__list_starred_repositories,mcp__github__list_tags,mcp__github__list_workflow_jobs,mcp__github__list_workflow_run_artifacts,mcp__github__list_workflow_runs
,mcp__github__list_workflows,mcp__github__pull_request_read,mcp__github__search_code,mcp__github__search_issues,mcp__github__search_orgs,mcp__github__search_pull_requests,mcp__github__search_repositories,mcp__github__search_users'\'' --debug-file /tmp/gh-aw/agent-stdio.log --verbose --permission-mode bypassPermissions --output-format stream-json "$(cat /tmp/gh-aw/aw-prompts/prompt.txt)"${GH_AW_MODEL_AGENT_CLAUDE:+ --model "$GH_AW_MODEL_AGENT_CLAUDE"}' 2>&1 | tee -a /tmp/gh-aw/agent-stdio.log
env:
ANTHROPIC_API_KEY: ${{ secrets.ANTHROPIC_API_KEY }}
ANTHROPIC_BASE_URL: http://host.docker.internal:10000
Contributor


The ANTHROPIC_BASE_URL override looks correct for custom API endpoint support. Ensure the endpoint value is validated/sanitized at runtime to prevent SSRF if user-configurable.


Contributor

@github-actions github-actions bot left a comment

OPENAI_API_KEY: ${{ secrets.CODEX_API_KEY || secrets.OPENAI_API_KEY }}
OPENAI_BASE_URL: http://host.docker.internal:10001/v1
Contributor


Smoke test review comment #1: OPENAI_BASE_URL correctly points to the local AWF LLM proxy. This ensures custom API endpoints route through the firewall-aware proxy rather than directly to external services. ✅

env:
ANTHROPIC_API_KEY: ${{ secrets.ANTHROPIC_API_KEY }}
ANTHROPIC_BASE_URL: http://host.docker.internal:10000
Contributor


Smoke test review comment #2: ANTHROPIC_BASE_URL added consistently across Claude engine steps. Good — this mirrors the Codex OPENAI_BASE_URL pattern and ensures all engines declare their proxy base URL. ✅

@lpcox
Collaborator

lpcox commented Mar 15, 2026

AWF Compatibility Analysis

Checked the AWF firewall codebase (gh-aw-firewall) to see which of these custom endpoint changes are supported by existing AWF configuration flags.

✅ Fully Supported

  • OPENAI_BASE_URL → --openai-api-target — AWF has this CLI flag, the api-proxy sidecar reads OPENAI_API_TARGET, and custom targets are auto-added to the Squid domain allowlist. The Codex engine setting OPENAI_BASE_URL to the local proxy works end-to-end.

  • ANTHROPIC_BASE_URL → --anthropic-api-target — Same pattern, fully wired up in AWF (CLI flag → WrapperConfig → docker-compose env → containers/api-proxy/server.js).

  • model_provider = "openai-compat" (codex_mcp.go) — Purely a Codex config.toml concern that bypasses model name validation. No AWF changes needed; the proxy forwards whatever model name comes through.

❌ Gap: --gemini-api-target

The PR adds GEMINI_API_BASE_URL → --gemini-api-target extraction in awf_helpers.go, but AWF does not have a --gemini-api-target flag:

  • No --gemini-api-target CLI option in src/cli.ts
  • No geminiApiTarget property in WrapperConfig (src/types.ts)
  • No GEMINI_API_TARGET handling in containers/api-proxy/server.js
  • No dedicated Gemini proxy port in the api-proxy sidecar

AWF currently has three api-target flags: --openai-api-target, --anthropic-api-target, and --copilot-api-target. Gemini is not yet covered.

What happens at runtime

When gh-aw emits --gemini-api-target, AWF will reject it as an unknown option and the run will fail. Two options:

  1. Add --gemini-api-target to AWF — new CLI flag, WrapperConfig property, api-proxy port, and Squid allowlist auto-add (mirrors the OpenAI/Anthropic pattern)
  2. Skip emitting the flag in gh-aw — since Gemini doesn't route through the api-proxy today, just ensure the custom Gemini host lands in --allow-domains and let traffic go through Squid directly

Option 1 is the cleaner long-term approach. Happy to add --gemini-api-target support on the AWF side if that's the direction.
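Option 2 could be sketched roughly as below. This is an illustration only: geminiFirewallArgs is a hypothetical helper, and the actual gh-aw flag plumbing may differ.

```go
package main

import (
	"fmt"
	"net/url"
)

// geminiFirewallArgs builds AWF arguments for a custom Gemini endpoint.
// Until AWF grows a --gemini-api-target flag, the custom host is added
// to --allow-domains so the traffic goes through Squid directly.
func geminiFirewallArgs(baseURL string) ([]string, error) {
	u, err := url.Parse(baseURL)
	if err != nil || u.Host == "" {
		return nil, fmt.Errorf("invalid GEMINI_API_BASE_URL %q", baseURL)
	}
	return []string{"--allow-domains", u.Hostname()}, nil
}

func main() {
	args, _ := geminiFirewallArgs("https://my-gemini-proxy.example.com/v1")
	fmt.Println(args)
}
```

The host name used here is made up for the example; in practice it would come from the user's engine.env.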

@github-actions
Contributor

Hey @Copilot 👋 — great work on this fix! Resolving the silent model-name fallback for custom API endpoints (e.g. openrouter/free) is a meaningful improvement for users who want to route Codex through non-OpenAI providers, and the three-engine coverage (Codex, Claude, Gemini) makes it feel complete.

The PR is well-structured and looks ready for maintainer review once the draft flag is removed. Here's the checklist summary:

Check              Result
On-topic           ✅ yes
Follows process    ✅ yes — authored by Copilot coding agent as expected
Focused            ✅ yes — lock-file regeneration is a natural make recompile side-effect
New dependencies   ✅ no
Tests included     ✅ yes — awf_helpers_test.go (+82) and codex_engine_test.go (+155)
Description        ✅ yes — detailed root-cause analysis, per-file change notes, and end-to-end flow walkthrough
Lines changed      394

Verdict: 🟢 Aligned — quality: lgtm

The only minor flag is that the PR is still in draft state. Once you're satisfied with CI results, marking it ready for review would be the next step. Everything else looks solid!

Generated by Contribution Check



Development

Successfully merging this pull request may close these issues.

Custom API Endpoints doesn't work

3 participants