From a21438f64bfe1e4353c9e5723eb98ec5a8fb6295 Mon Sep 17 00:00:00 2001
From: "github-actions[bot]"
Date: Thu, 12 Mar 2026 19:25:27 +0000
Subject: [PATCH] docs: document custom OpenAI/Anthropic API endpoint support
 in engines.md

Add a new 'Custom API Endpoints' subsection under 'Engine Environment
Variables' documenting that OPENAI_BASE_URL (codex) and
ANTHROPIC_BASE_URL (claude) are automatically picked up by the AWF
sandbox proxy to route API calls to internal LLM routers, Azure OpenAI,
or other custom endpoints.

From PR #20631 (merged 2026-03-12T14:02Z, after DDUw's 06:23Z run).

Co-Authored-By: Claude Sonnet 4.6
---
 docs/src/content/docs/reference/engines.md | 37 ++++++++++++++++++++++
 1 file changed, 37 insertions(+)

diff --git a/docs/src/content/docs/reference/engines.md b/docs/src/content/docs/reference/engines.md
index c3eff2af2b9..6dc341450ca 100644
--- a/docs/src/content/docs/reference/engines.md
+++ b/docs/src/content/docs/reference/engines.md
@@ -91,6 +91,43 @@ engine:
 
 Environment variables can also be defined at workflow, job, step, and other scopes. See [Environment Variables](/gh-aw/reference/environment-variables/) for complete documentation on precedence and all 13 env scopes.
 
+#### Custom API Endpoints
+
+Two environment variables receive special treatment when set in `engine.env`: `OPENAI_BASE_URL` (for `codex`) and `ANTHROPIC_BASE_URL` (for `claude`). When either is present, the AWF sandbox proxy automatically routes API calls to the specified host instead of the default `api.openai.com` or `api.anthropic.com`. Credential isolation and firewall enforcement remain active.
+
+This enables workflows to use internal LLM routers, Azure OpenAI deployments, or other OpenAI-compatible endpoints without bypassing AWF's security model.
+
+```yaml wrap
+engine:
+  id: codex
+  model: gpt-4o
+  env:
+    OPENAI_BASE_URL: "https://llm-router.internal.example.com/v1"
+    OPENAI_API_KEY: ${{ secrets.LLM_ROUTER_KEY }}
+
+network:
+  allowed:
+    - github.com
+    - llm-router.internal.example.com # must be listed here for the firewall to permit outbound requests
+```
+
+For Claude workflows routed through a custom Anthropic-compatible endpoint:
+
+```yaml wrap
+engine:
+  id: claude
+  env:
+    ANTHROPIC_BASE_URL: "https://anthropic-proxy.internal.example.com"
+    ANTHROPIC_API_KEY: ${{ secrets.PROXY_API_KEY }}
+
+network:
+  allowed:
+    - github.com
+    - anthropic-proxy.internal.example.com
+```
+
+The custom hostname is extracted from the URL and passed to the AWF `--openai-api-target` or `--anthropic-api-target` flag automatically at compile time. No additional configuration is required.
+
 ### Engine Command-Line Arguments
 
 All engines support custom command-line arguments through the `args` field, injected before the prompt: