Bug Report
Analysis
extractAPITargetHost() in pkg/workflow/awf_helpers.go:315-324 strips everything after the first / from the URL, discarding the path component. This means OPENAI_BASE_URL values like:
https://stone-dataplatform-production.cloud.databricks.com/serving-endpoints
get reduced to just:
stone-dataplatform-production.cloud.databricks.com
The path /serving-endpoints is required by the Databricks serving endpoint proxy. Without it, API requests go to https://host/v1/chat/completions instead of https://host/serving-endpoints/v1/chat/completions, which returns 404.
Root Cause
gh-aw (pkg/workflow/awf_helpers.go:321-324):
// Remove path suffix if present (everything after first /)
if idx := strings.Index(host, "/"); idx != -1 {
    host = host[:idx]
}
This intentionally strips the path because the downstream consumer (--openai-api-target) was designed to accept only a hostname.
gh-aw-firewall (containers/api-proxy/server.js:278-281):
const options = {
    hostname: targetHost, // only a hostname; no path support
    port: 443,
    path: targetUrl.pathname + targetUrl.search, // uses req.url directly
};
The proxy uses targetHost as the hostname in https.request(), so it cannot include a path. And req.url (e.g., /v1/chat/completions) is used as the request path directly — there's no base path prepended.
Impact
Any OPENAI_BASE_URL (or ANTHROPIC_BASE_URL) that includes a required path prefix will silently fail. This affects:
- Databricks serving endpoints (/serving-endpoints)
- Azure OpenAI deployments (/openai/deployments/<name>)
- Any corporate LLM router that uses a path prefix
Implementation Plan
1. gh-aw: Extract and pass the base path separately
File: pkg/workflow/awf_helpers.go
- Add a new function extractAPIBasePath(workflowData, envVar) that returns the path component (e.g., /serving-endpoints) from the URL, or "" if there is none.
- In BuildAWFArgs() (~line 196), after setting --openai-api-target, also pass:
openaiBasePath := extractAPIBasePath(config.WorkflowData, "OPENAI_BASE_URL")
if openaiBasePath != "" {
    awfArgs = append(awfArgs, "--openai-api-base-path", openaiBasePath)
}
- Same for ANTHROPIC_BASE_URL → --anthropic-api-base-path.
File: pkg/workflow/awf_helpers_test.go
- Add tests for extractAPIBasePath:
"https://host.com/serving-endpoints" → "/serving-endpoints"
"https://host.com/v1" → "/v1"
"https://host.com/openai/deployments/gpt-4" → "/openai/deployments/gpt-4"
"https://host.com" → ""
"host.com" → ""
2. gh-aw-firewall: Accept and use the base path
File: src/cli.ts
- Add CLI option: --openai-api-base-path <path> (and --anthropic-api-base-path)
- Pass to config as openaiApiBasePath
File: src/docker-manager.ts
- Pass OPENAI_API_BASE_PATH env var to the api-proxy container
File: containers/api-proxy/server.js
- Read OPENAI_API_BASE_PATH from env (default: "")
- In the OpenAI request handler (~line 414), prepend the base path:
const basePath = OPENAI_API_BASE_PATH || '';
const fullPath = basePath + req.url; // e.g., "/serving-endpoints" + "/v1/chat/completions"
proxyRequest(req, res, OPENAI_API_TARGET, { ... }, 'openai', fullPath);
- Update proxyRequest() to accept an optional overridePath parameter, or prepend the base path before calling it
3. Follow Guidelines
- Error messages: "[what's wrong]. [what's expected]. [example]" style
- Run make agent-finish before completing
- Use console.FormatInfoMessage() for logging base path config