## Problem
Copilot CLI 1.0.21 (shipped in gh-aw v0.67.3) added a startup model validation step: when `COPILOT_MODEL` is set, the CLI calls `GET /models` to validate the model name before execution. This endpoint rejects classic PATs (`ghp_*` tokens), causing the agent step to fail with exit code 1 immediately on startup.
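GitHub token types can be distinguished by their documented prefixes, which is how the classic-PAT case can be detected. A minimal sketch (the helper name is hypothetical, not part of the existing codebase):

```javascript
// Hypothetical helper: classify a GitHub token by its well-known prefix.
// Classic PATs start with "ghp_"; fine-grained PATs use "github_pat_",
// and OAuth user tokens use "gho_".
function isClassicPat(token) {
  return typeof token === "string" && token.startsWith("ghp_");
}
```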
## Context
Original report: github/gh-aw#25593
- Affected: Workflows that set `COPILOT_MODEL` and use a classic PAT as `COPILOT_GITHUB_TOKEN`.
- Unaffected: Workflows without `COPILOT_MODEL`; workflows using OAuth or fine-grained tokens.
- Regression: Introduced in gh-aw v0.67.3 (Copilot CLI 1.0.21); v0.67.2 (CLI 1.0.20) was not affected.
The AWF api-proxy sidecar already has special handling for Copilot `GET /models` (see `containers/api-proxy/server.js:891-914`), which routes requests using `COPILOT_GITHUB_TOKEN` to `api.githubcopilot.com/models`. However, the Copilot inference API's `/models` endpoint does not accept a classic PAT, regardless of proxying.
## Root Cause
The api-proxy correctly proxies `GET /models` to `api.githubcopilot.com/models` using the Copilot OAuth token. The issue is that when only a classic PAT is available (no OAuth exchange), the `/models` endpoint returns an error. The proxy cannot synthesize an OAuth token from a classic PAT.
Additionally, a note in stored memory confirms: "Copilot BYOK/offline workflows must set `COPILOT_MODEL` explicitly to avoid Copilot CLI BYOK startup failures". That guidance is now in tension with classic-PAT workflows that also set `COPILOT_MODEL`.
## Proposed Solution
- Intercept `GET /models` in the api-proxy and return a synthetic response (`containers/api-proxy/server.js`): when `COPILOT_GITHUB_TOKEN` is a classic PAT (detected by the `ghp_` prefix), return a mock `200` response with a hardcoded models list, bypassing the upstream validation call. This maintains backward compatibility without requiring token type changes.
- Document the token type requirement: add a warning in the workflow template or docs when `COPILOT_MODEL` is set but the token is a classic PAT.
- Alternative: detect a classic PAT in `src/docker-manager.ts` and automatically unset `COPILOT_MODEL` when the token cannot support model validation, falling back to the default model (with a logged warning).
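The interception option above can be sketched as a pure routing decision, which keeps the behavior easy to test independently of the proxy's HTTP plumbing. All names here are hypothetical, and the model list is an illustrative placeholder, not an authoritative set of Copilot model IDs:

```javascript
// Placeholder synthetic response for the classic-PAT short-circuit.
// The model IDs are illustrative only; the real list would need to be
// curated to match what the Copilot CLI expects to see.
const FALLBACK_MODELS_RESPONSE = {
  status: 200,
  body: { data: [{ id: "gpt-4o", object: "model" }] },
};

// Hypothetical decision function: returns a synthetic response when the
// token is a classic PAT (which upstream would reject anyway), or null
// to let the request fall through to the existing GET /models proxy path.
function interceptModels(token) {
  if (typeof token === "string" && token.startsWith("ghp_")) {
    return FALLBACK_MODELS_RESPONSE;
  }
  return null;
}
```

In `server.js` this check would run before the existing `GET /models` proxy branch, so OAuth and fine-grained tokens keep their current behavior unchanged.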