
Dynamic bash tool description breaks Anthropic prompt caching since v1.3.7 (Sonnet extremely slow via GitHub Copilot) #20648

@dominikfoerster2003

Description


Anthropic models (Claude Sonnet) became extremely slow starting with v1.3.7 when used via the GitHub Copilot provider. GPT models over the same Copilot endpoint are unaffected. Downgrading to v1.3.6 immediately restores fast responses.

Tested by downloading release binaries and comparing identical prompts with Sonnet via GitHub Copilot:

| Version | Sonnet speed   |
| ------- | -------------- |
| v1.3.5  | Fast           |
| v1.3.6  | Fast           |
| v1.3.7  | Extremely slow |
| v1.3.10 | Extremely slow |
| v1.3.12 | Extremely slow |
| v1.3.13 | Extremely slow |

GPT models remain fast across all versions.

Supposition:

In v1.3.4, the bash tool description was deliberately made static to improve cache hit rates (changelog: "Adjust bash tool description to increase cache hit rates between projects").

In v1.3.7, the PowerShell support PR (#16069) reintroduced dynamic, per-project values into the bash tool description.
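
If that supposition is right, the mechanism would be prefix-cache invalidation: Anthropic-style prompt caching matches on an exact prefix of the request, so any per-project value embedded early in the system prompt forces a full re-read on every session. A minimal illustrative sketch (not opencode's actual code; the prompt layout and the `cwd=` suffix are hypothetical stand-ins for the dynamic placeholders):

```python
# Illustrative only: shows why a dynamic value in an early tool
# description shortens the cacheable common prefix across projects.

STATIC_DESC = "Executes shell commands in a persistent session."

def system_prompt(tool_desc: str) -> str:
    # Hypothetical layout: tool descriptions sit near the top of the prompt.
    return f"You are a coding agent.\nTool `bash`: {tool_desc}\n..."

def common_prefix_len(a: str, b: str) -> int:
    # Length of the byte-identical prefix two prompts share.
    n = 0
    for x, y in zip(a, b):
        if x != y:
            break
        n += 1
    return n

# Static description: two projects produce identical prompts -> cache hit.
p1 = system_prompt(STATIC_DESC)
p2 = system_prompt(STATIC_DESC)
assert p1 == p2

# Dynamic description (e.g. a per-project path): prompts diverge at the
# first dynamic byte, so everything after it misses the cache.
d1 = system_prompt(STATIC_DESC + " cwd=/home/alice/project-a")
d2 = system_prompt(STATIC_DESC + " cwd=/home/bob/project-b")
print(common_prefix_len(d1, d2) < len(d1))
```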

Environment

  • Provider: GitHub Copilot
  • Model: Claude Sonnet 4.6
  • Fast version: v1.3.6
  • Slow version: v1.3.7 through v1.3.13

Related

  • v1.3.4 changelog: "Adjust bash tool description to increase cache hit rates between projects"
  • feat(windows): add first-class pwsh/powershell support #16069 (PowerShell support PR that introduced the dynamic placeholders)
  • applyCaching() in packages/opencode/src/provider/transform.ts sets cache breakpoints on system messages and last conversation turns
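
For context, a sketch of how Anthropic-style cache breakpoints are expressed on the wire (this is only the request shape; the actual placement logic in `applyCaching()` may differ, and the model id and prompt text below are placeholders):

```python
# Illustrative request shape only -- not the actual applyCaching() code.
# In the Anthropic Messages API, a `cache_control` marker on a content
# block sets a cache breakpoint: everything up to and including that
# block is cached, and reused only if the prefix matches exactly.

def build_request(system_text: str, history: list[dict]) -> dict:
    request = {
        "model": "claude-sonnet-4-5",  # placeholder model id
        "max_tokens": 1024,
        "system": [
            {
                "type": "text",
                "text": system_text,
                # Breakpoint after the (ideally static) system prompt.
                "cache_control": {"type": "ephemeral"},
            }
        ],
        "messages": list(history),
    }
    # Breakpoint on the last conversation turn, mirroring the issue's
    # description of applyCaching(). Note that dynamic text *before*
    # the first breakpoint still defeats the cache entirely.
    if request["messages"]:
        last_blocks = request["messages"][-1]["content"]
        last_blocks[-1]["cache_control"] = {"type": "ephemeral"}
    return request

req = build_request(
    "You are a coding agent. Tool `bash`: ...",
    [{"role": "user", "content": [{"type": "text", "text": "list files"}]}],
)
```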

Plugins

None

OpenCode version

1.3.13

Steps to reproduce

  1. Configure OpenCode with GitHub Copilot as provider and select a Claude Sonnet model

  2. Download v1.3.6 and v1.3.7 binaries:
    ```bash
    mkdir -p /tmp/opencode-test
    curl -L -o /tmp/opencode-test/v1.3.6.zip "https://github.com/anomalyco/opencode/releases/download/v1.3.6/opencode-darwin-arm64.zip"
    curl -L -o /tmp/opencode-test/v1.3.7.zip "https://github.com/anomalyco/opencode/releases/download/v1.3.7/opencode-darwin-arm64.zip"
    unzip -o /tmp/opencode-test/v1.3.6.zip -d /tmp/opencode-test/v1.3.6
    unzip -o /tmp/opencode-test/v1.3.7.zip -d /tmp/opencode-test/v1.3.7
    ```

  3. Open any project directory and run v1.3.6:
    ```bash
    /tmp/opencode-test/v1.3.6/opencode
    ```

  4. Send a simple prompt (e.g. "list all files in this directory") → observe fast response

  5. Exit and run v1.3.7 in the same project:
    ```bash
    /tmp/opencode-test/v1.3.7/opencode
    ```

  6. Send the same prompt → observe significantly slower response

Expected: Both versions respond at similar speed.
Actual: v1.3.7+ is dramatically slower with Anthropic models. GPT models remain fast in both versions.

Screenshot and/or share link

No response

Operating System

macOS 26.4

Terminal

iTerm2

Labels

  • bug: Something isn't working
  • core: Anything pertaining to core functionality of the application (opencode server stuff)
  • perf: Indicates a performance issue or need for optimization
