Auto-compact fails for Bedrock-hosted Claude models due to incorrect context limit #16356

Description

@nathanl

Problem

I'm using Opus 4.6 via Amazon Bedrock in OpenCode. Auto-compaction is not happening, resulting in broken sessions.
Claude suggests this is an issue with models.dev reporting the Anthropic token limit rather than the Bedrock one.

In a session, at the top of the sidebar I see "200,084 tokens" and "20% used", indicating a 1M token limit.

But at this point the session has the error undefined: The model returned the following errors: prompt is too long: 200148 tokens > 200000 maximum.
If I try to manually /compact, that fails too: undefined: The model returned the following errors: messages.5.content.6: thinking or redacted_thinking blocks in the latest assistant message cannot be modified. These blocks must remain as they were in the original response.
The only thing I can do is abandon the session.
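For context, the displayed percentage lets you back out the limit OpenCode is assuming. A quick sketch using the numbers above:

```python
# Back-of-envelope check using the figures from this report.
used_tokens = 200_084      # tokens shown in the sidebar
displayed_fraction = 0.20  # "20% used"

# The context limit OpenCode must be assuming:
implied_limit = used_tokens / displayed_fraction
print(implied_limit)  # ~1,000,000 -> Anthropic's 1M-context figure, not Bedrock's

# Bedrock's actual cap for this model:
bedrock_limit = 200_000
print(used_tokens > bedrock_limit)  # True: requests already exceed it
```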

Expected Behavior

OpenCode should use the 200k token limit that Bedrock actually enforces (possibly by fixing what models.dev reports for Bedrock-hosted models).

Workaround

Override the limit in ~/.config/opencode/opencode.json:

{
  "provider": {
    "amazon-bedrock": {
      "models": {
        "us.anthropic.claude-opus-4-6-v1": {
          "limit": {
            "context": 200000,
            "output": 32000
          }
        }
      }
    }
  }
}

With this override, a new session at 20k tokens shows "10% used", consistent with a 200k limit.

Plugins

None

OpenCode version

1.2.20

Steps to reproduce

  1. Use Claude Opus 4.6 via Amazon Bedrock
  2. Continue session until 200k tokens are used

Screenshot and/or share link

(screenshot attached)

Operating System

MacOS Tahoe 26.3

Terminal

iTerm2 3.6.8

Labels

bug: Something isn't working
core: Anything pertaining to core functionality of the application (opencode server stuff)
