
feat(compaction): Handle when models have input limit#6875

Open
shantur wants to merge 1 commit into anomalyco:dev from shantur:feature/handle-input-limits

Conversation


@shantur shantur commented Jan 4, 2026

No description provided.

@shantur shantur force-pushed the feature/handle-input-limits branch from 7bb2959 to b7de78d on January 4, 2026 at 22:49
@thdxr thdxr force-pushed the dev branch 3 times, most recently from f1ae801 to 08fa7f7 on January 30, 2026 at 14:37
lightsofapollo added a commit to lightsofapollo/opencode that referenced this pull request Feb 10, 2026
…nput is set

isOverflow() uses `limit.input` directly without subtracting output
token headroom, unlike the `context - output` fallback path. This means
models with an explicit input limit (e.g. Claude with prompt caching)
don't trigger compaction until the full input budget is consumed,
leaving zero room for the next response.
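The fix described above can be sketched as follows. This is a hedged illustration, not opencode's actual source: the `isOverflow` signature, the `ModelLimits` shape, and the field names are assumptions chosen to mirror the commit message, which says the `limit.input` path should subtract output headroom just like the `context - output` fallback.

```typescript
// Illustrative sketch (assumed names, not opencode's real API).
interface ModelLimits {
  context: number; // total context window in tokens
  output: number;  // output token budget to reserve for the next response
  input?: number;  // explicit input limit, if the model declares one
}

function isOverflow(usedTokens: number, limit: ModelLimits): boolean {
  if (limit.input !== undefined) {
    // Fixed path: subtract output headroom from the explicit input limit.
    // The pre-fix bug compared usedTokens against limit.input directly,
    // so compaction never fired until the full input budget was consumed.
    return usedTokens > limit.input - limit.output;
  }
  // Fallback path, which already reserved output headroom.
  return usedTokens > limit.context - limit.output;
}
```

With the commit's near-boundary numbers (198K used, 200K limit, 32K output budget), this sketch triggers compaction on both paths rather than only on the fallback.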

Three tests added:
- Near-boundary case: 198K/200K used, should compact but doesn't
- Parity test: same tokens correctly trigger compaction without limit.input
- Asymmetry test: identical 200K models diverge by exactly 32K (the missing output headroom)
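The arithmetic behind the asymmetry test can be written out directly. The numbers come from the commit message (200K context, 32K output headroom); the variable names are illustrative, not the real test harness.

```typescript
// Pre-fix thresholds for two otherwise identical 200K models
// (illustrative arithmetic; names are assumptions).
const context = 200_000;
const output = 32_000;

// With limit.input set, the buggy path compared against the full input budget.
const buggyThreshold = context;
// Without limit.input, the fallback path reserved output headroom.
const fallbackThreshold = context - output;

// The two models diverge by exactly the missing output headroom: 32K tokens.
const divergence = buggyThreshold - fallbackThreshold;
```

The fix collapses both thresholds to `context - output`, so the divergence goes to zero.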

Related: anomalyco#10634, anomalyco#8089, anomalyco#11086, anomalyco#12621
See also: anomalyco#6875, anomalyco#12924
