
[Auto] context_overflow: The conversation is too long for this model. Try /undo to remove recent turns or #99

@igorcosta

Description


Auto-Reported Error

| Field | Value |
| --- | --- |
| Error Type | context_overflow |
| CLI Version | 0.8.3 |
| Model | z-ai/glm-5.1 |
| Provider | openrouter |
| Platform | darwin 25.4.0 |
| Session ID | N/A |
| Report ID | XKYm2OLD1SKuqeuz4fene |

Error Message

```
The conversation is too long for this model. Try /undo to remove recent turns or /new to start fresh.
This endpoint's maximum context length is 202752 tokens. However, you requested about 203889 tokens (183648 of text input, 4241 of tool input, 16000 in the output). Please reduce the length of either one, or use the context-compression plugin to compress your prompt automatically.
```
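The numbers in the error message account for the overflow exactly: text input, tool input, and the reserved output budget sum to 203889 tokens, 1137 over the endpoint's 202752-token limit. A minimal sketch of that budget check, using only the figures reported above (the function and names are illustrative, not Autohand's actual code):

```python
# Hypothetical sketch of the token-budget check behind this error.
# The limit and input sizes come from the error message above; the
# function itself is illustrative, not Autohand's implementation.

MAX_CONTEXT = 202_752  # endpoint's maximum context length (tokens)

def tokens_over_limit(text_input: int, tool_input: int, max_output: int) -> int:
    """Return how many tokens the request exceeds the limit by (0 if it fits)."""
    requested = text_input + tool_input + max_output
    return max(0, requested - MAX_CONTEXT)

# Values reported in the error: 183648 text + 4241 tool + 16000 output = 203889
overflow = tokens_over_limit(183_648, 4_241, 16_000)
print(overflow)  # 1137 tokens over the limit
```

Note that the reserved 16000-token output budget counts against the context window, so trimming either the conversation (`/undo`, `/new`) or the requested output length would bring the request under the limit.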

Stack Trace

```
ApiError: The conversation is too long for this model. Try /undo to remove recent turns or /new to start fresh.
This endpoint's maximum context length is 202752 tokens. However, you requested about 203889 tokens (183648 of text input, 4241 of tool input, 16000 in the output). Please reduce the length of either one, or use the context-compression plugin to compress your prompt automatically.
    at makeError (/$bunfs/root/autohand-macos-arm64:32071:22)
    at async makeRequest (/$bunfs/root/autohand-macos-arm64:32827:37)
    at async complete (/$bunfs/root/autohand-macos-arm64:32785:48)
    at processTicksAndRejections (native:7:39)
```

Session Context

  • Conversation length: 7 messages
  • Context usage: 209%
  • Last tools used: N/A
  • Retry attempt: N/A

Additional Context

{}

Auto-reported by Autohand CLI v0.8.3 | Report ID: XKYm2OLD1SKuqeuz4fene
