
grep: per-file match cap causes LLM retries (10 matches hardcoded) #617

@FlorianBruniaux

Description

Problem

rtk grep caps results at 10 matches per file (`.take(10)` at src/grep_cmd.rs:113), with a global max of 50.

This produces a "+N more" truncation that tells Claude signal exists but withholds it. Claude then re-runs the search with different arguments to recover the missing results, burning more tokens than a raw grep would have.

Root cause

src/grep_cmd.rs:113:

.take(10)  // hardcoded per-file cap

src/main.rs:309: global max = 50.
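
The effect of the hardcoded cap can be illustrated with a minimal sketch. This is not RTK's actual code; `cap_matches` and its types are hypothetical, standing in for the `.take(10)` call at src/grep_cmd.rs:113 and the "+N more" accounting around it:

```rust
// Hypothetical sketch of per-file capping; names are illustrative,
// not RTK internals. Returns the kept matches and the hidden count
// that would be reported as "+N more".
fn cap_matches(matches: Vec<String>, per_file_cap: Option<usize>) -> (Vec<String>, usize) {
    let total = matches.len();
    match per_file_cap {
        // With a cap, extra matches are dropped and surface only as "+N more".
        Some(cap) if total > cap => {
            let kept: Vec<String> = matches.into_iter().take(cap).collect();
            (kept, total - cap)
        }
        // Without a per-file cap (the proposed fix), everything is returned.
        _ => (matches, 0),
    }
}

fn main() {
    let matches: Vec<String> = (0..15).map(|i| format!("line {i}")).collect();

    // Current behavior: .take(10) keeps 10 matches and hides 5 behind "+5 more".
    let (kept, hidden) = cap_matches(matches.clone(), Some(10));
    assert_eq!((kept.len(), hidden), (10, 5));

    // Proposed behavior: no per-file cap, nothing withheld, no retry trigger.
    let (kept, hidden) = cap_matches(matches, None);
    assert_eq!((kept.len(), hidden), (15, 0));
}
```

The hidden count is exactly the signal that sends the LLM into a retry loop: it proves more matches exist without showing them.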

Impact

  • LLM sees "+N more" and loops to find the missing matches
  • Each retry costs more tokens than the original search
  • Net result: RTK makes token consumption worse than raw grep

Proposed fix

  • Remove per-file cap (or raise to 50)
  • Raise global max from 50 → 100
  • Add optional --max-per-file <N> flag for users who want stricter limits
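
A minimal sketch of how the proposed --max-per-file flag could be parsed, using only std. RTK's real argument handling is not shown in this issue, so the function name and parsing style here are assumptions:

```rust
// Hypothetical parser for the proposed --max-per-file <N> flag.
// Returns Some(cap) when the flag is present with a valid number,
// and None otherwise, i.e. the new default of "no per-file cap".
fn parse_max_per_file(args: &[String]) -> Option<usize> {
    let mut it = args.iter();
    while let Some(arg) = it.next() {
        if arg == "--max-per-file" {
            // The value following the flag, if parseable, becomes the cap.
            return it.next().and_then(|v| v.parse().ok());
        }
    }
    // Flag absent: no per-file cap (proposed default behavior).
    None
}

fn main() {
    let args: Vec<String> = vec!["--max-per-file".into(), "25".into()];
    assert_eq!(parse_max_per_file(&args), Some(25));
    assert_eq!(parse_max_per_file(&[]), None);
}
```

Defaulting to None keeps the fix backwards-compatible for output shape while letting users who want stricter limits opt back in.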

Acceptance criteria

  • rtk grep "pattern" returns all matches per file up to the global max
  • No "+N more" truncation that signals hidden results
  • Token savings still ≥50% vs raw grep (from deduplication and grouping, not caps)

Metadata

Labels

  • P0 (Critical: causes LLM loops, worse than raw command)
  • bug (Something isn't working)
  • filter-quality (Filter produces incorrect/truncated signal)
