
feat(router): LLM-generated contextual acknowledgment messages#483

Merged
zbigniewsobiecki merged 1 commit into dev from feat/llm-ack-messages
Feb 22, 2026

Conversation

@zbigniewsobiecki
Member

Summary

  • Replace hardcoded INITIAL_MESSAGES lookups in webhook ack handlers (Trello, GitHub, JIRA) with a single-shot LLM call that generates a short, context-aware acknowledgment message reflecting the actual request content (card name, PR title, comment text)
  • Uses the existing progressModel config (defaults to openrouter:google/gemini-2.5-flash-lite) with a 5-second timeout — zero new configuration required
  • Gracefully falls back to static INITIAL_MESSAGES on any failure: no model configured, missing API key, LLM error, timeout, or empty output
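
The 5-second timeout described above could be implemented with a `Promise.race`-based guard, sketched below; the helper name `withTimeout` and the call shape are illustrative assumptions, not the PR's actual code:

```typescript
// Illustrative timeout guard (assumed helper, not the PR's real API):
// resolves with the LLM result if it arrives within `ms`, otherwise
// rejects so the caller can fall back to static INITIAL_MESSAGES.
async function withTimeout<T>(work: Promise<T>, ms: number): Promise<T> {
  let timer: ReturnType<typeof setTimeout> | undefined;
  const timeout = new Promise<never>((_, reject) => {
    timer = setTimeout(() => reject(new Error("ack generation timed out")), ms);
  });
  try {
    return await Promise.race([work, timeout]);
  } finally {
    clearTimeout(timer); // avoid leaking the timer on the happy path
  }
}
```

Clearing the timer in `finally` matters here: without it, every successful ack call would keep the process alive for up to five extra seconds.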

Changes

| File | Change |
| --- | --- |
| src/router/ackMessageGenerator.ts | New — generateAckMessage(), context extractors for each provider, LLM call with fallback chain |
| src/router/trello.ts | Use generateAckMessage() in tryPostTrelloAck() |
| src/router/github.ts | Use generateAckMessage() in tryPostGitHubAck() |
| src/router/jira.ts | Use generateAckMessage() in tryPostJiraAck() |
| tests/unit/router/ackMessageGenerator.test.ts | New — 30 tests covering context extractors + all fallback scenarios |
| tests/unit/router/{trello,github,jira}.test.ts | Updated mocks for new ack generator module |
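
A per-provider context extractor might look like the sketch below for Trello. The nested field path follows Trello's webhook payload shape (`action.data.card.name`), but the function name and null-on-missing contract are assumptions for illustration:

```typescript
// Hypothetical Trello context extractor (name and contract assumed).
// Returns the card name to feed into the ack prompt, or null when the
// payload lacks one so the caller can fall back to a static message.
interface TrelloWebhookPayload {
  action?: { data?: { card?: { name?: string } } };
}

function extractTrelloContext(payload: TrelloWebhookPayload): string | null {
  return payload.action?.data?.card?.name ?? null;
}
```

Optional chaining keeps the extractor safe against partial payloads, which webhooks routinely deliver.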

Fallback chain

generateAckMessage()
  → LLM generates contextual message     ✓ happy path
  → No progressModel configured          → INITIAL_MESSAGES[agentType]
  → No OPENROUTER_API_KEY credential     → INITIAL_MESSAGES[agentType]
  → LLM call fails (network, auth, etc.) → INITIAL_MESSAGES[agentType]
  → LLM call exceeds 5s timeout          → INITIAL_MESSAGES[agentType]
  → LLM returns empty output             → INITIAL_MESSAGES[agentType]
  → Unknown agentType                    → generic "Working on it..."
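
The chain above can be sketched as one guarded call. The `callModel` parameter (null when no progressModel or API key is configured) and the placeholder fallback texts are assumptions, not the PR's actual implementation:

```typescript
// Placeholder fallback texts; the real INITIAL_MESSAGES entries differ.
const INITIAL_MESSAGES: Record<string, string> = {
  trello: "On it! Reviewing the card now.",
  github: "On it! Reviewing the PR now.",
  jira: "On it! Reviewing the issue now.",
};

// callModel is null when no model or credential is configured; otherwise
// it wraps the timeout-guarded LLM call. Every failure mode collapses to
// the static fallback, and an unknown agentType gets the generic message.
async function generateAckMessage(
  agentType: string,
  callModel: (() => Promise<string>) | null,
): Promise<string> {
  const fallback = INITIAL_MESSAGES[agentType] ?? "Working on it...";
  if (!callModel) return fallback; // no model configured / no API key
  try {
    const text = (await callModel()).trim();
    return text.length > 0 ? text : fallback; // empty output falls back
  } catch {
    return fallback; // LLM error or timeout falls back
  }
}
```

Note the `try/catch` makes no distinction between error causes: any rejection, including the timeout, lands on the same static message, which matches the chain drawn above.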

Test plan

  • 30 unit tests for ackMessageGenerator (context extractors, all fallback paths, env var restoration, timeout)
  • Existing router handler tests updated and passing (49 tests)
  • Full test suite passing (2600 tests)
  • TypeScript compilation clean
  • Biome lint clean (0 errors)

🤖 Generated with Claude Code

Replace hardcoded INITIAL_MESSAGES lookups in webhook ack handlers with
a single-shot LLM call that produces a short, context-aware message
reflecting the actual request (e.g., card name, PR title, comment text).

Uses the existing progressModel config (defaults to gemini-2.5-flash-lite
via OpenRouter) with a 5-second timeout. On any failure — no model
configured, missing API key, LLM error, timeout, or empty output —
gracefully falls back to the static INITIAL_MESSAGES.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
@zbigniewsobiecki zbigniewsobiecki merged commit 0333d94 into dev Feb 22, 2026
5 checks passed
@zbigniewsobiecki zbigniewsobiecki deleted the feat/llm-ack-messages branch February 22, 2026 17:15