
Conversation


@CodeYHJ CodeYHJ commented Jan 5, 2026

In the LLMResponse logic, the eventBus notification was never executed.

Fixes #1245

Summary by CodeRabbit

  • New Features
    • Streaming responses are now broadcast to all application windows, keeping message updates synchronized across the interface.



coderabbitai bot commented Jan 5, 2026

📝 Walkthrough

Walkthrough

The pull request modifies the LLM event handler to broadcast the complete message to all renderer windows via the event bus when finishing a streaming response, ensuring that downstream listeners can receive the completion notification.

Changes

Cohort / File(s) Summary
Event broadcast in streaming response handler
src/main/presenter/agentPresenter/streaming/llmEventHandler.ts
Added broadcast of the complete message to all renderer windows using STREAM_EVENTS.RESPONSE after message content is updated via editMessage in handleLLMAgentResponse
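The change summarized above can be sketched as follows. The names `STREAM_EVENTS.RESPONSE`, `eventBus`, and the all-windows broadcast come from this review, but the minimal `EventBus` class below is a self-contained stand-in for the project's real src/main/eventbus.ts (which routes through `mainWindow.webContents.send()`), not the actual implementation.

```typescript
// Hypothetical sketch: a minimal event bus that broadcasts to all
// "renderer windows", modeled here as plain callbacks so the example
// runs without Electron.
type Listener = (event: string, payload: unknown) => void

const STREAM_EVENTS = { RESPONSE: 'stream:response' } as const

class EventBus {
  private windows: Listener[] = []

  addWindow(listener: Listener): void {
    this.windows.push(listener)
  }

  // Broadcast an event to every registered window.
  sendToAllWindows(event: string, payload: unknown): void {
    for (const send of this.windows) send(event, payload)
  }
}

// After editMessage() persists the final content, the handler now
// broadcasts the complete message, so every window (including any
// sound-effect listener) is notified that the stream finished.
const eventBus = new EventBus()
const received: string[] = []
eventBus.addWindow((event) => received.push(`win1:${event}`))
eventBus.addWindow((event) => received.push(`win2:${event}`))

eventBus.sendToAllWindows(STREAM_EVENTS.RESPONSE, { eventId: 'msg-1', content: 'done' })
console.log(received) // both windows saw the completion event
```

The point of the fix is the ordering: broadcast only after the message content is finalized, so listeners always observe the complete message.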

Estimated code review effort

🎯 1 (Trivial) | ⏱️ ~3 minutes

Poem

🐰 A sound was lost in the streaming sea,
The message arrived, but quiet it'd be!
Now the event bell rings loud and clear,
Each window hears the response cheer! 🔔✨

Pre-merge checks and finishing touches

✅ Passed checks (5 passed)
Check name Status Explanation
Description Check ✅ Passed Check skipped - CodeRabbit’s high-level summary is enabled.
Title check ✅ Passed The title 'fix: Sound effect not triggered on LLM stream response' directly matches the bug being fixed - it clearly summarizes the main issue and solution.
Linked Issues check ✅ Passed The PR addresses issue #1245 by adding event bus notification in the LLM response handler, ensuring sound effects are triggered when LLM stream responses complete.
Out of Scope Changes check ✅ Passed The single change broadcasts the message via event bus in handleLLMAgentResponse, which is directly scoped to fixing the missing sound effect notification for LLM streams.
Docstring Coverage ✅ Passed No functions found in the changed files to evaluate docstring coverage. Skipping docstring coverage check.

📜 Recent review details

Configuration used: defaults

Review profile: CHILL

Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between f8a2c62 and e834227.

📒 Files selected for processing (1)
  • src/main/presenter/agentPresenter/streaming/llmEventHandler.ts
🧰 Additional context used
📓 Path-based instructions (8)
**/*.{ts,tsx,js,jsx}

📄 CodeRabbit inference engine (CLAUDE.md)

Use English for logs and comments in TypeScript/JavaScript code

Files:

  • src/main/presenter/agentPresenter/streaming/llmEventHandler.ts
**/*.{ts,tsx}

📄 CodeRabbit inference engine (CLAUDE.md)

Use TypeScript with strict type checking enabled

Use OxLint for linting JavaScript and TypeScript files; ensure lint-staged hooks and typecheck pass before commits

Files:

  • src/main/presenter/agentPresenter/streaming/llmEventHandler.ts
src/main/presenter/**/*.ts

📄 CodeRabbit inference engine (CLAUDE.md)

src/main/presenter/**/*.ts: Use EventBus to broadcast events from main to renderer via mainWindow.webContents.send()
Implement one presenter per functional domain in the main process

Files:

  • src/main/presenter/agentPresenter/streaming/llmEventHandler.ts
src/main/**/*.ts

📄 CodeRabbit inference engine (CLAUDE.md)

src/main/**/*.ts: Use EventBus from src/main/eventbus.ts for decoupled inter-process communication
Context isolation must be enabled with preload scripts for secure IPC communication

Electron main process code should reside in src/main/, with presenters organized in presenter/ subdirectory (Window, Tab, Thread, Mcp, Config, LLMProvider), and app events managed via eventbus.ts

Files:

  • src/main/presenter/agentPresenter/streaming/llmEventHandler.ts
**/*.{js,ts,tsx,jsx,vue,mjs,cjs}

📄 CodeRabbit inference engine (.cursor/rules/development-setup.mdc)

All logs and comments must be in English

Files:

  • src/main/presenter/agentPresenter/streaming/llmEventHandler.ts
**/*.{js,ts,tsx,jsx,mjs,cjs}

📄 CodeRabbit inference engine (.cursor/rules/development-setup.mdc)

Use OxLint as the linter

Files:

  • src/main/presenter/agentPresenter/streaming/llmEventHandler.ts
**/*.{js,ts,tsx,jsx,vue,json,mjs,cjs}

📄 CodeRabbit inference engine (.cursor/rules/development-setup.mdc)

Use Prettier as the code formatter

Files:

  • src/main/presenter/agentPresenter/streaming/llmEventHandler.ts
**/*.{ts,tsx,vue}

📄 CodeRabbit inference engine (AGENTS.md)

**/*.{ts,tsx,vue}: Use camelCase for variable and function names; use PascalCase for types and classes; use SCREAMING_SNAKE_CASE for constants
Configure Prettier with single quotes, no semicolons, and line width of 100 characters. Run pnpm run format after completing features

Files:

  • src/main/presenter/agentPresenter/streaming/llmEventHandler.ts
🧠 Learnings (3)
📓 Common learnings
Learnt from: CR
Repo: ThinkInAIXYZ/deepchat PR: 0
File: CLAUDE.md:0-0
Timestamp: 2026-01-05T02:40:52.831Z
Learning: Applies to src/main/presenter/llmProviderPresenter/providers/*.ts : LLM provider implementations must follow the standardized event interface with `coreStream` method
📚 Learning: 2026-01-05T02:40:52.831Z
Learnt from: CR
Repo: ThinkInAIXYZ/deepchat PR: 0
File: CLAUDE.md:0-0
Timestamp: 2026-01-05T02:40:52.831Z
Learning: Applies to src/main/presenter/llmProviderPresenter/providers/*.ts : LLM provider implementations must follow the standardized event interface with `coreStream` method

Applied to files:

  • src/main/presenter/agentPresenter/streaming/llmEventHandler.ts
📚 Learning: 2026-01-05T02:40:52.831Z
Learnt from: CR
Repo: ThinkInAIXYZ/deepchat PR: 0
File: CLAUDE.md:0-0
Timestamp: 2026-01-05T02:40:52.831Z
Learning: Applies to src/main/presenter/**/*.ts : Use EventBus to broadcast events from main to renderer via `mainWindow.webContents.send()`

Applied to files:

  • src/main/presenter/agentPresenter/streaming/llmEventHandler.ts
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (1)
  • GitHub Check: build-check (x64)
🔇 Additional comments (1)
src/main/presenter/agentPresenter/streaming/llmEventHandler.ts (1)

234-234: The implementation correctly handles sound effects; they are triggered selectively, not on every chunk. The playToolcallSound() is called only when msg.tool_call === 'start' (once per tool call), and playTypewriterSound() is throttled to 120ms intervals during text streaming. The STREAM_EVENTS.RESPONSE broadcasts serve multiple purposes beyond sound—primarily UI state updates and content accumulation—and the ALL_WINDOWS target is appropriate for the multi-window architecture. No changes needed.

Likely an incorrect or invalid review comment.
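The throttling behavior described in the comment above can be illustrated with a small sketch. The `playTypewriterSound` name and the 120 ms interval come from the review comment; the `throttle` helper itself is a generic, hypothetical stand-in (time is passed in explicitly to keep the example deterministic), not the project's actual code.

```typescript
// Hypothetical sketch of a 120 ms throttle: the wrapped function runs
// at most once per interval, no matter how often stream chunks arrive.
function throttle(fn: () => void, intervalMs: number): (now: number) => void {
  let lastRun = -Infinity
  return (now: number) => {
    if (now - lastRun >= intervalMs) {
      lastRun = now
      fn()
    }
  }
}

let plays = 0
const playTypewriterSound = throttle(() => plays++, 120)

// Simulate text chunks arriving every 30 ms for 600 ms.
for (let t = 0; t <= 600; t += 30) playTypewriterSound(t)

console.log(plays) // 6 plays: t = 0, 120, 240, 360, 480, 600
```

This is why broadcasting on every chunk does not spam the sound effect: the throttle bounds how often the sound fires regardless of chunk frequency.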



@zerob13 zerob13 merged commit 4b1e93e into ThinkInAIXYZ:dev Jan 5, 2026
2 checks passed
@zerob13 zerob13 mentioned this pull request Jan 5, 2026


Development

Successfully merging this pull request may close these issues.

[BUG] Sound effect not triggered on LLM STREAM response
