fix: restore reasoning streaming and add exception safety to agent loop#195
Open
catiglu wants to merge 4 commits intolsdefine:mainfrom
Open
fix: restore reasoning streaming and add exception safety to agent loop#195catiglu wants to merge 4 commits intolsdefine:mainfrom
catiglu wants to merge 4 commits intolsdefine:mainfrom
Conversation
核心改进 (Top3 from 架构审计): 1. RULES lsdefine#11 强化: 报告 MUST 包含 verify_claims markdown 验证表,否则无效 - 堵死 C1 漏洞 (Agent 跳过验证直接报告) - 显式路径 scripts/verify_claims.py 2. META-SOP 公理lsdefine#5 追加导入自检: - 使用 verify_claims.py 前先 from scripts.verify_claims import VerificationResult - 导入失败则回退手动 file_read 路径 - 堵死 C3 漏洞 (verify_claims.py 损坏未检测) 3. RULES 区全部分类 ([致命型]/[隐蔽型]/[效率型]): - [致命型]×5: 交叉验证/闭环/进程/SOP/验证闭环 - [隐蔽型]×5: Web 搜索/编码安全/web JS/飞书 CLI/Everything - [效率型]×2: 搜索先行/窗口枚举 - 为 RULES 膨胀做准备,紧急时优先关注致命型 新增文件: - scripts/verify_claims.py (144 行通用验证工具) - scripts/search*.py (6 个搜索后端) - tests/test_*.py (2 个测试) 验证: 9/9 PASS, 10/10 物理证据闭环
Key changes from remote branch: agentmain.py (L152-224): + Added try/except/finally block for crash recovery + Real-time chunk collection to prevent response loss + abort_flag support for graceful Ctrl+C interruption + Slash command pre-check (/quit, /help intercept) llmcore.py (L263): + Restored 'yield text' for reasoning_content streaming - Remote version was missing yield → reasoning black-boxed llmcore.py (L330): + Restored 'yield reasoning' for thinking block rendering - Remote version had blocks.append() only → delayed UI update Impact: Agent now provides production-grade streaming responses + error resilience.
Key changes: 1. agentmain.py: Added try/except/finally for crash recovery + real-time chunk collection 2. agentmain.py: abort_flag support for graceful Ctrl+C interruption 3. agentmain.py: Slash command pre-check (/quit, /help intercept) 4. llmcore.py: Restored yield text for reasoning_content streaming 5. llmcore.py: Restored yield reasoning for thinking block rendering
This file contains hidden or bidirectional Unicode text that may be interpreted or compiled differently than what appears below. To review, open the file in an editor that reveals hidden Unicode characters.
Learn more about bidirectional Unicode characters
Sign up for free
to join this conversation on GitHub.
Already have an account?
Sign in to comment
Add this suggestion to a batch that can be applied as a single commit.This suggestion is invalid because no changes were made to the code.Suggestions cannot be applied while the pull request is closed.Suggestions cannot be applied while viewing a subset of changes.Only one suggestion per line can be applied in a batch.Add this suggestion to a batch that can be applied as a single commit.Applying suggestions on deleted lines is not supported.You must change the existing code in this line in order to create a valid suggestion.Outdated suggestions cannot be applied.This suggestion has been applied or marked resolved.Suggestions cannot be applied from pending reviews.Suggestions cannot be applied on multi-line comments.Suggestions cannot be applied while the pull request is queued to merge.Suggestion cannot be applied right now. Please check back later.
🔧 Key Changes
This PR restores reasoning/streaming functionality that was broken in the remote merge, plus adds production-grade exception safety.
1. Exception Safety in Agent Loop (
agentmain.py:L143-195)abort_flagsupport for graceful Ctrl+C interruption2. Reasoning Content Streaming (
llmcore.py:L243-244)yield textfor reasoning_content streaming3. Thinking Block Rendering (
llmcore.py:L259-260)blocks.append({"type": "thinking", ...})with proper rendering📊 Impact
🚀 How to Test
python agentmain.py