Bug Description
During LLM streaming in the desktop UI (and web app), text content appears duplicated/repeated. After quit+reopen, content displays correctly (loaded from DB).
Steps to Reproduce
- Open the opencode desktop app or web UI
- Send a message to trigger LLM streaming
- Observe the streaming text output — content appears repeated/duplicated
- Quit and reopen the app — content now displays correctly
Expected Behavior
Streaming text should appear incrementally without duplication.
Actual Behavior
Text content is duplicated during streaming. For example, instead of showing `Hello world`, the UI shows `Hello worldHello world`.
Root Cause
In `global-sdk.tsx`, SSE events are batched in 16ms windows with coalescing. When `message.part.updated` events coalesce within a single batch (the `text-end` update replaces the `text-start` update at the same queue index), stale `message.part.delta` events remain later in the queue and get applied on top of the authoritative full text.
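The failure mode can be sketched with a toy event queue. All names here (`PartEvent`, `enqueueBuggy`, `enqueueFixed`, `applyBatch`) are invented for illustration and are not the actual `global-sdk.tsx` code; this is a minimal sketch assuming the coalescing behavior described above.

```typescript
// Simplified model of per-batch SSE event coalescing (hypothetical names).
type PartEvent =
  | { type: "message.part.updated"; partID: string; text: string }
  | { type: "message.part.delta"; partID: string; delta: string };

// Flushes a batch: `updated` carries the authoritative full text,
// `delta` appends an increment to the current text.
function applyBatch(queue: PartEvent[], parts: Map<string, string>): void {
  for (const evt of queue) {
    if (evt.type === "message.part.updated") {
      parts.set(evt.partID, evt.text);
    } else {
      parts.set(evt.partID, (parts.get(evt.partID) ?? "") + evt.delta);
    }
  }
}

// Buggy coalescing: a later `updated` replaces the earlier one in place,
// but deltas queued between them survive and are re-applied on top of
// the authoritative full text.
function enqueueBuggy(queue: PartEvent[], evt: PartEvent): void {
  if (evt.type === "message.part.updated") {
    const i = queue.findIndex(
      (q) => q.type === "message.part.updated" && q.partID === evt.partID,
    );
    if (i !== -1) {
      queue[i] = evt;
      return;
    }
  }
  queue.push(evt);
}

// One possible fix: an authoritative `updated` also drops the part's
// queued deltas, since its full text already includes them.
function enqueueFixed(queue: PartEvent[], evt: PartEvent): void {
  if (evt.type === "message.part.updated") {
    for (let i = queue.length - 1; i >= 0; i--) {
      if (queue[i].partID === evt.partID) queue.splice(i, 1);
    }
  }
  queue.push(evt);
}

// Events landing in one 16ms batch window: text-start, two deltas,
// then the text-end update carrying the full text.
const batch: PartEvent[] = [
  { type: "message.part.updated", partID: "p1", text: "" },
  { type: "message.part.delta", partID: "p1", delta: "Hello " },
  { type: "message.part.delta", partID: "p1", delta: "world" },
  { type: "message.part.updated", partID: "p1", text: "Hello world" },
];

const buggyParts = new Map<string, string>();
const buggyQueue: PartEvent[] = [];
for (const e of batch) enqueueBuggy(buggyQueue, e);
applyBatch(buggyQueue, buggyParts);
// buggyParts.get("p1") → "Hello worldHello world" (the duplication)

const fixedParts = new Map<string, string>();
const fixedQueue: PartEvent[] = [];
for (const e of batch) enqueueFixed(fixedQueue, e);
applyBatch(fixedQueue, fixedParts);
// fixedParts.get("p1") → "Hello world"
```

Dropping the stale deltas is safe only if the `updated` event's text is truly authoritative (i.e., it already folds in every prior delta for that part), which matches the behavior described above where the DB-loaded content renders correctly.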
Environment
- Platform: macOS (desktop app via Tauri), also affects web UI
- Observed on latest dev branch