Description
When using the OpenCode web UI from a mobile browser (Chrome on iOS/Android), the toast "Failed to send prompt" / "请求失败" ("Request failed") frequently appears after sending a prompt. However, the prompt actually reaches the server and the agent runs normally. The toast is a false alarm caused by the streaming response connection breaking before any response body arrives.
This does not occur (or occurs much less frequently) on desktop browsers.
Root Cause Analysis
The POST /session/:sessionID/message endpoint uses Hono's stream() to deliver the response:
```ts
// packages/opencode/src/server/routes/session.ts (around line 730)
async (c) => {
  c.status(200)
  c.header("Content-Type", "application/json")
  return stream(c, async (stream) => {
    const sessionID = c.req.valid("param").sessionID
    const body = c.req.valid("json")
    const msg = await SessionPrompt.prompt({ ...body, sessionID })
    stream.write(JSON.stringify(msg)) // ← only writes AFTER the agent completes
  })
}
```

The problem: `SessionPrompt.prompt()` runs the full agent loop (tool calls, LLM inference, etc.), which can take minutes. During this entire time the HTTP response body contains zero bytes, so the response is effectively a long-lived idle stream.
On mobile browsers, the network stack is more aggressive about connection management (WiFi power saving, TCP keepalive timeouts, OS-level network throttling). When the response body stays empty for too long, the browser tears down the connection.
On the client side (packages/app/src/components/prompt-input.tsx):
```ts
void send().catch((err) => {
  showToast({
    title: "Failed to send prompt", // false alarm!
    description: errorMessage(err), // shows "请求失败", the generic fallback
  })
  removeOptimisticMessage() // removes the user's message from the UI
  restoreInput() // puts the text back in the input box
})
```

The `errorMessage()` function falls through to the generic `language.t("common.requestFailed")` string because the browser-level error (likely a `DOMException` or `TypeError` from the `ReadableStream` teardown) doesn't match any of the expected error shapes.
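To illustrate why the fallback fires, here is a sketch of the kind of check `errorMessage()` would need in order to recognize the browser-level teardown error. This is hypothetical; `isLikelyStreamTeardown` is not part of the current code, and the exact error messages vary by browser:

```ts
// Illustrative only: a killed streaming response typically surfaces as a
// TypeError ("Failed to fetch", "Load failed") or an abort-style DOMException,
// neither of which looks like the structured API errors errorMessage() formats.
function isLikelyStreamTeardown(err: unknown): boolean {
  if (err instanceof DOMException && err.name === "AbortError") return true
  return err instanceof TypeError && /network|load failed|failed to fetch/i.test(err.message)
}
```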
Timeline of failure
1. Phone sends POST /session/:id/message → server receives it ✅
2. Server returns HTTP 200 + empty streaming body
3. SessionPrompt.prompt() starts running (agent working)
4. Response body stays empty for minutes...
5. Mobile browser kills the idle connection
6. Client-side fetch() body reading fails with browser-level error
7. catch() fires → "Failed to send prompt" toast (FALSE ALARM)
8. Server continues processing → agent completes successfully
9. SSE events correct the UI state (messages appear)
Impact
- User confusion: toast says "failed" but the agent is running — users may retry, causing duplicate agent executions and wasted API costs
- UI glitch: `removeOptimisticMessage()` briefly removes the user's message, then SSE events restore it
- Mobile usability: makes the web UI feel unreliable on phones
Environment
- OpenCode version: 1.1.53
- Access method: `opencode web` (tested both over direct LAN and through a reverse proxy, same result)
- Affected: Mobile Chrome (iOS & Android), likely all mobile browsers
- Not affected: Desktop Chrome (more tolerant of long-idle streams)
Steps to Reproduce
- Start `opencode web`
- Open the web UI from a mobile phone browser
- Send a prompt that triggers a long agent run (e.g., multi-step code editing)
- Wait — within 30-120 seconds, the "Failed to send prompt" toast appears
- Observe: the agent is actually running normally despite the error toast
Suggested Fixes
Option A: Switch web UI to prompt_async (Recommended)
The POST /session/:sessionID/prompt_async endpoint already exists and returns 204 immediately. The web UI already uses SSE for real-time updates. Change the client to:
- Call `prompt_async` instead of `prompt` (fire-and-forget)
- Rely on the existing SSE events for progress/completion
This eliminates the long-lived response entirely.
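A rough sketch of what the client-side call could look like under Option A. `baseUrl` and `body` are placeholders, and it assumes `prompt_async` accepts the same JSON body as `prompt`:

```ts
// Fire-and-forget: the server answers 204 immediately, so there is no
// long-lived, mostly-empty response body for the mobile browser to tear down.
async function sendPromptAsync(baseUrl: string, sessionID: string, body: unknown): Promise<void> {
  const res = await fetch(`${baseUrl}/session/${sessionID}/prompt_async`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(body),
  })
  if (res.status !== 204) throw new Error(`prompt_async failed with status ${res.status}`)
  // Progress and the final assistant message arrive through the existing SSE stream.
}
```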
Option B: Add heartbeat bytes to the streaming response
Write a periodic heartbeat (e.g., a space or newline every 30 seconds) to keep the connection alive:
```ts
return stream(c, async (stream) => {
  const heartbeat = setInterval(() => stream.write(" "), 30_000)
  try {
    const msg = await SessionPrompt.prompt({ ...body, sessionID })
    stream.write(JSON.stringify(msg))
  } finally {
    clearInterval(heartbeat)
  }
})
```

Note: this requires the client to handle/strip the heartbeat bytes before JSON parsing.
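On the client, the corresponding read could look roughly like this (a sketch; `baseUrl` and `body` are placeholders). Since the heartbeat is plain whitespace and `JSON.parse` tolerates surrounding whitespace, the trim is mainly defensive:

```ts
const res = await fetch(`${baseUrl}/session/${sessionID}/message`, {
  method: "POST",
  headers: { "Content-Type": "application/json" },
  body: JSON.stringify(body),
})
// The body is zero or more heartbeat spaces followed by the JSON message.
const text = await res.text()
const msg = JSON.parse(text.trim())
```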
Option C: Early acknowledgment
Return `{ accepted: true, messageID }` immediately, then run the agent loop asynchronously.
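A rough server-side sketch of Option C in the same Hono route. Whether `SessionPrompt.prompt()` can take a pre-allocated `messageID`, and how message IDs are actually generated, are assumptions made for illustration:

```ts
async (c) => {
  const sessionID = c.req.valid("param").sessionID
  const body = c.req.valid("json")
  // Placeholder for however message IDs are really allocated.
  const messageID = crypto.randomUUID()
  // Start the agent loop without awaiting it; progress and the final result
  // are already delivered to the UI over SSE.
  void SessionPrompt.prompt({ ...body, sessionID, messageID }).catch(() => {})
  return c.json({ accepted: true, messageID })
}
```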
Related Issues
- #10721: iOS Safari/Firefox session history rolls back after backgrounding until refresh, plus error toasts on resume (related SSE/streaming issues on mobile)
- #10047: Basic Auth breaks CORS support