Problem
backend/src/routes/chat.ts — POST /chat opens an SSE stream and awaits runLLMStream() with no timeout:
res.setHeader("Content-Type", "text/event-stream");
res.flushHeaders();
const { fullText, events } = await runLLMStream({ ... }); // no timeout or AbortSignal
If the upstream LLM API stalls (network partition, provider outage), the SSE connection stays open indefinitely. Under load, a wave of stalled requests can exhaust the process's open-connection limit.
The same applies to POST /projects/:projectId/chat and POST /tabular-review/:reviewId/generate.
Fix
Wrap the LLM call in a timeout using AbortController (e.g., 3 minutes), as in the sketch after this list. On timeout:
- Abort the upstream request.
- Write an error SSE event ({ type: "error", message: "Request timed out" }).
- Write data: [DONE] and end the response.
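A minimal sketch of the wrapper, assuming runLLMStream accepts an AbortSignal via a signal option and forwards it to the upstream HTTP request; the import path and the streamChat name are hypothetical, and the event-forwarding step is elided:

import type { Response } from "express";
// Hypothetical import path; use wherever runLLMStream actually lives.
import { runLLMStream } from "../llm/stream";

const LLM_TIMEOUT_MS = 3 * 60 * 1000; // 3-minute deadline for the upstream call

async function streamChat(res: Response, options: Record<string, unknown>): Promise<void> {
  const controller = new AbortController();
  const timer = setTimeout(() => controller.abort(), LLM_TIMEOUT_MS);

  try {
    // Assumption: runLLMStream threads `signal` through to the upstream LLM
    // request, so aborting actually tears down the stalled connection.
    const { fullText, events } = await runLLMStream({
      ...options,
      signal: controller.signal,
    });
    // ...forward fullText/events to the client exactly as the route does today...
  } catch {
    // Distinguish a deadline abort from other stream failures.
    const message = controller.signal.aborted
      ? "Request timed out"
      : "Stream failed";
    res.write(`data: ${JSON.stringify({ type: "error", message })}\n\n`);
  } finally {
    clearTimeout(timer);
    // Terminate the stream on every path so the connection is released.
    res.write("data: [DONE]\n\n");
    res.end();
  }
}

The finally block guarantees that every outcome, success, timeout, or other failure, ends with data: [DONE] and res.end(), so stalled upstream calls can no longer pin connections open. The same wrapper applies to the other two SSE routes.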