
Add async iterable support to ChatClient.completeStreamingChat()#521

Draft
Copilot wants to merge 2 commits into main from copilot/fix-chatclient-streaming-callbacks

Conversation

Contributor

Copilot AI commented Mar 12, 2026

completeStreamingChat() only accepted a callback, causing confusing runtime errors when developers used the OpenAI SDK's familiar for await pattern. This adds async iterable support while keeping the callback overloads for backward compatibility.

API changes

Two new overloads return AsyncIterable<any> when no callback is passed:

// New: async iterable (consistent with OpenAI SDK)
for await (const chunk of chatClient.completeStreamingChat(messages)) {
  process.stdout.write(chunk.choices?.[0]?.delta?.content ?? '');
}

// New: with tools
for await (const chunk of chatClient.completeStreamingChat(messages, tools)) { ... }

// Existing: callback pattern — unchanged
await chatClient.completeStreamingChat(messages, (chunk) => { ... });
await chatClient.completeStreamingChat(messages, tools, (chunk) => { ... });

Implementation

  • _streamChat() — a private async generator that bridges executeCommandStreaming's native callback to an async iterable via a chunk queue + wakeConsumer notify pattern. Awaiting streamPromise in the finally block ensures native resources are cleaned up even when the consumer exits the iterator early.
  • _completeStreamingChatWithCallback() — extracted from the original implementation; behavior unchanged.
  • Public completeStreamingChat() dispatches to the appropriate implementation based on whether a callback argument is present.
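The queue + wakeConsumer bridge described above can be sketched as follows. This is a minimal standalone sketch: bridgeToAsyncIterable and startStreaming are illustrative names standing in for the SDK's private _streamChat() and executeCommandStreaming, not the actual internals.

```javascript
// Sketch of bridging a callback-based producer to an async iterable.
// The producer pushes chunks into a queue; the consumer drains the queue
// and, when it runs dry, parks on a promise that notify() resolves.
async function* bridgeToAsyncIterable(startStreaming) {
  const queue = [];          // chunks pushed by the producer, not yet consumed
  let done = false;
  let wakeConsumer = null;   // resolver for the consumer's "waiting for data" promise

  const notify = () => {
    if (wakeConsumer) {
      wakeConsumer();
      wakeConsumer = null;
    }
  };

  // startStreaming(onChunk) stands in for the native streaming call: it
  // invokes onChunk per chunk and resolves when the stream finishes.
  const streamPromise = startStreaming((chunk) => {
    queue.push(chunk);
    notify();
  }).then(() => {
    done = true;
    notify();
  });

  try {
    while (true) {
      while (queue.length > 0) yield queue.shift();
      if (done) break;
      await new Promise((resolve) => { wakeConsumer = resolve; });
    }
  } finally {
    // Runs on normal completion and on early iterator exit (break/return),
    // so the native stream is always awaited before resources are released.
    await streamPromise;
  }
}
```

Because JavaScript callbacks only run at await points, the queue needs no locking: the producer can only interleave with the consumer while the consumer is parked on the notify promise or suspended at a yield.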

Other

  • Tests: added async iterable integration test; updated validation tests to cover both patterns.
  • Docs (ChatClient.md) and sample app updated to reflect all four overloads.
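Both test patterns hinge on the callback-presence dispatch. The following is a hypothetical, self-contained sketch of that dispatch: the class shape, argument handling, and chunk contents are invented for illustration; only the method names and the "Callback must be a valid function" error text come from the PR and issue descriptions.

```javascript
// Illustrative sketch only — not the real SDK implementation.
class ChatClientSketch {
  // Public method: dispatches on whether a callback argument is present.
  completeStreamingChat(messages, toolsOrCallback, maybeCallback) {
    const hasTools = typeof toolsOrCallback !== "function" && toolsOrCallback !== undefined;
    const tools = hasTools ? toolsOrCallback : undefined;
    const callback = hasTools ? maybeCallback : toolsOrCallback;

    if (callback === undefined) {
      return this._streamChat(messages, tools);          // async iterable path
    }
    if (typeof callback !== "function") {
      throw new Error("Callback must be a valid function.");
    }
    return this._completeStreamingChatWithCallback(messages, tools, callback);
  }

  // Stand-in for the native stream: yields two fixed chunks.
  async *_streamChat(messages, tools) {
    for (const chunk of ["Hel", "lo"]) yield chunk;
  }

  // Callback path, expressed here in terms of the iterable path.
  async _completeStreamingChatWithCallback(messages, tools, callback) {
    for await (const chunk of this._streamChat(messages, tools)) callback(chunk);
  }
}
```

A validation test for both patterns can then iterate the return value directly, await the callback form, and assert that a non-function callback throws.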

Warning

Firewall rules blocked me from connecting to one or more addresses.

I tried to connect to the following addresses, but was blocked by firewall rules:

  • imzvsblobprodcus368.vsblob.vsassets.io
    • Triggering command: /home/REDACTED/work/_temp/ghcca-node/node/bin/node node script/install.cjs (dns block)

If you need me to access, download, or install something from one of these locations, you can either:

Original prompt

This section details the original issue you should resolve

<issue_title>JavaScript ChatClient Streaming Uses Callbacks, Not Async Iterators</issue_title>
<issue_description>

JavaScript ChatClient Streaming Uses Callbacks, Not Async Iterators

Severity: Documentation gap
Component: foundry-local-sdk JavaScript v0.9.x — ChatClient.completeStreamingChat()

The ChatClient returned by model.createChatClient() provides a completeStreamingChat() method, but it uses a callback pattern rather than returning an async iterable:

// ❌ This does NOT work — throws "stream is not async iterable"
for await (const chunk of chatClient.completeStreamingChat(messages)) { ... }

// ✅ Correct pattern — pass a callback
await chatClient.completeStreamingChat(messages, (chunk) => {
  process.stdout.write(chunk.choices?.[0]?.delta?.content ?? "");
});

Impact: Developers familiar with the OpenAI SDK's async iteration pattern (for await) will encounter confusing errors. The callback must be a valid function or the SDK throws "Callback must be a valid function."

Expected: Document the callback pattern in the SDK reference. Alternatively, support the async iterable pattern for consistency with the OpenAI SDK.

---</issue_description>

<agent_instructions>Can you update the JS SDK to support openai sdk's async iteration pattern in the js sdk instead of the current way of using callbacks?</agent_instructions>

Comments on the Issue (you are @copilot in this section)



@vercel

vercel bot commented Mar 12, 2026

The latest updates on your projects.

Project: foundry-local — Deployment: Ready — Actions: Preview, Comment — Updated (UTC): Mar 12, 2026 4:46pm


Co-authored-by: baijumeswani <12852605+baijumeswani@users.noreply.github.com>
Copilot AI changed the title [WIP] Fix ChatClient streaming to use async iterators Add async iterable support to ChatClient.completeStreamingChat() Mar 12, 2026
Copilot AI requested a review from baijumeswani March 12, 2026 16:47

Development

Successfully merging this pull request may close these issues.

JavaScript ChatClient Streaming Uses Callbacks, Not Async Iterators
