fix: blank assistant text with MCP enabled after AI SDK v6 migration #20467
Open
kkugot wants to merge 1 commit into anomalyco:dev from
Conversation
The v6 upgrade changed the default unmapped finish reason from 'unknown' to 'other', but the prompt loop continue sets were not updated to match. This caused premature loop exit after MCP tool calls, before the model produced final assistant text. Additionally, the LLM middleware still targeted the legacy 'prompt' param key, while v6 Responses requests use 'messages' or 'input', causing ProviderTransform.message() to be silently skipped. Fixes anomalyco#20050 Fixes anomalyco#20465
Issue for this PR
Closes #20465
Type of change
What does this PR do?
Fixes blank assistant text in TUI when MCP servers are enabled, a regression introduced in v1.3.4 by the AI SDK 5 -> 6 upgrade.
Two interacting problems:
1. The v6 finish-reason adapter changed the default unmapped reason from 'unknown' to 'other'. The prompt loop continue sets in prompt.ts had 'unknown' removed, but 'other' was never added as a replacement. This causes the loop to exit after MCP tool calls, before the model produces final text. MCP amplifies this because tool-enabled requests hit this code path far more often.
2. The LLM middleware in llm.ts targeted args.params.prompt for applying ProviderTransform.message(), but v6 Responses requests use 'messages' or 'input'. This meant message normalization was silently skipped on Responses requests.

The fix adds 'other' to both loop continue sets and transforms all three v6 param shapes in the middleware.
How did you verify your code works?
bun test test/session/llm.test.ts — 14 pass
bun typecheck — no new errors
Screenshots / recordings
N/A — not a UI change, the fix restores existing expected behavior.
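For illustration, the two fixes can be sketched as below. This is a hypothetical sketch: the type names, the CONTINUE_REASONS set, and the transformMessages helper are illustrative stand-ins, not the actual prompt.ts / llm.ts code.

```typescript
type FinishReason = "stop" | "length" | "tool-calls" | "error" | "unknown" | "other"

// Fix 1: AI SDK v6 maps unrecognized provider finish reasons to 'other'
// (v5 used 'unknown'), so both values must keep the prompt loop running.
const CONTINUE_REASONS = new Set<FinishReason>(["tool-calls", "unknown", "other"])

function shouldContinue(reason: FinishReason): boolean {
  return CONTINUE_REASONS.has(reason)
}

// Fix 2: apply the message transform to every v6 param shape, not just
// the legacy 'prompt' key, so Responses requests are no longer skipped.
type Msg = { role: string; content: unknown }
type Params = { prompt?: Msg[]; messages?: Msg[]; input?: Msg[] }

function transformMessages(params: Params, transform: (m: Msg) => Msg): Params {
  for (const key of ["prompt", "messages", "input"] as const) {
    const msgs = params[key]
    if (Array.isArray(msgs)) params[key] = msgs.map(transform)
  }
  return params
}
```

The key point is that both the continue-set membership and the param-key iteration have to cover the v6 values and shapes, otherwise the loop ends early or the transform never runs.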
Checklist