Conversation

@mschristensen (Contributor)

Description

Adds a page to the Messaging section that describes sending tool calls and results to users over channels, and shows how this can be used to build generative user interfaces or implement human-in-the-loop workflows.
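
Not part of the page content itself, but as a rough sketch of the pattern it describes: publishing a tool call and its result to a channel so a subscribed client can render them. The channel name, event names and payload shapes below are illustrative assumptions, not the page's actual examples.

```typescript
import * as Ably from 'ably';

async function publishToolInteraction() {
  // Illustrative only: channel name, event names and payload shapes are
  // assumptions for this sketch, not the examples used on the new page.
  const realtime = new Ably.Realtime({ key: 'YOUR_ABLY_API_KEY' });
  const channel = realtime.channels.get('chat:session-123');

  // Publish the tool call as the model produces it, so subscribed clients can
  // render it, e.g. as a generative UI component or an approval prompt.
  await channel.publish('tool-call', {
    toolCallId: 'call_abc123',
    name: 'get_weather',
    arguments: { city: 'Paris' },
  });

  // Once the tool has run (or the user has approved it), publish the result so
  // those clients can update the same UI element in place.
  await channel.publish('tool-result', {
    toolCallId: 'call_abc123',
    result: { temperatureC: 18, conditions: 'cloudy' },
  });
}
```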

coderabbitai bot commented Jan 13, 2026

Review skipped: auto reviews are disabled on this repository.

Please check the settings in the CodeRabbit UI or the .coderabbit.yaml file in this repository. To trigger a single review, invoke the @coderabbitai review command.

You can disable this status message by setting the reviews.review_status to false in the CodeRabbit configuration file.


Tool call arguments can be streamed token by token as they are generated by the model. When implementing token-level streaming, your UI should handle parsing partial JSON gracefully to render realtime updates as the arguments stream in. To learn more about approaches to token streaming, see the [token streaming](/docs/ai-transport/features/token-streaming) documentation.
</Aside>

## Human-in-the-loop workflows <a id="human-in-the-loop"/>
Member

You mention HITL, but what about other tool calls that are invoked client-side? E.g. to get location, read or send texts on a mobile, upload photos, etc.

</Code>

<Aside data-type="note">
Tool call arguments can be streamed token by token as they are generated by the model. When implementing token-level streaming, your UI should handle parsing partial JSON gracefully to render realtime updates as the arguments stream in. To learn more about approaches to token streaming, see the [token streaming](/docs/ai-transport/features/token-streaming) documentation.
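
For illustration, a minimal TypeScript sketch of the partial-JSON handling the aside above recommends. The `ToolCallDelta` shape and the `onToolCallDelta` wiring are assumptions made for the example, not the documented API; the idea is to repair the buffered prefix just enough for `JSON.parse` to succeed, then re-render on every delta.

```typescript
// Assumed shape for an incremental update; not the documented message format.
type ToolCallDelta = { toolCallId: string; argumentTokens: string };

// Best-effort parse of a JSON prefix: close any unterminated string, array or
// object so JSON.parse can succeed, and give up (return undefined) otherwise.
function parsePartialJson(text: string): unknown {
  const closers: string[] = [];
  let inString = false;
  let escaped = false;
  for (const ch of text) {
    if (inString) {
      if (escaped) escaped = false;
      else if (ch === '\\') escaped = true;
      else if (ch === '"') inString = false;
      continue;
    }
    if (ch === '"') inString = true;
    else if (ch === '{') closers.push('}');
    else if (ch === '[') closers.push(']');
    else if (ch === '}' || ch === ']') closers.pop();
  }
  let repaired = text;
  if (escaped) repaired = repaired.slice(0, -1); // drop a dangling backslash
  if (inString) repaired += '"';
  repaired += closers.reverse().join('');
  try {
    return JSON.parse(repaired);
  } catch {
    return undefined; // prefix not parseable yet; keep the previous render
  }
}

// Accumulate tokens per tool call and re-render whenever the prefix parses.
const buffers = new Map<string, string>();

function onToolCallDelta(delta: ToolCallDelta, render: (args: unknown) => void) {
  const buffered = (buffers.get(delta.toolCallId) ?? '') + delta.argumentTokens;
  buffers.set(delta.toolCallId, buffered);
  const args = parsePartialJson(buffered);
  if (args !== undefined) render(args);
}
```

A fuller implementation would also need to cope with numbers and literals cut mid-token (e.g. `tru`), which this sketch simply skips rendering until they complete.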
Member

What's the value in streaming the arguments?

Labels

review-app: Create a Heroku review app
