Conversation

@nmaguiar (Collaborator)

This pull request refines how conversation messages and model responses are processed in js/owrap.ai.js to improve handling of message parts and ensure correct conversation history management. The main changes focus on extracting only relevant text from message parts, preventing mutation of original data, and correctly appending new user prompts and model responses to the conversation.

Conversation message handling improvements:

  • Modified the getConversation method to clone conversation entries before processing, preventing mutation of the original conversation history. It now extracts only the text from message parts, skipping non-text parts such as function calls or responses. ([js/owrap.ai.jsL519-R537](https://github.com/OpenAF/openaf/pull/1631/files#diff-184d2cbefa7d7f91b25bf24fe2d80ee95cbe8ed9824a9c6582bfb72d4221d165L519-R537))
  • Updated response text extraction logic to filter out non-text parts, ensuring only meaningful content is returned in places where model responses are processed. ([[1]](https://github.com/OpenAF/openaf/pull/1631/files#diff-184d2cbefa7d7f91b25bf24fe2d80ee95cbe8ed9824a9c6582bfb72d4221d165L579-R590), [[2]](https://github.com/OpenAF/openaf/pull/1631/files#diff-184d2cbefa7d7f91b25bf24fe2d80ee95cbe8ed9824a9c6582bfb72d4221d165L625-R638))
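A minimal sketch of the clone-and-filter approach described above. The entry shape and function name are illustrative, not the actual owrap.ai.js code, and `isDef` (an OpenAF helper) is redefined here so the snippet is self-contained:

```javascript
// OpenAF's isDef helper, redefined so this sketch runs standalone
const isDef = v => typeof v !== "undefined" && v !== null

// Hypothetical helper: clone an entry, keep only text parts, build a content string
function extractTextContent(entry) {
  // Clone first so the original conversation history is never mutated
  const _entry = JSON.parse(JSON.stringify(entry))
  // Keep only parts that actually carry text (skip functionCall/functionResponse)
  const textParts = (_entry.parts || []).filter(p => isDef(p) && isDef(p.text))
  _entry.content = textParts.map(p => p.text).join("\n")
  delete _entry.parts
  return _entry
}

const original = {
  role : "model",
  parts: [
    { text: "Looking that up." },
    { functionCall: { name: "getWeather", args: { city: "Lisbon" } } },
    { text: "Done." }
  ]
}
const cleaned = extractTextContent(original)
console.log(cleaned.content)        // "Looking that up.\nDone."
console.log(original.parts.length)  // 3 — the original entry is untouched
```

The `JSON.parse(JSON.stringify(...))` clone is the simplest way to guarantee the filter never touches the caller's history; a structured clone would work equally well.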

Conversation history management:

  • Refactored prompt merging logic to preserve new user prompts separately before merging with conversation history, ensuring they are correctly added when function calls occur. ([js/owrap.ai.jsR667-R675](https://github.com/OpenAF/openaf/pull/1631/files#diff-184d2cbefa7d7f91b25bf24fe2d80ee95cbe8ed9824a9c6582bfb72d4221d165R667-R675))
  • Adjusted how new user prompts and model responses are appended to the conversation, ensuring the conversation starts with a user turn and maintains proper order when handling function calls or responses. ([js/owrap.ai.jsL767-R790](https://github.com/OpenAF/openaf/pull/1631/files#diff-184d2cbefa7d7f91b25bf24fe2d80ee95cbe8ed9824a9c6582bfb72d4221d165L767-R790))
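The history-management steps above can be sketched roughly as follows; the function name, entry shape, and error handling are illustrative assumptions, not the actual owrap.ai.js implementation:

```javascript
// Hedged sketch: append new user prompts and the model response to a copy of
// the history, keeping the new prompts separate so they survive a
// function-call round trip and the conversation still starts with a user turn.
function appendToConversation(history, newUserPrompts, modelResponseText) {
  const conv = history.slice()  // shallow copy; don't mutate the caller's history
  // Preserve the new user prompts before merging
  for (const text of newUserPrompts) {
    conv.push({ role: "user", parts: [{ text }] })
  }
  // A Gemini-style conversation must begin with a user turn
  if (conv.length === 0 || conv[0].role !== "user") {
    throw new Error("conversation must start with a user turn")
  }
  conv.push({ role: "model", parts: [{ text: modelResponseText }] })
  return conv
}

const updated = appendToConversation([], ["What's the weather?"], "It's sunny.")
// updated: a user turn followed by a model turn
```

Keeping the new prompts in their own array until the merge step is what prevents them from being lost when a function-call exchange is spliced into the history first.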

…ct text from parts, and preserve new user prompts in conversation history
@nmaguiar nmaguiar requested a review from Copilot January 16, 2026 04:59
@nmaguiar nmaguiar self-assigned this Jan 16, 2026
@nmaguiar nmaguiar merged commit 408408d into master Jan 16, 2026
5 checks passed
@github-project-automation github-project-automation bot moved this from Backlog to Done in Continuous Enhancement Jan 16, 2026
Copilot AI (Contributor) left a comment:
Pull request overview

This PR improves conversation handling in the Gemini AI integration by preventing mutation of the original conversation history and ensuring only text content is extracted from message parts. The changes address issues with how conversation entries are cloned, how text is filtered from multi-part messages, and how new user prompts are preserved when function calls occur.

Changes:

  • Clone conversation entries before processing to avoid mutating the original data
  • Filter message parts to extract only text content, skipping function calls and responses
  • Preserve new user prompts separately to ensure they're correctly added to conversation history during function call handling


```javascript
// Only extract text from parts that have text (skip functionCall, functionResponse, etc.)
var textParts = _entry.parts.filter(p => isDef(p) && isDef(p.text))
if (textParts.length > 0) {
    _entry.content = textParts.reduce((aC, aV) => aC + "\n" + aV.text, "")
```
Copilot AI commented on Jan 16, 2026:
The reduce operation always prepends a newline, resulting in leading whitespace when concatenating text parts. Initialize the accumulator with the first element's text or use join('\n') instead.

Suggested change:

```diff
-_entry.content = textParts.reduce((aC, aV) => aC + "\n" + aV.text, "")
+_entry.content = textParts.map(p => p.text).join("\n")
```
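To see why the suggestion matters, a quick standalone comparison with illustrative values:

```javascript
const textParts = [{ text: "alpha" }, { text: "beta" }]

// reduce with an empty-string seed prepends "\n" before the first element:
const withReduce = textParts.reduce((aC, aV) => aC + "\n" + aV.text, "")
console.log(JSON.stringify(withReduce))  // "\nalpha\nbeta" — note the leading newline

// join only inserts separators between elements:
const withJoin = textParts.map(p => p.text).join("\n")
console.log(JSON.stringify(withJoin))    // "alpha\nbeta"
```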

```diff
-return __r.candidates[0].content.parts.reduce((aC, aV) => aC + "\n" + aV.text, "")
+// Only extract text from parts that have text (skip functionCall, functionResponse, etc.)
+var textParts = __r.candidates[0].content.parts.filter(p => isDef(p) && isDef(p.text))
+return textParts.reduce((aC, aV) => aC + "\n" + aV.text, "")
```
Copilot AI commented on Jan 16, 2026:
The reduce operation always prepends a newline, resulting in leading whitespace when concatenating text parts. Initialize the accumulator with the first element's text or use join('\n') instead.

```diff
-return __r.candidates[0].content.parts.reduce((aC, aV) => aC + "\n" + aV.text, "")
+// Only extract text from parts that have text (skip functionCall, functionResponse, etc.)
+var textParts = __r.candidates[0].content.parts.filter(p => isDef(p) && isDef(p.text))
+return textParts.reduce((aC, aV) => aC + "\n" + aV.text, "")
```
Copilot AI commented on Jan 16, 2026:
The reduce operation always prepends a newline, resulting in leading whitespace when concatenating text parts. Initialize the accumulator with the first element's text or use join('\n') instead.
