
Feature Request: Setting to preserve Reasoning Content in WebUI #19449

@MaxKruse

Description

Prerequisites

  • I am running the latest code. Mention the version if possible as well.
  • I carefully followed the README.md.
  • I searched using keywords relevant to my issue to make sure that I am creating a new issue that is not already open (or closed).
  • I reviewed the Discussions, and have a new and useful enhancement to share.

Feature Description

Running on self-compiled build: 7972 (e06088d)

In the WebUI, when sending a message in a chat that has reasoning content, that reasoning content is NOT sent along to the LLM (where Jinja templates may or may not strip previous thinking blocks). A setting of some sort to send the reasoning content along would be useful for models whose templates offer that choice (e.g. GLM4.7 Flash).

Motivation

Whenever I send a few messages to GLM4.7 Flash in the WebUI, it does its usual reasoning and then outputs the content. However, every new user turn purges that reasoning content from the context (by simply not including it in the payload), making the model "forget" its previous reasoning. Even if not directly helpful in all cases, an option to A/B test this at the user level wouldn't hurt.

Possible Implementation

From the little I have dug in, the conversion from DatabaseMessages to APIMessage payloads only considers content, not reasoning content.
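A minimal sketch of the idea, assuming hypothetical message shapes (the actual WebUI types and field names may differ): the conversion keeps `reasoning_content` on the outgoing payload only when a user-level setting is enabled.

```typescript
// Hypothetical shapes; the real DatabaseMessage/APIMessage types may differ.
interface StoredMessage {
  role: string;
  content: string;
  reasoning_content?: string;
}

interface ApiMessage {
  role: string;
  content: string;
  reasoning_content?: string;
}

// Convert stored chat messages to the API payload. When
// includeReasoning is true, previous reasoning blocks are
// forwarded so the chat template can decide what to do with them.
function toApiMessages(
  messages: StoredMessage[],
  includeReasoning: boolean
): ApiMessage[] {
  return messages.map((m) => {
    const out: ApiMessage = { role: m.role, content: m.content };
    if (includeReasoning && m.reasoning_content !== undefined) {
      out.reasoning_content = m.reasoning_content;
    }
    return out;
  });
}
```

Gating this on a setting (default off, matching current behavior) would allow the A/B testing described above without changing anything for existing users.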

Metadata

Labels

enhancement (New feature or request)
