
OpenAI Vllm temperature param fix #124

Merged

vizsatiz merged 1 commit into develop from fix/vllm_openai_temperature on Aug 26, 2025

Conversation

@vizsatiz
Member

No description provided.
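The PR itself carries no description, but its title and a later commit message in this thread ("fix: to make sure that temperature is used") point at a common parameter-forwarding bug: a truthiness check that silently drops `temperature=0.0` when building the request for an OpenAI-compatible endpoint such as vLLM's. The sketch below illustrates that pattern and its fix; it is an assumption about the nature of the change, and `build_params` is a hypothetical helper, not code from this repository.

```python
def build_params(model: str, temperature=None) -> dict:
    """Assemble request kwargs for an OpenAI-compatible (e.g. vLLM) endpoint.

    Hypothetical helper illustrating the suspected fix: a buggy variant
    would read `if temperature:` and silently drop temperature=0.0.
    """
    params = {"model": model}
    if temperature is not None:  # only omit when truly unset
        params["temperature"] = temperature
    return params


print(build_params("phi-4", temperature=0.0))
# {'model': 'phi-4', 'temperature': 0.0}
```

With the `is not None` check, an explicit `temperature=0.0` (fully greedy sampling) reaches the server instead of falling back to the model's default.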

@vizsatiz vizsatiz merged commit 8f69ee8 into develop Aug 26, 2025
4 checks passed
@vizsatiz vizsatiz deleted the fix/vllm_openai_temperature branch August 26, 2025 15:35
vizsatiz added a commit that referenced this pull request Aug 26, 2025
OpenAI Vllm temperature param fix (#124)
vizsatiz added a commit that referenced this pull request Aug 31, 2025
* Updating 1.0.0 version

* Patterns: Reflection & Plan-Execute Pattern (#120)

* Adding support for Reflection Pattern

* Plan&Execute router

* Studio: Visual Designer to create AI agents (#121)

* Arium yaml changes

* Fix README formatting

* Studio first commit

* Custom agent builder

* Basic multi agent routing

* Fix studio and run the first agent created through it

* Add studio details in README

* Adding flo studio image

* Plan execute pattern on UI (#122)

* Implement plan and execute pattern in clean way

* Plan and execute and workflow templates on UI

* Start and end node config

* Basic UI clean up

* OpenAI Vllm temperature param fix (#124)

* fix: response_formatter doesn't work well in vllm + phi4 (#127)

* Fix: minor bugs and code clean up (#129)

* fix: to make sure that temperature is used

* add more unit tests for llms

* Updating to latest version
thomastomy5 pushed a commit that referenced this pull request Apr 27, 2026

Labels

None yet

Projects

None yet

1 participant