
Fixed response schema for vllm #116

Merged

vizsatiz merged 7 commits into develop from feat_openai_vllm on Aug 8, 2025

Conversation

@rootflo-hardik
Contributor

No description provided.

@rootflo-hardik rootflo-hardik requested a review from vizsatiz August 7, 2025 13:00
@vizsatiz vizsatiz merged commit d0c9f95 into develop Aug 8, 2025
3 checks passed
@vizsatiz vizsatiz deleted the feat_openai_vllm branch August 8, 2025 06:18
thomastomy5 pushed a commit that referenced this pull request Apr 27, 2026
* vllm integration

- added vllm openai integration
- added vllm agent usage example
- fixed retry_count bug

* made openai_vllm required

* exporting ImageMessage

* added OpenAIVLLM to flo_ai init

* fixed response_format for openai and vllmopenai

* examples -> output_schema 'name' changed to 'title'
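The commits above mention fixing `response_format` for the OpenAI and vLLM integrations and renaming the examples' `output_schema` key from `'name'` to `'title'` (the standard JSON Schema annotation keyword). As a rough, hypothetical sketch of what such a structured-output request to a vLLM OpenAI-compatible server might look like — the schema contents, model name, and `weather_report` label here are illustrative assumptions, not taken from this pull request:

```python
import json

# Hypothetical output schema using the standard JSON Schema "title"
# keyword (the PR's examples switched from 'name' to 'title').
output_schema = {
    "title": "WeatherReport",  # JSON Schema annotation, not OpenAI's "name"
    "type": "object",
    "properties": {
        "city": {"type": "string"},
        "temperature_c": {"type": "number"},
    },
    "required": ["city", "temperature_c"],
}

# OpenAI-style chat-completion request body with a response_format
# constraint, as accepted by an OpenAI-compatible endpoint such as
# the one vLLM serves. Model name is a placeholder assumption.
request_body = {
    "model": "meta-llama/Llama-3.1-8B-Instruct",
    "messages": [
        {"role": "user", "content": "Report the weather in Paris as JSON."}
    ],
    "response_format": {
        "type": "json_schema",
        # The OpenAI wire format keeps its own "name" field for the
        # schema wrapper; the JSON Schema body itself uses "title".
        "json_schema": {"name": "weather_report", "schema": output_schema},
    },
}

print(json.dumps(request_body["response_format"], indent=2))
```

This only builds the request payload; actually sending it would go through an OpenAI-compatible client pointed at the vLLM server's base URL.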
