added rootflo_llm for llm proxy calls #148

Merged
vizsatiz merged 4 commits into develop from llm_proxy_call on Oct 29, 2025

Conversation

@rootflo-hardik
Contributor

  • added an LLMProvider enum; RootFloLLM now takes model_id, base_url, llm_model, llm_provider, and api_token as parameters
  • initialized the LLM according to llm_provider
  • modified the generate and stream signatures of gemini_llm, openai_llm, and anthropic_llm for consistency
  • modified gemini_llm client initialization so base_url is passed properly alongside the Authorization header
  • passed kwargs after self.kwargs in openai_llm so call-site arguments override instance defaults
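The parameter and dispatch changes above can be sketched roughly as follows. This is a hedged illustration, not the actual flo_ai implementation: the `build_client_config` helper and the provider class names it returns (`OpenAILLM`, `AnthropicLLM`, `GeminiLLM`) are stand-ins; only the `LLMProvider` enum and the `RootFloLLM` parameter names come from the PR description.

```python
from dataclasses import dataclass
from enum import Enum
from typing import Optional


class LLMProvider(Enum):
    OPENAI = 'openai'
    ANTHROPIC = 'anthropic'
    GEMINI = 'gemini'


@dataclass
class RootFloLLM:
    # Parameter names taken from the PR description.
    model_id: str
    llm_model: str
    llm_provider: LLMProvider
    api_token: str
    base_url: Optional[str] = None

    def build_client_config(self) -> dict:
        # Dispatch on llm_provider; a real implementation would construct
        # the matching provider client here instead of returning a dict.
        if self.llm_provider is LLMProvider.OPENAI:
            cls = 'OpenAILLM'
        elif self.llm_provider is LLMProvider.ANTHROPIC:
            cls = 'AnthropicLLM'
        elif self.llm_provider is LLMProvider.GEMINI:
            cls = 'GeminiLLM'
        else:
            raise ValueError(f'Unsupported provider: {self.llm_provider}')
        return {
            'client': cls,
            'model': self.llm_model,
            'base_url': self.base_url,
            'api_token': self.api_token,
        }


proxy_llm = RootFloLLM(
    model_id='gpt-4o',
    llm_model='gpt-4o',
    llm_provider=LLMProvider.OPENAI,
    api_token='sk-example',
    base_url='https://proxy.example.com/v1',
)
```

Routing every provider through one `RootFloLLM` entry point with an optional `base_url` is what makes proxying possible: the same config works against the vendor API or a rootflo proxy in front of it.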

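The kwargs ordering mentioned in the last bullet matters because in a Python dict merge later keys win, so unpacking call-site kwargs after `self.kwargs` lets per-call arguments override instance defaults. A minimal sketch, where `OpenAILLMSketch` is a hypothetical stand-in for `openai_llm`:

```python
class OpenAILLMSketch:
    # Hypothetical stand-in for flo_ai's openai_llm class.
    def __init__(self, **kwargs):
        self.kwargs = kwargs  # instance-level defaults

    def generate(self, prompt: str, **kwargs) -> dict:
        # Unpacking call-site kwargs *after* self.kwargs means the
        # per-call values win when the same key appears in both.
        params = {**self.kwargs, **kwargs}
        return {'prompt': prompt, **params}


llm = OpenAILLMSketch(temperature=0.2, max_tokens=256)
result = llm.generate('hello', temperature=0.9)
# result['temperature'] is 0.9 (call-site override), max_tokens stays 256
```

With the opposite ordering, instance defaults would silently clobber anything passed at the call site.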
Comment thread: flo_ai/flo_ai/llm/anthropic_llm.py
- issuer and audience are parameters as well
- a custom X-Rootflo-Key header is also passed
- added the jwt and cryptography packages
# Conflicts:
#	flo_ai/poetry.lock
#	flo_ai/pyproject.toml
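The token flow from the "have app_key, app_secret to create token" commit can be sketched as follows. This is an assumption-laden illustration: `create_token` and `auth_headers` are hypothetical helper names, and the stdlib HS256 signing below stands in for whatever the PR actually does with the jwt and cryptography packages (which likely use `jwt.encode` and possibly asymmetric keys). Only the issuer/audience parameters and the X-Rootflo-Key header name come from the thread.

```python
import base64
import hashlib
import hmac
import json
import time


def _b64url(data: bytes) -> str:
    # JWT segments are base64url-encoded without padding.
    return base64.urlsafe_b64encode(data).rstrip(b'=').decode()


def create_token(app_key: str, app_secret: str, issuer: str, audience: str) -> str:
    # Minimal HS256 JWT built with the stdlib for illustration; with the
    # PyJWT package this collapses to jwt.encode(payload, app_secret,
    # algorithm='HS256').
    header = {'alg': 'HS256', 'typ': 'JWT'}
    payload = {
        'iss': issuer,          # issuer, a parameter per the thread
        'aud': audience,        # audience, a parameter per the thread
        'sub': app_key,
        'iat': int(time.time()),
    }
    signing_input = (
        f'{_b64url(json.dumps(header).encode())}.'
        f'{_b64url(json.dumps(payload).encode())}'
    )
    sig = hmac.new(app_secret.encode(), signing_input.encode(), hashlib.sha256).digest()
    return f'{signing_input}.{_b64url(sig)}'


def auth_headers(app_key: str, app_secret: str, issuer: str, audience: str) -> dict:
    # The proxy receives the JWT as a bearer token plus the custom
    # X-Rootflo-Key header (header name taken from the thread).
    token = create_token(app_key, app_secret, issuer, audience)
    return {
        'Authorization': f'Bearer {token}',
        'X-Rootflo-Key': app_key,
    }
```

A proxy can then verify the signature with the shared secret and check the `iss`/`aud` claims before forwarding the LLM call.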
@vizsatiz vizsatiz merged commit 2357094 into develop Oct 29, 2025
5 checks passed
@vizsatiz vizsatiz deleted the llm_proxy_call branch October 29, 2025 07:14
thomastomy5 pushed a commit that referenced this pull request Apr 27, 2026
* added rootflo_llm for llm proxy calls

- added an LLMProvider enum; RootFloLLM now takes model_id, base_url, llm_model, llm_provider, and api_token as parameters
- initialized the LLM according to llm_provider
- modified the generate and stream signatures of gemini_llm, openai_llm, and anthropic_llm for consistency
- modified gemini_llm client initialization so base_url is passed properly alongside the Authorization header
- passed kwargs after self.kwargs in openai_llm

* gemini_llm client initialization fix

* have app_key, app_secret to create token

- issuer and audience are parameters as well
- a custom X-Rootflo-Key header is also passed
- added the jwt and cryptography packages
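The "gemini_llm client initialization fix" in this commit message can be illustrated roughly as follows. `GeminiLLMSketch`, the default base URL, and the `client_config` dict are assumptions for illustration, not the real flo_ai code; the point shown is forwarding `base_url` to the underlying client together with an Authorization header so calls can route through a proxy.

```python
from typing import Optional


class GeminiLLMSketch:
    # Hypothetical stand-in for gemini_llm's client setup; the real class
    # wraps a provider SDK client rather than storing a plain dict.
    def __init__(
        self,
        model: str,
        api_token: str,
        base_url: Optional[str] = None,
        **kwargs,
    ):
        # Assumed default endpoint; override with base_url to go
        # through the rootflo proxy instead.
        default_base = 'https://generativelanguage.googleapis.com'
        self.model = model
        self.client_config = {
            'base_url': base_url or default_base,
            # The fix pairs the custom base_url with an Authorization
            # header so the proxy can authenticate the caller.
            'headers': {'Authorization': f'Bearer {api_token}'},
        }
        self.kwargs = kwargs


llm = GeminiLLMSketch(
    'gemini-1.5-pro', 'tok-123', base_url='https://proxy.example.com'
)
```

Without forwarding `base_url` at client construction time, the SDK would keep hitting the vendor endpoint and the proxy would never see the request.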
