
rootflo_llm as llm provider #163

Merged
vizsatiz merged 7 commits into develop from feat_rootflo_llm_yaml on Nov 18, 2025
Conversation

@rootflo-hardik (Contributor)

No description provided.

vishnurk6247 previously approved these changes Nov 12, 2025
yaml_str: str,
tools: Optional[List[Tool]] = None,
base_llm: Optional[BaseLLM] = None,
tool_registry: Optional[Dict[str, Tool]] = None,
Member

kwargs

base_url: The base URL of the proxy server
model_id: The model identifier
llm_provider: Type of LLM SDK to use (LLMProvider enum)
model_id: The model identifier (config_id)
Member

Can you add support for taking access_token directly? This will be useful when we make direct calls from authenticated users.
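One way this request could be sketched. Everything here is an assumption for illustration: the `RootfloLLM` class name, the `access_token` parameter, the `_resolve_token`/`_exchange_credentials` helpers, and the env-var fallbacks are hypothetical, not the merged API.

```python
import os
from typing import Optional


class RootfloLLM:
    """Hypothetical sketch: accept an access_token directly, falling back
    to the app_key/app_secret exchange only when no token is provided."""

    def __init__(
        self,
        base_url: Optional[str] = None,
        access_token: Optional[str] = None,
        app_key: Optional[str] = None,
        app_secret: Optional[str] = None,
    ):
        self.base_url = base_url or os.getenv('ROOTFLO_BASE_URL')
        self.access_token = access_token
        self.app_key = app_key or os.getenv('ROOTFLO_APP_KEY')
        self.app_secret = app_secret or os.getenv('ROOTFLO_APP_SECRET')

    def _resolve_token(self) -> str:
        # A directly supplied token (e.g. from an authenticated user)
        # takes precedence over the app_key/app_secret exchange.
        if self.access_token:
            return self.access_token
        return self._exchange_credentials()

    def _exchange_credentials(self) -> str:
        # Placeholder for the real app_key/app_secret token exchange.
        return f'token-for-{self.app_key}'
```

With this shape, a caller holding a user token skips the credential exchange entirely.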

Member

Also, throw an exception at init if required settings are missing.
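A minimal sketch of that fail-fast check, assuming the parameter and env-var names quoted elsewhere in this thread (`base_url`, `app_key`, `app_secret`, the access-token flow, ROOTFLO_* variables); the helper itself is hypothetical:

```python
def validate_rootflo_config(base_url, app_key, app_secret, access_token=None):
    """Raise ValueError at construction time if required settings are
    missing, instead of failing later on the first request."""
    if access_token:
        # Access-token flow only needs base_url and app_key.
        required = {'base_url': base_url, 'app_key': app_key}
    else:
        required = {
            'base_url': base_url,
            'app_key': app_key,
            'app_secret': app_secret,
        }
    missing = [name for name, value in required.items() if not value]
    if missing:
        raise ValueError(
            f'Missing required Rootflo settings: {", ".join(missing)}. '
            'These can be provided via kwargs or environment variables '
            '(ROOTFLO_BASE_URL, ROOTFLO_APP_KEY, ROOTFLO_APP_SECRET).'
        )
```

Failing at init surfaces a misconfiguration immediately rather than mid-conversation.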

Comment thread flo_ai/flo_ai/arium/builder.py Outdated
)

# if access_token is not provided
if not access_token:
Member

Can you move this to a method and use it both here and in arium?
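The extraction being asked for could look like the following sketch: one shared helper for the token fallback, called from both builders. The function name and the ROOTFLO_ACCESS_TOKEN env var are assumptions, not the merged code.

```python
import os
from typing import Optional


def resolve_access_token(access_token: Optional[str] = None) -> Optional[str]:
    """Single home for the 'if access_token is not provided' fallback,
    so the arium builder and the agent builder share it instead of
    duplicating the check."""
    if not access_token:
        # Hypothetical env-var fallback when no token is passed in.
        access_token = os.getenv('ROOTFLO_ACCESS_TOKEN')
    return access_token
```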

Comment thread flo_ai/flo_ai/arium/builder.py Outdated
f'These can be provided via kwargs or environment variables (ROOTFLO_BASE_URL, ROOTFLO_APP_KEY, ROOTFLO_APP_SECRET, ROOTFLO_ISSUER, ROOTFLO_AUDIENCE).'
)
else:
if not all([base_url, app_key]):
Member

Even this, I guess.

Comment thread flo_ai/flo_ai/builder/agent_builder.py Outdated
else:
raise ValueError(f'Unsupported model provider: {provider}')
if not model_name:
raise ValueError(
Member

Even the model-choosing code can be made common.
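A sketch of what making it common could mean: replace each builder's if/elif provider ladder with one shared factory. The provider names, dict-based stand-in constructors, and `create_llm` are hypothetical; only the 'Unsupported model provider' and missing-model errors mirror the quoted diff.

```python
from typing import Callable, Dict


def _make_openai(model_name: str):
    # Stand-in for constructing the real OpenAI-backed LLM class.
    return {'provider': 'openai', 'model': model_name}


def _make_anthropic(model_name: str):
    # Stand-in for constructing the real Anthropic-backed LLM class.
    return {'provider': 'anthropic', 'model': model_name}


_PROVIDERS: Dict[str, Callable] = {
    'openai': _make_openai,
    'anthropic': _make_anthropic,
}


def create_llm(provider: str, model_name: str):
    """One shared entry point: both arium and agent_builder call this
    instead of each keeping its own provider if/elif chain."""
    if not model_name:
        raise ValueError('model_name is required')
    try:
        factory = _PROVIDERS[provider]
    except KeyError:
        raise ValueError(f'Unsupported model provider: {provider}')
    return factory(model_name)
```

Adding a provider then means registering one entry in `_PROVIDERS` rather than editing two builders.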

# Access token flow - only needs base_url and app_key
required_params = {
'base_url': base_url,
'app_key': app_key,
Member

What is this app_key?

@vizsatiz vizsatiz merged commit 703ccf9 into develop Nov 18, 2025
5 checks passed
@vizsatiz vizsatiz deleted the feat_rootflo_llm_yaml branch November 18, 2025 09:33
thomastomy5 pushed a commit that referenced this pull request Apr 27, 2026
* rootflo_llm as llm provider

* access_token option

* utils -> llm factory

- using llm factory for llm instance creation in both arium and agent builder

* fix for failing tests

* circular dependency fix for failing tests

* ImageMessage -> ImageMessageContent

3 participants