feat: Add a Unified Model Providers configuration #10565
Conversation
Important: Review skipped. Auto incremental reviews are disabled on this repository. Please check the settings in the CodeRabbit UI. You can disable this status message in the CodeRabbit configuration.

Walkthrough

Introduces new v1 API endpoints for model management and provider discovery, integrating with a DatabaseVariableService to track user-specific model enablement and defaults. Concurrently, updates 17 starter project JSON templates to replace provider-specific language model inputs with a unified, dynamic ModelInput-based configuration, and removes re-exported OpenAI model constants.

Changes
Sequence Diagram(s)

sequenceDiagram
actor User
participant Client
participant ModelsAPI as /api/v1/models
participant VariableService as DatabaseVariableService
participant UnifiedModels as Unified Models Catalog
User->>Client: Request list of enabled models for language type
Client->>ModelsAPI: GET /models/enabled?model_names=...
ModelsAPI->>UnifiedModels: get_unified_models_detailed()
UnifiedModels-->>ModelsAPI: Model metadata (provider, deprecated, etc.)
ModelsAPI->>VariableService: Fetch user disabled-models list
VariableService-->>ModelsAPI: Disabled model IDs
ModelsAPI->>ModelsAPI: Filter by provider status + disabled list
ModelsAPI-->>Client: {provider: {model_id: enabled}}
User->>Client: Enable/disable specific model
Client->>ModelsAPI: POST /models/enabled (ModelStatusUpdate[])
ModelsAPI->>VariableService: Update disabled-models variable
VariableService-->>ModelsAPI: ✓ Persisted
ModelsAPI-->>Client: Success
User->>Client: Set default language model
Client->>ModelsAPI: POST /models/default (DefaultModelRequest)
ModelsAPI->>VariableService: Store default-model JSON
VariableService-->>ModelsAPI: ✓ Persisted
ModelsAPI-->>Client: {provider, model_name, model_type}
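The read path in the diagram reduces to a small filtering step. A minimal sketch in plain Python — the catalog shape, field names, and the `filter_enabled_models` helper are assumptions for illustration, not the actual implementation:

```python
# Hedged sketch of the GET /models/enabled filtering described above.
# The catalog and disabled-list shapes are assumed, not taken from the PR.

def filter_enabled_models(catalog, disabled_ids):
    """Return {provider: {model_id: enabled}} from a model catalog.

    catalog: list of dicts with "id", "provider", "deprecated" keys.
    disabled_ids: set of model ids the user has disabled.
    """
    result = {}
    for model in catalog:
        if model.get("deprecated"):
            continue  # deprecated models are never offered
        provider = model["provider"]
        result.setdefault(provider, {})[model["id"]] = model["id"] not in disabled_ids
    return result


catalog = [
    {"id": "gpt-x", "provider": "OpenAI", "deprecated": False},
    {"id": "old-model", "provider": "OpenAI", "deprecated": True},
    {"id": "claude-y", "provider": "Anthropic", "deprecated": False},
]
print(filter_enabled_models(catalog, {"claude-y"}))
# → {'OpenAI': {'gpt-x': True}, 'Anthropic': {'claude-y': False}}
```

The real endpoint additionally scopes the disabled list per user via DatabaseVariableService.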
Estimated code review effort: 🎯 4 (Complex) | ⏱️ ~75 minutes

Areas requiring extra attention:
Possibly related PRs
Suggested labels
Suggested reviewers
Pre-merge checks and finishing touches

❌ Failed checks (1 error, 2 warnings)
✅ Passed checks (4 passed)
Thanks for using CodeRabbit! It's free for OSS, and your support helps us grow. If you like it, consider giving us a shout-out.
Actionable comments posted: 3
Caution
Some comments are outside the diff and can’t be posted inline due to platform limitations.
⚠️ Outside diff range comments (18)
src/backend/base/langflow/initial_setup/starter_projects/Text Sentiment Analysis.json (2)
1432-1448: API key label mismatch between code and template. The embedded Python code at lines 1466, 1794, and 2121 declares SecretStrInput(name="api_key", display_name="API Key", ...), but the template definitions retain display_name: "OpenAI API Key". This inconsistency will cause the UI to display "OpenAI API Key" instead of the intended "API Key". Apply this fix to align all three api_key field definitions with the code:
"api_key": { "_input_type": "SecretStrInput", "advanced": false, - "display_name": "OpenAI API Key", + "display_name": "API Key", "dynamic": false, "info": "Model Provider API key", ... }Also applies to: 1760-1776, 2087-2103
1347-1355: Template field definitions conflict with new unified model code. The embedded code (lines 1466, 1794, 2121) declares inputs using a single ModelInput(name="model", ...) and does not reference "provider" or "model_name" fields. However, the template definitions still include separate provider and model_name dropdown fields. Additionally, field_order lists these old field names, which will not match the new input declarations and may cause unexpected UI behavior or loss of configuration state. Update the template to remove the obsolete provider and model_name field definitions, and update field_order to reflect the actual inputs declared in the code:

```diff
 "field_order": [
-  "provider",
-  "model_name",
   "api_key",
   "input_value",
   "system_message",
   "stream",
   "temperature"
 ],
```

And remove the provider and model_name entries from the template field definitions.

Also applies to: 1675-1683, 2002-2010
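Because this field_order drift recurs across several templates in this PR, a throwaway check can surface it mechanically. This is a hypothetical helper for illustration, not part of the codebase:

```python
# Hedged sketch: report field_order entries that no longer exist in a node
# template, and template fields that field_order fails to mention.

def check_field_order(template: dict, field_order: list[str]):
    fields = set(template)
    order = set(field_order)
    return sorted(order - fields), sorted(fields - order)


template = {"model": {}, "api_key": {}, "input_value": {}}
field_order = ["provider", "model_name", "api_key", "input_value"]
stale, missing = check_field_order(template, field_order)
print(stale)    # → ['model_name', 'provider']  (listed but absent from template)
print(missing)  # → ['model']  (present in template but never ordered)
```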
src/backend/base/langflow/initial_setup/starter_projects/Nvidia Remix.json (3)
781-807: Agent field_order still lists legacy provider-specific keys. The Agent moved to unified ModelInput, but field_order references removed/renamed fields (agent_llm, model_name, openai_api_base, temperature, seed, timeout, etc.). This can cause odd ordering and phantom entries in the UI.
Please update to place model first and drop legacy keys.
Apply this minimal diff:
- "field_order": [ - "agent_llm", - "max_tokens", - "model_kwargs", - "json_mode", - "model_name", - "openai_api_base", - "api_key", - "temperature", - "seed", - "max_retries", - "timeout", - "system_prompt", - "tools", - "input_value", - "handle_parsing_errors", - "verbose", - "max_iterations", - "agent_description", - "memory", - "sender", - "sender_name", - "n_messages", - "session_id", - "order", - "template", - "add_current_date_tool" - ], + "field_order": [ + "model", + "api_key", + "system_prompt", + "tools", + "input_value", + "handle_parsing_errors", + "verbose", + "max_iterations", + "format_instructions", + "output_schema", + "n_messages", + "context_id", + "sender", + "sender_name", + "session_id", + "add_current_date_tool" + ],
1156-1179: Truncated Agent system_prompt default. The value ends with “Before ”, which looks truncated. Restore the intended default text.
- "value": "You are a helpful assistant that must use tools to answer questions and perform tasks regarding RTX Remix.\n\nBefore " + "value": "You are a helpful assistant that must use tools to answer questions and perform tasks regarding RTX Remix.\n\nFollow the documentation search instructions first. Cite sources. Ask for clarification when ambiguous."
1778-1797: Critical: EmbeddingModel.model is still a DropdownInput; must be ModelInput. This conflicts with the component code (ModelInput expected) and will break unified provider selection and dynamic options.
Replace the block with ModelInput and clear the hardcoded OpenAI default:
- "model": { - "_input_type": "DropdownInput", - "advanced": false, - "display_name": "Embedding Model", - "dynamic": false, - "info": "Select your model provider", - "name": "model", - "options": [], - "options_metadata": [], - "placeholder": "", - "required": true, - "show": true, - "title_case": false, - "toggle": false, - "tool_mode": false, - "trace_as_metadata": true, - "type": "str", - "value": "text-embedding-3-small" - }, + "model": { + "_input_type": "ModelInput", + "advanced": false, + "display_name": "Embedding Model", + "dynamic": false, + "info": "Select your model provider", + "input_types": ["Embeddings"], + "list": false, + "list_add_label": "Add More", + "model_type": "embedding", + "name": "model", + "options": [], + "providers": [], + "placeholder": "", + "real_time_refresh": true, + "required": true, + "show": true, + "title_case": false, + "tool_mode": false, + "trace_as_input": true, + "type": "model", + "value": "" + },src/backend/base/langflow/initial_setup/starter_projects/Financial Report Parser.json (1)
703-708: Replace llm with model in field_order. The Structured Output node now exposes a required model input, but the field_order array still references the removed llm field. The UI relies on field_order to decide which inputs to render and in what order, so the new model selector never shows up — users can’t choose a provider/model and the template breaks. Please swap llm for model so the field renders correctly.

```diff
-  "llm",
+  "model",
   "input_value",
   "system_prompt",
   "schema_name",
   "output_schema"
```

src/backend/base/langflow/initial_setup/starter_projects/Pokédex Agent.json (2)
1213-1233: field_order still lists removed inputs; align it to the new ModelInput schema. Remove legacy entries (agent_llm, model_name, openai_api_base, temperature, seed, max_retries, timeout, max_tokens, model_kwargs, json_mode) to avoid broken ordering and UI confusion. Keep only fields that exist (model, api_key, system_prompt, n_messages, tools, input_value, handle_parsing_errors, verbose, max_iterations, agent_description, add_current_date_tool).

```diff
-  "field_order": [
-    "agent_llm",
-    "max_tokens",
-    "model_kwargs",
-    "json_mode",
-    "model_name",
-    "openai_api_base",
-    "api_key",
-    "temperature",
-    "seed",
-    "max_retries",
-    "timeout",
-    "system_prompt",
-    "n_messages",
-    "tools",
-    "input_value",
-    "handle_parsing_errors",
-    "verbose",
-    "max_iterations",
-    "agent_description",
-    "add_current_date_tool"
-  ],
+  "field_order": [
+    "model",
+    "api_key",
+    "system_prompt",
+    "n_messages",
+    "tools",
+    "input_value",
+    "handle_parsing_errors",
+    "verbose",
+    "max_iterations",
+    "agent_description",
+    "add_current_date_tool"
+  ],
```
1586-1609: Minor typo in system prompt. “mispelling” → “misspelling”.

```diff
-  "Fix user pokemon name mispelling."
+  "Fix user pokemon name misspelling."
```
- "Fix user pokemon name mispelling." + "Fix user pokemon name misspelling."src/backend/base/langflow/initial_setup/starter_projects/Basic Prompt Chaining.json (3)
1236-1272: Blocking: Component code expects a ModelInput model, but this instance lacks a model field and still exposes legacy provider/model_name. Without a model input, get_llm(model=self.model, …) may fail and the UI won’t show the unified selector. Replace the legacy fields with a ModelInput and align labels.

```diff
 "api_key": {
   "_input_type": "SecretStrInput",
-  "advanced": false,
-  "display_name": "OpenAI API Key",
+  "advanced": true,
+  "display_name": "API Key",
   "dynamic": false,
   "info": "Model Provider API key",
   "input_types": [],
   "load_from_db": true,
   "name": "api_key",
   "password": true,
   "placeholder": "",
   "real_time_refresh": true,
   "required": false,
   "show": true,
   "title_case": false,
   "type": "str",
-  "value": "OPENAI_API_KEY"
+  "value": ""
 },
+"model": {
+  "_input_type": "ModelInput",
+  "advanced": false,
+  "display_name": "Language Model",
+  "dynamic": false,
+  "external_options": {
+    "fields": { "data": { "node": { "display_name": "Connect other models", "icon": "CornerDownLeft", "name": "connect_other_models" } } }
+  },
+  "info": "Select your model provider",
+  "input_types": ["LanguageModel"],
+  "list": false,
+  "model_type": "language",
+  "name": "model",
+  "options": [],
+  "providers": [],
+  "real_time_refresh": true,
+  "required": true,
+  "show": true,
+  "title_case": false,
+  "tool_mode": false,
+  "trace_as_input": true,
+  "type": "model",
+  "value": ""
+},
-"model_name": { ... },
-"provider": { ... },
```

Also update field_order for this node (and the other two LanguageModelComponent nodes) to replace provider/model_name with model and move api_key under it.

```diff
-  "field_order": ["provider","model_name","api_key","input_value","system_message","stream","temperature"]
+  "field_order": ["model","api_key","input_value","system_message","stream","temperature"]
```

Apply the same edits to the nodes at Lines 1558-1594 and 1879-1915.
1558-1594: Replicate the ModelInput migration for all LanguageModelComponent instances. Both additional LanguageModelComponent nodes still use legacy provider/model_name and lack model. Apply the same diff as above.

Also applies to: 1879-1915
1238-1256: Normalize the API key label everywhere. Change “OpenAI API Key” to “API Key” for consistency with unified model providers.

```diff
-  "display_name": "OpenAI API Key",
+  "display_name": "API Key",
```

Also applies to: 1560-1576, 1882-1897
src/backend/base/langflow/initial_setup/starter_projects/Invoice Summarizer.json (2)
1128-1148: field_order still lists legacy inputs; update it to match ModelInput. Prune removed fields and add “model” at the top for clarity.

```diff
-  "field_order": [
-    "agent_llm","max_tokens","model_kwargs","json_mode","model_name","openai_api_base",
-    "api_key","temperature","seed","max_retries","timeout",
-    "system_prompt","n_messages","tools","input_value",
-    "handle_parsing_errors","verbose","max_iterations","agent_description","add_current_date_tool"
-  ],
+  "field_order": [
+    "model","api_key","system_prompt","n_messages","tools","input_value",
+    "handle_parsing_errors","verbose","max_iterations","agent_description","add_current_date_tool"
+  ],
```
1196-1276: Add a "model" field to all 33 Agent component templates in starter projects. The verification revealed a critical issue: ALL Agent components across starter project JSON files (33 total) are missing the required model field in their templates. This contradicts the code, which defines ModelInput with required=True. The template structure must include the model field configuration for the Agent components to function correctly at runtime. Affected files include Invoice Summarizer.json, Instagram Copywriter.json, Search agent.json, and 30 others in src/backend/base/langflow/initial_setup/starter_projects/.

src/backend/base/langflow/initial_setup/starter_projects/Memory Chatbot.json (2)
541-563: Incorrect input type for ChatOutput.input_value (breaks handle wiring). Component code defines HandleInput, but this template uses MessageInput. Connections may fail at runtime.
Apply:
- "input_value": { - "_input_type": "MessageInput", + "input_value": { + "_input_type": "HandleInput", "advanced": false, "display_name": "Inputs", "dynamic": false, "info": "Message to be passed as output.", "input_types": ["Data","DataFrame","Message"], "list": false, "load_from_db": false, "name": "input_value", "placeholder": "", "required": true, "show": true, "title_case": false, - "trace_as_input": true, - "trace_as_metadata": true, - "type": "str", + "trace_as_metadata": true, + "type": "other", "value": "" },
1261-1284: Update langchain package versions in Memory Chatbot.json to match project dependencies. The JSON file references outdated versions that conflict with declared project dependencies:

- langchain_ollama: 0.2.1 → update to 0.3.10 (currently pinned in pyproject.toml)
- langchain_ibm: 0.3.19 → update to >=0.3.8 (project constraint)
- langchain_openai: 0.3.23 → update to >=0.2.12,<1.0.0 (project constraint, or use latest compatible)
- langchain_anthropic: 0.3.14 ✓ (already aligned)

src/backend/base/langflow/initial_setup/starter_projects/SEO Keyword Generator.json (1)
636-705: Fix isinstance() union usage to avoid a runtime TypeError on older Python versions. Using Message | Data | DataFrame | str inside isinstance is only supported from Python 3.10 (PEP 604); on earlier versions it raises TypeError. Replace it with a tuple, which works on every version:

```diff
-        if isinstance(self.input_value, list) and not all(
-            isinstance(item, Message | Data | DataFrame | str) for item in self.input_value
-        ):
+        if isinstance(self.input_value, list) and not all(
+            isinstance(item, (Message, Data, DataFrame, str)) for item in self.input_value
+        ):
@@
-        if not isinstance(
-            self.input_value,
-            Message | Data | DataFrame | str | list | Generator | type(None),
-        ):
+        if not isinstance(
+            self.input_value,
+            (Message, Data, DataFrame, str, list, Generator, type(None)),
+        ):
```

src/backend/base/langflow/initial_setup/starter_projects/Basic Prompting.json (1)
101-110: Field order key mismatch hides a setting. field_order uses store_message but the input is should_store_message. Align them to show it correctly.

```diff
-  "field_order": [
-    "input_value",
-    "store_message",
+  "field_order": [
+    "input_value",
+    "should_store_message",
     "sender",
     "sender_name",
     "session_id",
     "files"
```
463-470: Field order key mismatch hides a setting. field_order uses store_message but the input is should_store_message. Align them to display it.

```diff
-  "field_order": [
-    "input_value",
-    "store_message",
+  "field_order": [
+    "input_value",
+    "should_store_message",
     "sender",
     "sender_name",
     "session_id",
     "data_template"
```
♻️ Duplicate comments (2)
src/backend/base/langflow/initial_setup/starter_projects/Basic Prompting.json (1)
621-725: ChatOutput isinstance union bug already flagged in the SEO file. The same isinstance union issue appears here; apply the earlier fix.
515-616: ChatOutput isinstance union bug already flagged in the SEO file. The same isinstance union issue appears here; apply the earlier fix.
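For reference, the tuple form suggested in the SEO comment behaves identically on every supported Python version, while the `X | Y` form inside `isinstance` requires Python 3.10+:

```python
# Tuples of types are accepted by isinstance() on all Python versions;
# the "str | int" union form inside isinstance() needs Python 3.10+ (PEP 604).
values = ["text", 42, 3.14, b"bytes"]
kept = [v for v in values if isinstance(v, (str, int))]
print(kept)  # → ['text', 42]
```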
🧹 Nitpick comments (27)
src/backend/base/langflow/api/v1/variable.py (1)
73-77: Good addition: correctly filters internal variables. The filtering logic properly excludes variables with the internal naming pattern (__name__) while preserving all user-defined variables. For slightly improved readability, you could extract the condition into a helper:

```python
def is_internal_variable(var: VariableRead) -> bool:
    """Check if a variable is internal (name starts and ends with __)."""
    return bool(var.name and var.name.startswith("__") and var.name.endswith("__"))

# Then use:
return [var for var in all_variables if not is_internal_variable(var)]
```

However, the current inline implementation is also perfectly clear and concise.
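The same check can be exercised standalone; a simplified sketch with plain strings standing in for VariableRead objects (the sample names are invented for illustration):

```python
# Hedged sketch of the internal-variable filter, reduced to bare strings.

def is_internal_name(name: str) -> bool:
    """Internal variables follow the __name__ double-underscore convention."""
    return name.startswith("__") and name.endswith("__")


names = ["OPENAI_API_KEY", "__disabled_models__", "my_var", "__default_model__"]
visible = [n for n in names if not is_internal_name(n)]
print(visible)  # → ['OPENAI_API_KEY', 'my_var']
```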
src/backend/base/langflow/initial_setup/starter_projects/Nvidia Remix.json (4)
898-913: Agent api_key should be advanced, to match the code and reduce clutter. The template marks api_key as advanced: false, while the code sets advanced=True. Make it advanced in the template for consistency.

```diff
 "api_key": {
   "_input_type": "SecretStrInput",
-  "advanced": false,
+  "advanced": true,
   "display_name": "API Key",
   "dynamic": false,
   "info": "Model Provider API key",
```
1658-1674: EmbeddingModel api_key should be advanced, to match the code. The code sets SecretStrInput(advanced=True); the template shows advanced: false. Align it.

```diff
 "api_key": {
   "_input_type": "SecretStrInput",
-  "advanced": false,
+  "advanced": true,
   "display_name": "API Key",
   "dynamic": false,
   "info": "Model Provider API key",
   "input_types": [],
   "load_from_db": false,
   "name": "api_key",
   "password": true,
   "placeholder": "",
   "real_time_refresh": true,
   "required": false,
   "show": true,
   "title_case": false,
   "type": "str",
   "value": ""
 },
```
1580-1591: EmbeddingModel field_order still includes provider; promote model to first. The field order should reflect the unified ModelInput and remove “provider”.

```diff
-  "field_order": [
-    "provider",
-    "model",
+  "field_order": [
+    "model",
     "api_key",
     "api_base",
     "dimensions",
     "chunk_size",
     "request_timeout",
     "max_retries",
     "show_progress_bar",
     "model_kwargs"
   ],
```
1963-1980: Defaulting FAISS allow_dangerous_deserialization to true is risky. The safer default is false; let advanced users opt in.

```diff
 "allow_dangerous_deserialization": {
   "_input_type": "BoolInput",
   "advanced": true,
   "display_name": "Allow Dangerous Deserialization",
   "dynamic": false,
   "info": "Set to True to allow loading pickle files from untrusted sources. Only enable this if you trust the source of the data.",
   "list": false,
   "list_add_label": "Add More",
   "name": "allow_dangerous_deserialization",
   "placeholder": "",
   "required": false,
   "show": true,
   "title_case": false,
   "tool_mode": false,
   "trace_as_metadata": true,
   "type": "bool",
-  "value": true
+  "value": false
 },
```

src/backend/base/langflow/initial_setup/starter_projects/Pokédex Agent.json (2)
706-733: Update the note copy to be provider-agnostic. Replace “Add your OpenAI API key here” with “Add your Model Provider API key here” to match the unified model input.
- "description": "### 💡 Add your OpenAI API key here", + "description": "### 💡 Add your Model Provider API key here",
682-704: Quickstart is still OpenAI-specific; make it generic. Change the prerequisite/steps to “Model Provider API key” and “Language Model” selection to reflect ModelInput.
- "* An [OpenAI API key](https://platform.openai.com/)" + "* A Model Provider API key (e.g., OpenAI, Anthropic, Google)" - "1. Paste your OpenAI API key in the **Agent** component." + "1. Paste your Model Provider API key in the **Agent** component and select a model."src/backend/base/langflow/initial_setup/starter_projects/Basic Prompt Chaining.json (1)
2110-2133: Notes still reference OpenAI only; align them to unified providers. Update the prerequisites and step text to “Model Provider API key” and instruct users to select a model in each Language Model component.
-* [OpenAI API Key](https://platform.openai.com/) -1. In all **Language Model** components, add your OpenAI API key. +* A Model Provider API key (e.g., OpenAI, Anthropic, Google) +1. In all **Language Model** components, add your API key and select a model.src/backend/base/langflow/initial_setup/starter_projects/Invoice Summarizer.json (2)
566-586: Notes still mention “OpenAI API key”; update to provider-agnostic wording. Use “Model Provider API key” in the prerequisites and note bubbles.

```diff
-  "### 💡 Add your OpenAI API key here 👇"
+  "### 💡 Add your Model Provider API key here 👇"
-  "* An [OpenAI API key](https://platform.openai.com/)"
+  "* A Model Provider API key (e.g., OpenAI, Anthropic, Google)"
```

Also applies to: 118-144
118-144: Optional copy tweak: step 3 mentions “OpenAI Key”. Change it to “Model Provider API key” to match the new UI.
-3. In the **Agent** component, add your **OpenAI Key**. +3. In the **Agent** component, add your **Model Provider API key** and select a model.src/backend/base/langflow/initial_setup/starter_projects/Social Media Agent.json (2)
1229-1236: Update the note to provider-agnostic wording. The note still says “Add your OpenAI API key here.” With unified ModelInput, suggest “Add your model provider API key here” to avoid confusion.
- "description": "### 💡 Add your OpenAI API key here ", + "description": "### 💡 Add your model provider API key here ",
1266-1286: Clean up legacy field_order entries. field_order lists removed, provider-specific fields (agent_llm, model_name, openai_api_base, etc.). Replace them with the new unified set, including “model”, to keep UI ordering consistent.

```diff
-  "field_order": [
-    "agent_llm","max_tokens","model_kwargs","json_mode","model_name","openai_api_base",
-    "api_key","temperature","seed","max_retries","timeout","system_prompt","n_messages",
-    "tools","input_value","handle_parsing_errors","verbose","max_iterations",
-    "agent_description","add_current_date_tool"
-  ],
+  "field_order": [
+    "model","api_key","system_prompt","n_messages","tools","input_value",
+    "handle_parsing_errors","verbose","max_iterations","agent_description","add_current_date_tool"
+  ],
```
842-862: Remove legacy fields from field_order. As in the other starters, field_order includes deprecated provider-specific fields. Replace it with the unified list including “model”.

```diff
-  "field_order": ["agent_llm","max_tokens","model_kwargs","json_mode","model_name",
-    "openai_api_base","api_key","temperature","seed","max_retries","timeout","system_prompt",
-    "n_messages","tools","input_value","handle_parsing_errors","verbose","max_iterations",
-    "agent_description","add_current_date_tool"],
+  "field_order": ["model","api_key","system_prompt","n_messages","tools","input_value",
+    "handle_parsing_errors","verbose","max_iterations","agent_description","add_current_date_tool"],
```
1330-1345: Enable credential reuse via variables. api_key has load_from_db=false here, unlike the Agent components. Set it to true to integrate with provider-variable mapping.

```diff
-  "load_from_db": false,
+  "load_from_db": true,
```
1511-1545: Deduplicate the providers list to unique values. model.providers repeats provider names many times; this adds UI noise and costs. Keep unique providers only.

```diff
-  "providers": [
-    "OpenAI", "OpenAI", "OpenAI", "OpenAI", "OpenAI", "OpenAI", "OpenAI", "OpenAI", "OpenAI",
-    "Anthropic","Anthropic","Anthropic","Anthropic","Anthropic","Anthropic","Anthropic","Anthropic",
-    "Google","Google","Google","Google","Google","Google","Google","Google","Google","Google","Google","Google","Google","Google","Google",
-    "Ollama","IBM WatsonX"
-  ],
+  "providers": ["OpenAI", "Anthropic", "Google", "Ollama", "IBM WatsonX"],
```
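An order-preserving way to produce that deduplicated list, shown only to illustrate the suggestion:

```python
# dict.fromkeys keeps first-seen order while dropping duplicates.
providers = [
    "OpenAI", "OpenAI", "OpenAI",
    "Anthropic", "Anthropic",
    "Google", "Google",
    "Ollama", "IBM WatsonX",
]
unique_providers = list(dict.fromkeys(providers))
print(unique_providers)  # → ['OpenAI', 'Anthropic', 'Google', 'Ollama', 'IBM WatsonX']
```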
815-821: Make the note provider-agnostic. The template now supports multiple providers. Update the note from “OpenAI API key” to “API key for the selected provider” and link to the Models docs.

```diff
-  "description": "### 💡 Add your OpenAI API key here",
+  "description": "### 💡 Add your API key for the selected provider here",
```
933-950: Auto-load stored secrets for unified model providers. Set api_key.load_from_db to true so saved provider keys auto-populate.

```diff
-  "load_from_db": false,
+  "load_from_db": true,
```
992-1554: Optional: set a sensible default model value. Leaving model.value empty forces an extra step for newcomers. Consider defaulting to a widely available model (e.g., gpt-4o-mini) and letting users change it.

src/backend/base/langflow/initial_setup/starter_projects/Basic Prompting.json (4)
1506-1539: Deduplicate the providers list to unique values. Reduce model.providers to a unique set to avoid UI clutter.

```diff
-  "providers": [
-    "OpenAI","OpenAI","OpenAI","OpenAI","OpenAI","OpenAI","OpenAI","OpenAI","OpenAI",
-    "Anthropic","Anthropic","Anthropic","Anthropic","Anthropic","Anthropic","Anthropic","Anthropic",
-    "Google","Google","Google","Google","Google","Google","Google","Google","Google","Google","Google","Google","Google","Google","Google",
-    "Ollama","IBM WatsonX"
-  ],
+  "providers": ["OpenAI", "Anthropic", "Google", "Ollama", "IBM WatsonX"],
```
519-526: Make the sticky note provider-agnostic. Change “OpenAI API key” to “API key for the selected provider.”

```diff
-  "description": "### 💡 Add your OpenAI API key here 👇",
+  "description": "### 💡 Add your API key for the selected provider here 👇",
```
481-490: Update the README text to reflect unified providers. Replace “OpenAI API Key” with “API key for the selected provider.” This keeps the instructions accurate.
928-944: Auto-load stored secrets for unified model providers. Enable api_key.load_from_db so saved keys are prefilled.

src/backend/base/langflow/initial_setup/starter_projects/Blog Writer.json (5)
718-723: Make the overview provider-agnostic. Replace “OpenAI API key” with “API key for the selected provider” to match the unified ModelInput.
1718-1737: Prefer dynamic model options over static lists. Since update_build_config populates options dynamically, consider removing the large static options list (and the long duplicated providers list) to reduce drift and maintenance. Alternatively, keep them but dedupe providers to unique values.

Also applies to: 1759-1779, 1795-1817, 1831-1853, 1870-1891, 1903-1925, 1936-1953, 1963-1973
2008-2017: Deduplicate the providers list to unique values. Trim model.providers so each provider name appears only once.

```diff
-  "providers": ["OpenAI","OpenAI",...,"Google"],
+  "providers": ["OpenAI", "Anthropic", "Google"],
```
1390-1418: Expose consistent outputs metadata or drop the nulls. options and required_inputs are null for outputs here but omitted in other templates. Standardize (omit when unknown) to avoid schema variance.
826-918: ParserComponent: scope the Clean Data toggle to Stringify mode only. update_build_config adds clean_data whenever field_value is truthy (always). Gate it on self.mode == "Stringify" and remove it otherwise.

```diff
-        if field_value:
+        if self.mode == "Stringify":
             clean_data = BoolInput(
                 name="clean_data",
                 display_name="Clean Data",
@@
-        else:
+        else:
             build_config.pop("clean_data", None)
```
📜 Review details
Configuration used: Path: .coderabbit.yaml
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (23)
- src/backend/base/langflow/api/router.py (2 hunks)
- src/backend/base/langflow/api/v1/__init__.py (2 hunks)
- src/backend/base/langflow/api/v1/model_options.py (1 hunks)
- src/backend/base/langflow/api/v1/models.py (1 hunks)
- src/backend/base/langflow/api/v1/variable.py (4 hunks)
- src/backend/base/langflow/base/models/openai_constants.py (0 hunks)
- src/backend/base/langflow/initial_setup/starter_projects/Basic Prompt Chaining.json (3 hunks)
- src/backend/base/langflow/initial_setup/starter_projects/Basic Prompting.json (19 hunks)
- src/backend/base/langflow/initial_setup/starter_projects/Blog Writer.json (26 hunks)
- src/backend/base/langflow/initial_setup/starter_projects/Custom Component Generator.json (1 hunks)
- src/backend/base/langflow/initial_setup/starter_projects/Financial Report Parser.json (16 hunks)
- src/backend/base/langflow/initial_setup/starter_projects/Invoice Summarizer.json (4 hunks)
- src/backend/base/langflow/initial_setup/starter_projects/Memory Chatbot.json (20 hunks)
- src/backend/base/langflow/initial_setup/starter_projects/News Aggregator.json (4 hunks)
- src/backend/base/langflow/initial_setup/starter_projects/Nvidia Remix.json (9 hunks)
- src/backend/base/langflow/initial_setup/starter_projects/Pokédex Agent.json (4 hunks)
- src/backend/base/langflow/initial_setup/starter_projects/Price Deal Finder.json (4 hunks)
- src/backend/base/langflow/initial_setup/starter_projects/Research Translation Loop.json (1 hunks)
- src/backend/base/langflow/initial_setup/starter_projects/SEO Keyword Generator.json (19 hunks)
- src/backend/base/langflow/initial_setup/starter_projects/SaaS Pricing.json (4 hunks)
- src/backend/base/langflow/initial_setup/starter_projects/Search agent.json (4 hunks)
- src/backend/base/langflow/initial_setup/starter_projects/Social Media Agent.json (4 hunks)
- src/backend/base/langflow/initial_setup/starter_projects/Text Sentiment Analysis.json (3 hunks)
💤 Files with no reviewable changes (1)
- src/backend/base/langflow/base/models/openai_constants.py
🧰 Additional context used
📓 Path-based instructions (1)
{src/backend/**/*.py,tests/**/*.py,Makefile}
📄 CodeRabbit inference engine (.cursor/rules/backend_development.mdc)
{src/backend/**/*.py,tests/**/*.py,Makefile}: Run make format_backend to format Python code before linting or committing changes
Run make lint to perform linting checks on backend Python code
Files:
- src/backend/base/langflow/api/v1/__init__.py
- src/backend/base/langflow/api/v1/variable.py
- src/backend/base/langflow/api/v1/model_options.py
- src/backend/base/langflow/api/router.py
- src/backend/base/langflow/api/v1/models.py
🧠 Learnings (8)
📚 Learning: 2025-07-18T18:25:54.486Z
Learnt from: CR
Repo: langflow-ai/langflow PR: 0
File: .cursor/rules/backend_development.mdc:0-0
Timestamp: 2025-07-18T18:25:54.486Z
Learning: Applies to src/backend/base/langflow/services/database/models/**/*.py : Place database models in src/backend/base/langflow/services/database/models/
Applied to files:
- src/backend/base/langflow/api/v1/__init__.py
- src/backend/base/langflow/api/router.py
- src/backend/base/langflow/api/v1/models.py
📚 Learning: 2025-07-18T18:25:54.486Z
Learnt from: CR
Repo: langflow-ai/langflow PR: 0
File: .cursor/rules/backend_development.mdc:0-0
Timestamp: 2025-07-18T18:25:54.486Z
Learning: Applies to src/backend/base/langflow/components/**/__init__.py : Update __init__.py with alphabetical imports when adding new components
Applied to files:
- src/backend/base/langflow/api/v1/__init__.py
- src/backend/base/langflow/initial_setup/starter_projects/Basic Prompt Chaining.json
📚 Learning: 2025-08-11T16:52:26.755Z
Learnt from: edwinjosechittilappilly
Repo: langflow-ai/langflow PR: 9336
File: src/backend/base/langflow/base/models/openai_constants.py:29-33
Timestamp: 2025-08-11T16:52:26.755Z
Learning: The "gpt-5-chat-latest" model in the OpenAI models configuration does not support tool calling, so tool_calling should be set to False for this model in src/backend/base/langflow/base/models/openai_constants.py.
Applied to files:
- src/backend/base/langflow/initial_setup/starter_projects/Pokédex Agent.json
- src/backend/base/langflow/initial_setup/starter_projects/Search agent.json
- src/backend/base/langflow/initial_setup/starter_projects/News Aggregator.json
- src/backend/base/langflow/initial_setup/starter_projects/Invoice Summarizer.json
- src/backend/base/langflow/initial_setup/starter_projects/Basic Prompting.json
- src/backend/base/langflow/initial_setup/starter_projects/Memory Chatbot.json
- src/backend/base/langflow/initial_setup/starter_projects/SaaS Pricing.json
- src/backend/base/langflow/initial_setup/starter_projects/Price Deal Finder.json
- src/backend/base/langflow/initial_setup/starter_projects/Nvidia Remix.json
- src/backend/base/langflow/initial_setup/starter_projects/Blog Writer.json
- src/backend/base/langflow/initial_setup/starter_projects/SEO Keyword Generator.json
📚 Learning: 2025-07-18T18:25:54.486Z
Learnt from: CR
Repo: langflow-ai/langflow PR: 0
File: .cursor/rules/backend_development.mdc:0-0
Timestamp: 2025-07-18T18:25:54.486Z
Learning: Starter project files auto-format after langflow run; these formatting changes can be committed or ignored
Applied to files:
src/backend/base/langflow/initial_setup/starter_projects/Basic Prompt Chaining.json
src/backend/base/langflow/initial_setup/starter_projects/Invoice Summarizer.json
src/backend/base/langflow/initial_setup/starter_projects/Basic Prompting.json
src/backend/base/langflow/initial_setup/starter_projects/Memory Chatbot.json
src/backend/base/langflow/initial_setup/starter_projects/Custom Component Generator.json
src/backend/base/langflow/initial_setup/starter_projects/Text Sentiment Analysis.json
src/backend/base/langflow/initial_setup/starter_projects/Blog Writer.json
src/backend/base/langflow/initial_setup/starter_projects/SEO Keyword Generator.json
📚 Learning: 2025-06-26T19:43:18.260Z
Learnt from: ogabrielluiz
Repo: langflow-ai/langflow PR: 0
File: :0-0
Timestamp: 2025-06-26T19:43:18.260Z
Learning: In langflow custom components, the `module_name` parameter is now propagated through template building functions to add module metadata and code hashes to frontend nodes for better component tracking and debugging.
Applied to files:
src/backend/base/langflow/initial_setup/starter_projects/Basic Prompting.json
src/backend/base/langflow/initial_setup/starter_projects/Custom Component Generator.json
src/backend/base/langflow/initial_setup/starter_projects/Research Translation Loop.json
📚 Learning: 2025-09-30T00:09:51.631Z
Learnt from: CR
Repo: langflow-ai/langflow PR: 0
File: .cursor/rules/docs_development.mdc:0-0
Timestamp: 2025-09-30T00:09:51.631Z
Learning: Applies to docs/**/*.{md,mdx} : Use correct terminology capitalization: Langflow, Component, Flow, API, JSON
Applied to files:
src/backend/base/langflow/initial_setup/starter_projects/Basic Prompting.json
src/backend/base/langflow/initial_setup/starter_projects/Custom Component Generator.json
src/backend/base/langflow/initial_setup/starter_projects/Blog Writer.json
src/backend/base/langflow/initial_setup/starter_projects/SEO Keyword Generator.json
📚 Learning: 2025-06-23T12:46:42.048Z
Learnt from: CR
Repo: langflow-ai/langflow PR: 0
File: .cursor/rules/frontend_development.mdc:0-0
Timestamp: 2025-06-23T12:46:42.048Z
Learning: Custom React Flow node types should be implemented as memoized components, using Handle components for connection points and supporting optional icons and labels.
Applied to files:
src/backend/base/langflow/initial_setup/starter_projects/Basic Prompting.json
src/backend/base/langflow/initial_setup/starter_projects/Memory Chatbot.json
src/backend/base/langflow/initial_setup/starter_projects/Blog Writer.json
src/backend/base/langflow/initial_setup/starter_projects/SEO Keyword Generator.json
src/backend/base/langflow/initial_setup/starter_projects/Financial Report Parser.json
📚 Learning: 2025-06-23T12:46:42.048Z
Learnt from: CR
Repo: langflow-ai/langflow PR: 0
File: .cursor/rules/frontend_development.mdc:0-0
Timestamp: 2025-06-23T12:46:42.048Z
Learning: React Flow should be used for flow graph visualization, with nodes and edges passed as props, and changes handled via onNodesChange and onEdgesChange callbacks.
Applied to files:
src/backend/base/langflow/initial_setup/starter_projects/Basic Prompting.json
src/backend/base/langflow/initial_setup/starter_projects/Financial Report Parser.json
🧬 Code graph analysis (3)
src/backend/base/langflow/api/v1/variable.py (2)
- src/lfx/src/lfx/base/models/unified_models.py (2): `get_model_provider_variable_mapping` (102-103), `validate_model_provider_key` (248-319)
- src/backend/base/langflow/services/variable/service.py (1): `get_all` (152-171)

src/backend/base/langflow/api/v1/model_options.py (2)
- src/lfx/src/lfx/base/models/unified_models.py (2): `get_embedding_model_options` (498-680), `get_language_model_options` (322-495)
- src/backend/tests/unit/api/v2/test_mcp_servers_file.py (1): `current_user` (83-87)

src/backend/base/langflow/api/v1/models.py (4)
- src/lfx/src/lfx/base/models/unified_models.py (3): `get_model_provider_variable_mapping` (102-103), `get_model_providers` (106-108), `get_unified_models_detailed` (111-198)
- src/backend/base/langflow/services/deps.py (1): `get_variable_service` (101-110)
- src/backend/base/langflow/services/variable/service.py (5): `get_all` (152-171), `get_variable_object` (115-129), `update_variable_fields` (213-242), `create_variable` (267-293), `delete_variable` (244-256)
- src/backend/base/langflow/services/database/models/variable/model.py (1): `VariableUpdate` (67-72)
🔇 Additional comments (33)
src/backend/base/langflow/initial_setup/starter_projects/Price Deal Finder.json (4)
1693-1710: API Key configuration update is appropriate. The api_key field has been correctly updated to be provider-agnostic and marked as advanced. The change from "OpenAI API Key" to "Model Provider API key" and the addition of `real_time_refresh: true` align well with the unified model provider approach.
1836-1871: ModelInput configuration correctly replaces agent_llm. The new `model` input properly replaces the previous OpenAI-specific `agent_llm` dropdown with a unified ModelInput that supports dynamic provider selection. The `external_options` field enables the "Connect other models" workflow, and `real_time_refresh: true` ensures provider options stay current. Confirm that the ModelInput class supports the `external_options` field with the node structure shown in lines 1841-1851.
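A minimal sketch of the template shape under discussion may help. The field names below mirror the starter-project JSON described in this review (`external_options`, `model_type`, `real_time_refresh`), but this is an illustrative dict, not the actual Langflow schema, and the `has_external_options` helper is hypothetical:

```python
# Illustrative ModelInput template block; field names follow the reviewed
# starter-project JSON, not the real Langflow schema definition.
model_input_template = {
    "model": {
        "_input_type": "ModelInput",
        "display_name": "Model",
        "model_type": "language",   # scopes options to language models
        "options": [],              # populated dynamically at runtime
        "providers": [],            # populated dynamically at runtime
        "external_options": {       # enables "Connect other models"
            "input_types": ["LanguageModel"],
        },
        "real_time_refresh": True,  # keeps provider options current
        "required": True,
    }
}


def has_external_options(template: dict, field: str = "model") -> bool:
    """Check whether a template field exposes the external-options hook."""
    return bool(template.get(field, {}).get("external_options"))


print(has_external_options(model_input_template))  # True
```

A check like this is the kind of assertion the verification step above would need: the hook is present exactly when the `external_options` key is non-empty.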
1711-1727: Code implementation updated for unified model management. The code field has been updated to import and use the new unified model functions (`get_llm`, `get_language_model_options`, `update_model_options_in_build_config`). The integration of ModelInput alongside these utilities appears correct and supports the provider-agnostic model selection approach. The code implementation is substantial. Verify that:
- The new `update_model_options_in_build_config` function handles the field_value parameter correctly (noted in the AI summary as accepting `list[dict]` instead of string)
- All new imports from `lfx.base.models.unified_models` are exported and available
1605-1625: langchain_core version 0.3.79 is a valid release from October 9, 2025. The dependency specification in the JSON is correct and compatible: Pydantic 2.10.6 satisfies langchain_core's requirement for pydantic>=2.7.4. No changes are needed.
src/backend/base/langflow/api/v1/variable.py (1)
4-4: LGTM: Clean integration with unified model providers.The import and module-level initialization correctly set up the infrastructure for model provider variable validation.
Also applies to: 14-14
src/backend/base/langflow/initial_setup/starter_projects/Nvidia Remix.json (2)
1039-1074: ModelInput (language) block looks correct; verify dynamic loading. The ModelInput config for the Agent aligns with unified providers (input_types, providers, options, real_time_refresh). Please verify that options/providers populate via the new /models endpoints in the Playground.
Steps:
- Open this starter, toggle model dropdown, confirm providers/options load; switch users with different enabled providers and re-check.
1597-1610: EmbeddingModel metadata OK, but the template still uses a Dropdown for model. The component code switched to ModelInput; see the fix below for the template block to avoid a runtime/UI mismatch.
src/backend/base/langflow/initial_setup/starter_projects/Pokédex Agent.json (1)
1327-1343: Good: API key label generalized. Secret input renamed to “API Key” with provider-agnostic info. No blockers.
src/backend/base/langflow/initial_setup/starter_projects/Invoice Summarizer.json (2)
1242-1258: Good: API key label generalized. Matches the unified approach. LGTM.
1384-1419: ModelInput block present and correct. Includes providers/external_options and required flags. LGTM.
src/backend/base/langflow/initial_setup/starter_projects/Basic Prompt Chaining.json (1)
1269-1272: Templates still use the old provider/model_name configuration and must be updated to work with the migrated LanguageModelComponent. The component code is correctly migrated to the unified flow, but the starter_projects templates still configure separate "provider" and "model_name" fields instead of the unified ModelInput "model" field. This mismatch will cause rendering failures in the Playground. Affected templates: Research Translation Loop.json, Youtube Analysis.json, Twitter Thread Generator.json, Text Sentiment Analysis.json, and SEO Keyword Generator.json each have LanguageModelComponent nodes with incompatible field configurations.
⛔ Skipped due to learnings
- Learnt from: edwinjosechittilappilly, Repo: langflow-ai/langflow, PR: 0, File: :0-0, Timestamp: 2025-08-07T20:23:23.569Z — Some Langflow starter project files and components still use `from loguru import logger` instead of the centralized structlog logger from `langflow.logging.logger`. These should be updated to ensure consistent structured logging across the entire codebase.
- Learnt from: ogabrielluiz, Repo: langflow-ai/langflow, PR: 0, File: :0-0, Timestamp: 2025-06-26T19:43:18.260Z — In langflow custom components, the `module_name` parameter is now propagated through template building functions to add module metadata and code hashes to frontend nodes for better component tracking and debugging.
- Learnt from: CR, Repo: langflow-ai/langflow, PR: 0, File: .cursor/rules/testing.mdc:0-0, Timestamp: 2025-07-21T14:16:14.125Z — Applies to src/backend/tests/**/*.py : Use 'MockLanguageModel' for testing language model components without external API calls in backend Python tests.

src/backend/base/langflow/initial_setup/starter_projects/Custom Component Generator.json (4)
2653-2681: LGTM: Provider and model_name fields properly removed. The removal of static provider/model_name dropdown fields in favor of dynamic ModelInput is correct and aligns with the PR's unified model provider approach. The old fields are no longer present in the template, which is expected.
2175-2210: The review comment appears to be based on incorrect assumptions about the codebase. The Custom Component Generator.json file does not contain `external_options` fields or "Connect other models" configurations at lines 2175-2210 or anywhere else in the file. The specified lines contain a ChatInput node configuration. The `external_options` structure exists only at the Python code level in src/lfx/src/lfx/inputs/inputs.py (the ModelInput class definition), not in this JSON starter project file. Likely an incorrect or invalid review comment.
2628-2629: Based on verification, user_id is an optional component attribute ("Identifier of the user associated with the flow, if any") typed as `str | None`. Langflow code defensively checks `hasattr(self, "_user_id")` before accessing user_id, suggesting that direct use without checks could be problematic. However, the code in the review comment calls `get_llm(..., user_id=self.user_id, ...)` without verifying that `user_id` exists or is not None, which could pass None to the function. The actual severity depends on how `get_llm` handles None values.
2630-2651: All helper functions verified at the expected import paths with matching signatures. The functions exist in `lfx.base.models.unified_models` and match the expected signatures:
- `get_llm` (line 683): accepts model, user_id, api_key, temperature, and stream as keyword-only parameters
- `get_language_model_options` (line 322): accepts an optional user_id parameter
- `update_model_options_in_build_config` (line 747): accepts component, build_config, cache_key_prefix, get_options_func, and optional field_name/field_value

The embedded Python code in the JSON file correctly imports and invokes all three functions with proper parameter passing.
src/backend/base/langflow/initial_setup/starter_projects/Search agent.json (9)
945-945: Note code_hash update. The code_hash has been updated from a previous version to `0ee0559dee10`, reflecting substantial changes to the Agent component code. Ensure this hash is correctly computed and matches the actual code content.
1032-1049: Update api_key field presentation for the model-agnostic approach. The api_key field has been updated:
- Label changed from provider-specific (e.g., "OpenAI API Key") to generic "API Key"
- Info updated to "Model Provider API key"
- Advanced flag set to `true`

This properly reflects the unified model provider approach. However, verify that the generic label doesn't cause user confusion when different providers require different API key types.
1175-1210: LGTM: ModelInput configuration for unified model selection. The ModelInput definition properly replaces provider-specific fields with:
- Dynamic model and provider options (populated at runtime)
- `model_type: "language"` for language models
- `real_time_refresh: true` for dynamic updates
- `external_options` for connecting other model outputs
- Proper input_types and metadata

This aligns well with the new unified model provider API.
1065-1067: Question: Verify that the memory inputs removal doesn't break existing flows. Line 1066 includes the comment `# removed memory inputs from agent component`. Ensure that:
- Existing Agent flows that previously connected memory components still work
- The new `get_memory_data()` method provides equivalent functionality
- There is no data loss or behavioral change for users upgrading

The removal of memory inputs is a breaking change. Verify in migration notes or upgrade documentation that users are informed about this change and any required flow adjustments.
1088-1116: LGTM: Format instructions and schema validation properly integrated. The new fields for structured output (format_instructions, output_schema) are well-designed:
- Format instructions provide clear guidance for JSON extraction
- Output schema table allows users to define expected structure
- Default format instructions are sensible and comprehensive
This enables the Agent to support structured outputs, which is a valuable feature addition.
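The validation path this feature implies can be sketched as follows. The column name `"name"` in the schema table and the fallback-to-empty-dict behavior are assumptions for illustration, not the exact Langflow output_schema contract:

```python
import json

# Hypothetical sketch: validate model output against an output_schema
# table of the kind described above. Column names are illustrative.
def validate_output(raw_text: str, output_schema: list[dict]) -> dict:
    try:
        data = json.loads(raw_text)
    except json.JSONDecodeError:
        return {}  # fallback when the model did not return valid JSON
    missing = [col["name"] for col in output_schema if col["name"] not in data]
    if missing:
        raise ValueError(f"missing fields: {missing}")
    return data


schema = [{"name": "sentiment", "type": "str"},
          {"name": "score", "type": "float"}]
result = validate_output('{"sentiment": "positive", "score": 0.91}', schema)
print(result["sentiment"])  # positive
```

The important design point is the same one the review raises later: schema validation errors propagate, while unparseable output degrades to a safe fallback value.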
1156-1210: LGTM: ModelInput integration with external_options properly configured. The ModelInput includes:
- `external_options` for connecting other language models
- The `model_type: "language"` designation
- Empty `options` and `providers` arrays (dynamically populated)
- Proper metadata for frontend rendering

This maintains backward compatibility with flows that might connect external model components.
952-955: langchain_core 0.3.79 is a safe patch release with no breaking changes. Version 0.3.79 is a patch release that fixes chunk construction with tool calls, with no breaking changes introduced. The dependency update is compatible with the agent implementation.
1066-1067: Based on the verification results from the script execution, the three key functions referenced in the review comment are properly implemented in the codebase:
- `get_llm()`: confirmed in multiple components with the correct signature `get_llm(model=..., user_id=..., api_key=...)`
- `get_memory_data()`: verified in agent.py at line 406 and properly called in `get_agent_requirements()` at line 177
- `build_structured_output_base()`: confirmed as async in agent.py at line 250 and properly integrated into the `json_response()` method

The implementation shows proper error handling with fallbacks (the try-except blocks catch various exception types), and the memory retrieval filters duplicate messages using `getattr()` for safety. The structured output method handles schema validation with proper error propagation and returns appropriate fallback values when validation fails.
All verifications passed - no actionable issues detected.
The Agent component refactoring demonstrates:
- ✓ Correct use of `get_llm()` with appropriate parameters
- ✓ Proper memory retrieval implementation with duplicate handling
- ✓ Comprehensive structured output validation with fallback mechanisms

The code is well-integrated, with appropriate exception handling for edge cases (schema validation errors, missing schemas, JSON parsing failures).
1197-1208: Verify whether the template `model_type` field is actually respected by the backend or frontend when populating model options. Investigation findings:
- `real_time_refresh: true` — ✓ Correctly implemented. The frontend checks this flag in three locations and triggers dynamic updates/mutations when set.
- `model_type: "language"` — ⚠️ Partially verified. The frontend ModelInputComponent infers the model type from the first option's metadata (`options[0]?.metadata?.model_type`) rather than consuming the template field directly, and does not accept `model_type` as a prop. This means:
  - If the backend pre-filters options based on the template's `model_type` field, filtering works correctly.
  - If the backend ignores this field and returns all models, the component still infers the correct type from the data received.
  - Either way, the template field may be unused or redundant.

Action: Confirm with the backend team or code review that the template's `model_type` field influences the options array returned, or document that it is metadata-only.

src/backend/base/langflow/initial_setup/starter_projects/Research Translation Loop.json (1)
1404-1405: Verify the model field default value and dynamic population strategy. Line 1404 shows the LanguageModelComponent code value transitioning to the new unified model approach. The model field (lines 1408-1442 in the template) has `"options": []` and `"providers": []`, indicating dynamic population at runtime. Ensure:
- The frontend correctly handles empty initial options and populates them via API call
- Users see appropriate model options on component load (not a blank dropdown)
- The update_build_config logic properly integrates with the new model options endpoint

If dynamic loading is not working, consider adding a sensible default or first-loaded option.
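The refresh-time population described above can be illustrated with a small sketch. The real `update_model_options_in_build_config` in `lfx` works differently internally (caching, user scoping); this only shows the shape of filling empty `options`/`providers` from a fetched catalog, and the `deprecated` flag handling is an assumption:

```python
# Illustrative sketch of populating an empty ModelInput build_config
# from a fetched model catalog at refresh time.
def populate_model_options(build_config: dict, fetched: list[dict]) -> dict:
    field = build_config["model"]
    # Skip deprecated entries; keep catalog order for the dropdown.
    field["options"] = [m["id"] for m in fetched if not m.get("deprecated")]
    field["providers"] = sorted({m["provider"] for m in fetched})
    return build_config


config = {"model": {"options": [], "providers": [], "real_time_refresh": True}}
catalog = [
    {"id": "gpt-4o-mini", "provider": "OpenAI", "deprecated": False},
    {"id": "claude-3-haiku", "provider": "Anthropic", "deprecated": False},
    {"id": "text-davinci-003", "provider": "OpenAI", "deprecated": True},
]
populate_model_options(config, catalog)
print(config["model"]["options"])    # ['gpt-4o-mini', 'claude-3-haiku']
print(config["model"]["providers"])  # ['Anthropic', 'OpenAI']
```

Whatever the real implementation does, the empty-array default means the frontend must tolerate a dropdown with no options until the first refresh completes, which is exactly the concern raised above.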
src/backend/base/langflow/initial_setup/starter_projects/News Aggregator.json (4)
1178-1178: Code hash and dependency updates reflect the Agent component refactoring. The code_hash update to "0ee0559dee10" and the addition of langchain_core 0.3.79 to dependencies indicate substantial changes in the Agent component implementation. This is appropriate given the shift to the unified model approach. However, ensure:
- The version langchain_core 0.3.79 is compatible with other langchain dependencies (especially langchain_openai 0.3.23)
- Integration tests verify Agent behavior with the new get_llm-based initialization
- Any breaking changes in langchain_core are handled in the Agent code
Also applies to: 1185-1188
1267-1281: API key field properly generalized for multi-provider support. The api_key field changes are appropriate:
- Display name changed from "OpenAI API Key" to generic "API Key" ✓
- Marked as advanced=true (appropriate for optional provider-specific config) ✓
- Info text updated to "Model Provider API key" ✓
This allows the same Agent component to work with OpenAI, Anthropic, Google, and other providers. The integration with the unified model system should automatically handle provider-specific secrets.
1408-1442: Model input configuration supports dynamic provider and model discovery. The new model field uses the ModelInput type with:
- `"external_options"` (lines 1413-1423) allowing connection of non-standard model sources
- `"model_type": "language"` for scoping to language models
- Empty `"options"` and `"providers"` arrays for runtime population

This design is sound for the centralized model discovery pattern. Key implementation details to verify:
- The `update_model_options_in_build_config` call within Agent.update_build_config() correctly retrieves both providers and model options from the /models API
- The `real_time_refresh: true` flag ensures options update when field_value changes
- The `external_options` section provides a UI affordance for connecting alternative models
1299-1300: Agent code implementation properly transitions to the unified model approach. The embedded Agent component code (lines 1299-1300) shows:
- Correct imports from `lfx.base.models.unified_models` ✓
- Proper use of `get_llm()` for model instantiation with user context ✓
- Integration of `update_model_options_in_build_config()` in the update_build_config method ✓
- Removal of provider-specific branching logic (proper simplification) ✓

However, the code snippet is truncated in the provided context. To fully verify:
- Ensure all references to old fields (provider, model_name, openai_api_base, etc.) have been removed from the execution path
- Verify error handling for a missing model or invalid model selection
- Confirm that `get_llm()` correctly propagates the temperature and stream parameters

src/backend/base/langflow/initial_setup/starter_projects/Social Media Agent.json (1)
1518-1553: ModelInput migration looks good. The unified “model” input with external_options and real_time_refresh is aligned with the new provider flow.
Please confirm the options are being populated via update_model_options_in_build_config at runtime in this starter (no hard-coded options expected).
src/backend/base/langflow/initial_setup/starter_projects/SaaS Pricing.json (1)
1098-1133: Unified ModelInput adoption LGTM. ModelInput with external_options/real_time_refresh is correct for the centralized providers API.
src/backend/base/langflow/initial_setup/starter_projects/Memory Chatbot.json (2)
1245-1264: LanguageModelComponent migration looks solid overall. The docs link, inputs (model, api_key, system_message, stream, temperature), and build_model() using get_llm are consistent with the unified flow.
Also applies to: 1330-1363
1399-1441: Revert cautionary note: "gpt-5-chat-latest" does support tool calling. The review comment warns against tool-calling support for "gpt-5-chat-latest" based on a learning from August. However, OpenAI's GPT-5 (including gpt-5-chat-latest) supports tool calling, and LangChain's ChatOpenAI already exposes gpt-5-chat-latest. The underlying assumption is outdated; remove the warning.
Likely an incorrect or invalid review comment.
|
@coderabbitai Can you do an assessment of the (1) quality, (2) security robustness, and (3) merge worthiness of this PR? and please ensure you're using the LATEST commit hash: 03183df |
|
@erichare I've completed a comprehensive assessment of PR #10565. Here are my findings:
|
PR Summary: Unified Model Providers Configuration
Overview
This PR introduces a centralized system for managing AI model providers and configurations across Langflow, moving from provider-specific implementations to a unified, dynamic approach.
Key Features Added
1. New API Endpoints (`/models` and `/model_options`)

Model Catalog Management:
User-Specific Model Management:
Model Options:
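The enabled-models response described in the walkthrough's sequence diagram groups models by provider and flags each against a per-user disabled list. A self-contained sketch of that grouping (function and field names here are illustrative, not the actual endpoint code):

```python
# Sketch of the {provider: {model_id: enabled}} response shape from the
# enabled-models endpoint described in the walkthrough.
def enabled_models(catalog: list[dict], disabled: set[str]) -> dict:
    result: dict[str, dict[str, bool]] = {}
    for m in catalog:
        result.setdefault(m["provider"], {})[m["id"]] = m["id"] not in disabled
    return result


catalog = [
    {"id": "gpt-4o-mini", "provider": "OpenAI"},
    {"id": "gpt-4o", "provider": "OpenAI"},
    {"id": "claude-3-haiku", "provider": "Anthropic"},
]
resp = enabled_models(catalog, disabled={"gpt-4o"})
print(resp["OpenAI"])  # {'gpt-4o-mini': True, 'gpt-4o': False}
```

Storing only the disabled list (rather than the enabled one) means newly added catalog models default to enabled for every user, which matches the opt-out design implied above.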
2. Variable Management

- Filters internal (`__prefix__`) variables from API responses

3. Component Refactoring (20+ Starter Projects)
Unified all language model components from provider-specific implementations to a generic approach:
Before:
After:
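The concrete before/after snippets were stripped from this summary, so here is an illustrative reconstruction of the shape of the change. The field names follow the template diffs reviewed above (`agent_llm` dropdown plus a provider-specific key replaced by a single ModelInput plus a generic key), but this is not exact component code:

```python
# Illustrative reconstruction of the before/after template shapes.
before = {  # provider-specific: dropdown and key tied to one provider
    "agent_llm": {"type": "Dropdown", "options": ["gpt-4o", "gpt-4o-mini"]},
    "openai_api_key": {"type": "Secret", "display_name": "OpenAI API Key"},
}

after = {  # unified: one dynamic ModelInput plus a generic key
    "model": {"type": "ModelInput", "options": [], "providers": [],
              "real_time_refresh": True},
    "api_key": {"type": "Secret", "display_name": "API Key"},
}

print("agent_llm" in after, "model" in after)  # False True
```

The net effect is that provider choice moves out of the component definition and into runtime data.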
Affected Projects:
4. Cleanup

- Removed re-exported OpenAI model constants from `openai_constants.py`

Technical Highlights

- `DatabaseVariableService` for user-scoped variable storage (disabled models, default models)
- `get_llm()` and `update_model_options_in_build_config()` helpers for runtime model resolution
This is a significant architectural improvement that standardizes model handling across the entire Langflow platform.