Conversation
Ollama models now work with the AI SDK provider format (e.g., `ollama/qwen3:1.7b`). The fix addresses two issues:

1. Filter out undefined values when checking if `clientOptions` has meaningful data
2. Skip error logging for providers that intentionally don't need API keys

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude Haiku 4.5 <noreply@anthropic.com>
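The two fixes can be sketched as below. The helper names are hypothetical; the actual logic lives inside `getAISDKLanguageModel` and `loadApiKeyFromEnv`.

```typescript
// Fix 1 (sketch): clientOptions counts as "meaningful" only when at
// least one value is defined, so { apiKey: undefined } no longer
// forces the API-key provider path.
function hasMeaningfulOptions(
  clientOptions?: Record<string, unknown>,
): boolean {
  return (
    clientOptions !== undefined &&
    Object.values(clientOptions).some((v) => v !== undefined)
  );
}

// Fix 2 (sketch): providers that intentionally run without API keys
// are skipped when logging "missing env var" warnings. The set name
// matches the one shown in the sequence diagram below.
const providersWithoutApiKey = new Set(["ollama"]);

function shouldWarnAboutMissingKey(provider: string): boolean {
  return !providersWithoutApiKey.has(provider);
}
```

With this check, `{ apiKey: undefined }` behaves the same as passing no `clientOptions` at all, which is what lets the default ollama provider be used.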
Greptile Summary

Fixes Ollama support by properly detecting when `clientOptions` contains no meaningful (non-undefined) values.

The root cause was that when using a format like `ollama/qwen3:1.7b`, a `clientOptions` object whose values were all undefined was still treated as meaningful, routing ollama through the API-key provider path.

Confidence Score: 5/5
Sequence Diagram

```mermaid
sequenceDiagram
    participant User
    participant LLMProvider
    participant getAISDKLanguageModel
    participant AISDKProviders
    participant AISDKProvidersWithAPIKey
    participant loadApiKeyFromEnv
    User->>LLMProvider: getClient("ollama/qwen3:1.7b", clientOptions)
    LLMProvider->>getAISDKLanguageModel: call with subProvider="ollama", clientOptions
    Note over getAISDKLanguageModel: NEW: Check if clientOptions<br/>has meaningful values
    getAISDKLanguageModel->>getAISDKLanguageModel: Object.values(clientOptions)<br/>.some(v => v !== undefined)
    alt clientOptions has meaningful values
        getAISDKLanguageModel->>AISDKProvidersWithAPIKey: lookup provider with API key
        AISDKProvidersWithAPIKey-->>getAISDKLanguageModel: throw error (ollama not in map)
    else no meaningful values (ollama case)
        getAISDKLanguageModel->>AISDKProviders: lookup default provider
        AISDKProviders-->>getAISDKLanguageModel: return ollama provider
    end
    getAISDKLanguageModel-->>LLMProvider: return language model
    User->>loadApiKeyFromEnv: load API key for "ollama"
    Note over loadApiKeyFromEnv: NEW: Check if provider<br/>needs API key
    loadApiKeyFromEnv->>loadApiKeyFromEnv: providersWithoutApiKey.has("ollama")
    alt provider needs API key
        loadApiKeyFromEnv->>loadApiKeyFromEnv: log warning if env var not found
    else provider doesn't need API key (ollama)
        loadApiKeyFromEnv->>loadApiKeyFromEnv: skip warning log
    end
    loadApiKeyFromEnv-->>User: return undefined (no error)
```
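The provider-selection branch in the diagram can be sketched as follows. The map contents and error message are illustrative assumptions, not Stagehand's actual tables; only the branching logic mirrors the diagram.

```typescript
// Sketch of the lookup branch from the sequence diagram above.
type ProviderFactory = (model: string) => { provider: string; model: string };

// Default providers (no API key required). Contents are illustrative.
const AISDKProviders: Record<string, ProviderFactory> = {
  ollama: (model) => ({ provider: "ollama", model }),
};

// Providers that accept an API key. Ollama is intentionally absent,
// which is why meaningful clientOptions used to throw for it.
const AISDKProvidersWithAPIKey: Record<string, ProviderFactory> = {};

function getLanguageModel(
  subProvider: string,
  model: string,
  clientOptions?: Record<string, unknown>,
) {
  // NEW check: undefined values no longer count as meaningful.
  const hasMeaningfulValues =
    clientOptions !== undefined &&
    Object.values(clientOptions).some((v) => v !== undefined);
  const table = hasMeaningfulValues ? AISDKProvidersWithAPIKey : AISDKProviders;
  const factory = table[subProvider];
  if (!factory) {
    throw new Error(`Unsupported AI SDK model provider: ${subProvider}`);
  }
  return factory(model);
}
```

Under this sketch, `getLanguageModel("ollama", "qwen3:1.7b", { apiKey: undefined })` now takes the default-provider branch instead of throwing.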
Tests verify that getAISDKLanguageModel correctly handles:

- ollama without clientOptions
- ollama with empty clientOptions
- ollama with undefined/null apiKey values
This ensures ollama works even when users mistakenly provide an API key or want to set a custom baseURL for a remote ollama server.
Force-pushed from cb197ac to d3001d5.
Summary
Ollama models now work with the AI SDK provider format (e.g., `ollama/qwen3:1.7b`). Previously, using ollama would fail with `UnsupportedAISDKModelProviderError`. Closes #1164

Changes

- Filter out undefined values when checking if `clientOptions` has meaningful data in `getAISDKLanguageModel`
- Skip API-key warnings in `loadApiKeyFromEnv` for providers that intentionally don't need keys

Testing

Tested with `model: "ollama/qwen3:1.7b"` and verified Stagehand initializes successfully without errors.

Summary by cubic
Enable Ollama models to work with the AI SDK provider format (e.g., ollama/qwen3:1.7b). Fixes initialization errors and removes unnecessary API key warnings.
Written for commit d3001d5. Summary will update on new commits.