I'm running into an issue with my provider, a LiteLLM proxy. I realize this is a somewhat niche setup, sorry 😬. My config is below.
Is this something I can resolve client side?
AI_APICallError: litellm.UnsupportedParamsError: Anthropic doesn't support tool calling without tools= param specified. Pass tools= param OR set litellm.modify_params = True // litellm_settings::modify_params: True to add dummy tool to the request.
Dummy session with the error (error not shown in web UI for some reason?): https://opencode.ai/s/SWAsOXyz
My config (trailing commas removed so it's valid JSON):
{
  "$schema": "https://opencode.ai/config.json",
  "provider": {
    "my-litellm-provider": {
      "name": "My LiteLLM Provider",
      "npm": "@ai-sdk/openai-compatible",
      "options": {
        "baseURL": "https://my-provider.com",
        "apiKey": "...",
        "includeUsage": true
      },
      "models": {
        "claude-4-5-sonnet": {
          "name": "Anthropic Claude 4.5 Sonnet",
          "release_date": "2025-09-29",
          "attachment": true,
          "reasoning": true,
          "temperature": true,
          "tool_call": true,
          "limit": {
            "context": 200000,
            "output": 64000
          },
          "options": {
            "thinking": {
              "type": "enabled",
              "budget_tokens": 16000
            }
          }
        }
      }
    }
  }
}
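For what it's worth, the error message itself names a server-side workaround: enabling modify_params in the LiteLLM proxy settings so LiteLLM adds a dummy tool when no tools= param is present. A minimal sketch of that, assuming a standard LiteLLM proxy config.yaml (I'd still prefer a client-side fix if one exists):

```yaml
# LiteLLM proxy config.yaml (server side) — sketch based on the
# workaround quoted in the error message. With modify_params enabled,
# LiteLLM injects a dummy tool into requests that have no tools= param,
# which Anthropic otherwise rejects.
litellm_settings:
  modify_params: true
```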