Describe the bug
When using the `custom` provider with a local API like Ollama that requires no authentication, the plugin refuses to send any request with the error:
`No API key or provider configuration.`
Root cause
In `AIAssistantService.qml`, `buildCurlCommand()` returns `null` unconditionally when no API key is resolved:
```qml
function buildCurlCommand(payload) {
    const key = resolveApiKey();
    if (!key)
        return null; // blocks keyless providers
    // ...
}
```
Additionally, `resolveApiKey()` hardcodes an empty fallback for `custom` in `commonEnv()`, so there is no way to satisfy the check without entering a dummy key.
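The failing interaction can be sketched in plain JavaScript (a hypothetical model, not the plugin's actual code: the real QML functions take no such arguments and read plugin settings, and the empty-string fallback is assumed from the `commonEnv()` description above):

```javascript
// Sketch of the reported logic. The hardcoded "" fallback for "custom"
// means the truthiness check below can never pass without a dummy key.
function resolveApiKey(provider, env) {
    const fallbacks = { custom: "" };              // hardcoded empty fallback
    return env[provider] || fallbacks[provider] || "";
}

function buildCurlCommand(provider, env) {
    const key = resolveApiKey(provider, env);
    if (!key)
        return null;                               // blocks keyless providers
    return ["curl", "-H", "Authorization: Bearer " + key];
}

console.log(buildCurlCommand("custom", {}));       // null → triggers the error
```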
Steps to reproduce
- Set provider to `custom`
- Set Base URL to `http://localhost:11434` (Ollama)
- Leave API Key empty
- Send any message
- Error: `No API key or provider configuration.`
Expected behavior
The custom provider should be able to make requests without an API key, since many local inference servers (Ollama, LM Studio, llama.cpp, etc.) do not require authentication.
Suggested fix
In `AIAssistantService.qml`, change:

```qml
if (!key)
    return null;
```

To:

```qml
if (!key && provider !== "custom")
    return null;
```
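With the guard relaxed, the `Authorization` header can simply be omitted when no key is present. A minimal sketch in plain JavaScript (the function name, arguments, and header layout are assumptions for illustration, not the plugin's actual code):

```javascript
// Sketch: build the curl argument list, attaching the Authorization
// header only when a key was resolved. Keyless "custom" providers
// proceed; keyed providers keep the original behavior.
function buildCurlArgs(provider, key, url, payload) {
    if (!key && provider !== "custom")
        return null;                             // keyed providers still need a key
    const args = ["-sS", "-X", "POST", url,
                  "-H", "Content-Type: application/json"];
    if (key)
        args.push("-H", "Authorization: Bearer " + key);
    args.push("-d", JSON.stringify(payload));
    return args;
}
```

Local servers such as Ollama then accept the request without the header, while providers that do require a key are still rejected early.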