
Custom provider fails without API key (Ollama/local LLMs) #9

@Nosirus

Description


Describe the bug

When using the custom provider with a local API such as Ollama that requires no authentication, the plugin refuses to send any request and fails with the error:

No API key or provider configuration.

Root cause

In AIAssistantService.qml, buildCurlCommand() returns null unconditionally when no API key is resolved:

function buildCurlCommand(payload) {
    const key = resolveApiKey();
    if (!key)
        return null; // blocks keyless providers
    // ...
}
Additionally, resolveApiKey() hardcodes an empty fallback for the custom provider in commonEnv(), so there is no way to satisfy the check without entering a dummy key.
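A minimal sketch of the blocking logic described above (the function signatures and the commonEnv fallback table are simplified stand-ins for the real QML code, not copied from it):

```javascript
// Simplified reproduction: with the custom provider's env fallback
// hardcoded to "", the resolved key is always falsy when the user
// leaves the API Key field empty.
function resolveApiKey(provider, settingsKey, commonEnv) {
    return settingsKey || commonEnv[provider] || "";
}

function buildCurlCommand(provider) {
    const key = resolveApiKey(provider, "", { custom: "" });
    if (!key)
        return null; // every keyless custom setup is rejected here
    return ["curl", "..."];
}
```

With this logic, buildCurlCommand("custom") always returns null, which matches the observed error.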

Steps to reproduce

  1. Set provider to custom
  2. Set Base URL to http://localhost:11434 (Ollama)
  3. Leave API Key empty
  4. Send any message
  5. Error: No API key or provider configuration.

Expected behavior

The custom provider should be able to make requests without an API key, since many local inference servers (Ollama, LM Studio, llama.cpp, etc.) do not require authentication.

Suggested fix

In AIAssistantService.qml, change:

if (!key)
    return null;

To:

if (!key && provider !== "custom")
    return null;
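For illustration, here is a hedged sketch of what the full function could look like with that change, so the Authorization header is attached only when a key exists (the argument list, endpoint path, and curl flags are assumptions for the sketch, not the plugin's actual code):

```javascript
// Hypothetical keyless-aware buildCurlCommand; parameter names mirror
// the issue text rather than the real AIAssistantService.qml.
function buildCurlCommand(payload, provider, baseUrl, key) {
    // Only hard-require a key for providers that need authentication.
    if (!key && provider !== "custom")
        return null;

    const args = ["curl", "-sS", "-X", "POST",
                  baseUrl + "/v1/chat/completions",
                  "-H", "Content-Type: application/json",
                  "-d", JSON.stringify(payload)];

    // Attach the Authorization header only when a key is present, so
    // keyless local servers (Ollama, llama.cpp, LM Studio) still work.
    if (key)
        args.splice(5, 0, "-H", "Authorization: Bearer " + key);

    return args;
}
```

With this shape, a custom provider with an empty key still produces a valid request, while key-requiring providers keep the existing guard.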
