Bug: Cortex attempts Ollama connection when not configured #316

@mikejmorgan-ai

Description

Cortex attempts to connect to Ollama at http://localhost:11434/ even when no local LLM is installed or configured.

Expected Behavior

  • If Ollama is not configured, Cortex should skip the connection attempt
  • Should fail gracefully with a clear message, not an unexpected error
  • Should proceed directly to configured cloud providers (OpenAI/Anthropic)

Actual Behavior

  • Cortex tries localhost:11434 regardless of configuration
  • Causes confusion for users who have cloud API keys configured but no local LLM installed

Steps to Reproduce

  1. Configure an OpenAI or Anthropic API key
  2. Do NOT install Ollama (a quick port check is sketched after these steps)
  3. Run any cortex command
  4. Observe the connection attempt to localhost:11434
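
Before step 3, it helps to confirm that nothing is already listening on Ollama's default port, so any traffic to it must come from Cortex itself. A minimal check in plain Python (no Cortex internals assumed):

```python
# Confirm nothing is listening on Ollama's default port (11434)
# before running cortex.
import socket

with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
    s.settimeout(1)
    port_open = s.connect_ex(("localhost", 11434)) == 0

print("port 11434 is open (Ollama may be running)" if port_open
      else "port 11434 is closed (Ollama absent)")
```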

Suggested Fix

Check for an Ollama configuration before attempting a connection. If a local LLM is not explicitly enabled, skip Ollama detection entirely and go straight to the configured cloud providers.
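
A minimal sketch of that gate, assuming a dict-like config and the requests library; the names (select_provider, "ollama_enabled", the "*_api_key" keys) are illustrative, not Cortex's actual API:

```python
# Hypothetical sketch: probe Ollama only when explicitly enabled in config.
import requests

OLLAMA_URL = "http://localhost:11434/"

def select_provider(config: dict) -> str:
    if config.get("ollama_enabled", False):
        # Local LLM explicitly enabled: probe it, but fail gracefully.
        try:
            requests.get(OLLAMA_URL, timeout=2).raise_for_status()
            return "ollama"
        except requests.RequestException:
            print(f"Ollama enabled but unreachable at {OLLAMA_URL}; "
                  "falling back to cloud providers.")
    # No Ollama configured (or it was unreachable): go straight to cloud.
    for name in ("openai", "anthropic"):
        if config.get(f"{name}_api_key"):
            return name
    raise RuntimeError("No LLM provider configured.")
```

With a gate like this, users who configure only cloud keys never see traffic to localhost:11434, and users who enable Ollama get a clear message when it is unreachable.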

Environment

  • Cortex v0.1.0
  • Ubuntu (Effiti team testing)

Reporter

Effiti team via Discord

Labels

bug