Description
Cortex attempts to connect to Ollama at http://localhost:11434/ even when no local LLM is installed or configured.
Expected Behavior
- If Ollama is not configured, Cortex should skip the connection attempt
- If a connection is attempted and fails, it should fail gracefully with a clear message, not an unexpected error
- Should proceed directly to configured cloud providers (OpenAI/Anthropic)
Actual Behavior
- Cortex tries localhost:11434 regardless of configuration
- This causes confusion for users who have cloud API keys configured but no local LLM
Steps to Reproduce
1. Configure an OpenAI or Anthropic API key
2. Do NOT install Ollama
3. Run any cortex command
4. Observe the connection attempt to localhost:11434
Suggested Fix
Check for an Ollama configuration before attempting a connection. If local LLM support is not explicitly enabled, skip detection entirely and proceed straight to the configured cloud providers, as sketched below.
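A minimal sketch of this guard, assuming a dict-style config with a `providers.ollama.enabled` flag; the names here (`detect_local_llm`, `OLLAMA_DEFAULT_URL`, the config shape) are illustrative, not Cortex's actual API:

```python
import requests

# Assumed default; Cortex currently hardcodes this endpoint.
OLLAMA_DEFAULT_URL = "http://localhost:11434/"

def detect_local_llm(config: dict) -> str | None:
    """Return the Ollama base URL only if it is explicitly enabled and reachable."""
    ollama_cfg = config.get("providers", {}).get("ollama", {})
    if not ollama_cfg.get("enabled", False):
        # Ollama was never configured: skip the probe entirely instead of
        # surfacing an unexpected connection error.
        return None
    base_url = ollama_cfg.get("base_url", OLLAMA_DEFAULT_URL)
    try:
        requests.get(base_url, timeout=1)
        return base_url
    except requests.ConnectionError:
        print(f"Ollama is enabled but unreachable at {base_url}; "
              "falling back to configured cloud providers.")
        return None
```

With a guard like this, a user who has only set an OpenAI or Anthropic key never triggers the localhost probe, and a user who enabled Ollama but forgot to start it gets a clear message instead of a raw connection error.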
Environment
- Cortex v0.1.0
- Ubuntu (Effiti team testing)
Reporter
Effiti team via Discord