Description
What happened?
The Ollama model is hardcoded to llama3.2 in multiple places (ask.py and tests), preventing users from specifying alternative Ollama models like llama3.1:8b, mistral, phi3, or other compatible models.
How to reproduce
- Set environment variable: export OLLAMA_MODEL=mistral
- Run: cortex ask "what is nginx?"
- Expected: Cortex uses the mistral model specified in OLLAMA_MODEL
- Actual: Cortex ignores the environment variable and falls back to the llama3.2 default hardcoded in ask.py (see the shell session below)
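For reference, the same reproduction as a shell session; the observed behavior is paraphrased from the description above:

```bash
# Try to point Cortex at an alternative Ollama model via the environment
export OLLAMA_MODEL=mistral
cortex ask "what is nginx?"

# Expected: the request is sent to the "mistral" model
# Actual:   the hardcoded "llama3.2" default from ask.py is used
```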
Error output
No error is thrown; the hardcoded model is silently used instead of the configured one.
Current hardcoded locations:
- ask.py - _default_model() returns a hardcoded "llama3.2"
- test_ask.py - test expects the hardcoded model
- test_ollama_integration.py - multiple test instances with hardcoded ollama_model="llama3.2"
Expected behavior
Read the model name from the OLLAMA_MODEL environment variable or a config file, allowing users to specify any Ollama model.
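A minimal sketch of what the fix could look like in ask.py, assuming _default_model() is the single place the default is resolved (per the locations listed above); the exact precedence and any config-file lookup are assumptions, not the project's actual implementation:

```python
import os


def _default_model() -> str:
    """Resolve the Ollama model name instead of hardcoding it.

    Sketch of the precedence: the OLLAMA_MODEL environment variable,
    falling back to the previous hardcoded default. A config-file
    lookup could slot in between these two.
    """
    return os.environ.get("OLLAMA_MODEL", "llama3.2")
```

The tests could then set or monkeypatch OLLAMA_MODEL rather than asserting the literal "llama3.2" string, so they keep passing whichever model the user configures.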