This repository was archived by the owner on Nov 10, 2025. It is now read-only.
Fix MDXSearchTool to work with local Ollama models without OpenAI API key #471
Summary
Fixes issue #3622, where MDXSearchTool required an OpenAI API key even when configured to use local Ollama models for both the LLM and the embedder. The tool now works entirely offline with local models.
Key Changes:
- Fixed `_parse_config` in `RagTool` to properly handle configurations with both `llm` and `embedder` specified as non-OpenAI providers
- Added direct `OllamaEmbeddingFunction` instantiation for Ollama embeddings to bypass OpenAI validation
- Added `ollama>=0.6.0` dependency to support local embedding functions

Root Cause: The config parsing logic wasn't properly extracting embedding configuration when both LLM and embedder were configured as local providers, causing it to fall back to OpenAI validation.
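The root cause can be illustrated with a minimal, self-contained sketch of the intended dispatch. This is not the actual `_parse_config` implementation; the function name `select_embedder` and its return values are hypothetical, standing in for the real control flow:

```python
def select_embedder(config: dict) -> str:
    """Decide which embedding path a RAG tool config should take.

    Hypothetical sketch of the fixed behavior: when the `embedder`
    section names a non-OpenAI provider (e.g. "ollama"), use it
    directly instead of falling back to OpenAI validation.
    """
    embedder = config.get("embedder") or {}
    provider = embedder.get("provider", "openai")
    if provider == "ollama":
        return "ollama-local"   # instantiate a local embedding function
    return "openai-default"     # existing OpenAI flow (requires API key)


# A fully local configuration should no longer hit the OpenAI path.
local_config = {
    "llm": {"provider": "ollama", "config": {"model": "llama3"}},
    "embedder": {"provider": "ollama", "config": {"model": "nomic-embed-text"}},
}
print(select_embedder(local_config))  # → ollama-local
```

Before the fix, the bug amounted to the local branch never being reached when both sections were set, so every configuration ended up on the `openai-default` path.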
Review & Testing Checklist for Human (3 items - Yellow Risk)
- Verify the environment variable mappings (`model` → `EMBEDDINGS_OLLAMA_MODEL_NAME`, `url` → `EMBEDDINGS_OLLAMA_URL`) match what the CrewAI RAG system actually expects

Test Plan Recommendation
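The mapping named in the checklist item can be sketched as a small lookup table. This only restates the two mappings above; whether CrewAI's RAG system actually reads these exact environment variables is the open question the checklist asks a human to verify, and `export_embedder_env` is a hypothetical helper, not PR code:

```python
import os

# Config-key → environment-variable mapping, taken from the checklist item.
OLLAMA_EMBEDDING_ENV_MAP = {
    "model": "EMBEDDINGS_OLLAMA_MODEL_NAME",
    "url": "EMBEDDINGS_OLLAMA_URL",
}


def export_embedder_env(embedder_config: dict) -> None:
    """Copy recognized embedder config keys into the environment."""
    for key, env_var in OLLAMA_EMBEDDING_ENV_MAP.items():
        if key in embedder_config:
            os.environ[env_var] = str(embedder_config[key])


export_embedder_env({"model": "nomic-embed-text", "url": "http://localhost:11434"})
print(os.environ["EMBEDDINGS_OLLAMA_MODEL_NAME"])  # → nomic-embed-text
```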
Notes
Enable MDXSearchTool to run fully offline with a local Ollama LLM and embeddings by updating config parsing and adding tests, plus an `ollama` dependency.

- Update `crewai_tools/tools/rag/rag_tool.py` config parsing to support non-OpenAI `llm` and `embedder` providers.
- Instantiate `OllamaEmbeddingFunction` directly for local embeddings; preserve the existing OpenAI flow.
- Add `tests/tools/test_mdx_search_tool_local_config.py` covering local Ollama configurations for LLM and embeddings.
- Add `ollama>=0.6.0` in `pyproject.toml`.

Written by Cursor Bugbot for commit 36a2f46. This will update automatically on new commits.
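Putting the pieces together, a fully local configuration of the kind this PR targets would look roughly like the following. This is a sketch under assumptions: the nested `llm`/`embedder` config schema follows the structure described in this PR, and the model names and URL are illustrative placeholders, not values from the change itself:

```python
# Fully local MDXSearchTool configuration, as described in this PR.
# Model names and the URL are placeholders; a running Ollama server
# is still required for the tool to function.
local_tool_config = {
    "llm": {
        "provider": "ollama",
        "config": {"model": "llama3"},
    },
    "embedder": {
        "provider": "ollama",
        "config": {
            "model": "nomic-embed-text",
            "url": "http://localhost:11434",
        },
    },
}

# Neither section names OpenAI, so with the fix applied no
# OPENAI_API_KEY should be needed anywhere in the pipeline.
assert all(
    section["provider"] == "ollama" for section in local_tool_config.values()
)
```

In actual use this dict would be passed to the tool, e.g. `MDXSearchTool(config=local_tool_config)`, which is the scenario the new test file is meant to cover.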