
fix: [Security Fix] remove litellm #1240

Merged
edwinjosechittilappilly merged 3 commits into release-0.4.0 from fix-litellm on Mar 25, 2026

Conversation

@edwinjosechittilappilly
Collaborator

Remove the `litellm` package from both Dockerfile.langflow and Dockerfile.langflow.dev to avoid conflicts/compatibility issues. In Dockerfile.langflow the `pip uninstall -y litellm` was added to the RUN that installs `uv` and prepares /app/langflow-data; in Dockerfile.langflow.dev a `RUN uv pip uninstall litellm` line was added after the dependency sync. This ensures built images do not include `litellm`.

ref: Approved PR https://github.com/langflow-ai/openrag/pull/1239/changes

Remove usage of the agentd library and its OpenAI patching/tool decorator. Instantiate AsyncOpenAI directly (HTTP/2 or HTTP/1.1 fallback) and remove imports of agentd.patch and agentd.tool_decorator. Add runtime dependencies for openai, pyyaml, and tiktoken in pyproject.toml to support direct OpenAI client usage.
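The HTTP/2-or-fallback client construction described above can be sketched as follows. This is an illustrative stand-in, not the PR's actual code: the `make_client` factory is hypothetical and replaces the real `AsyncOpenAI`/`httpx.AsyncClient` construction so the branch logic can be shown without the third-party dependencies.

```python
# Hedged sketch of the client-selection logic described in this PR.
# make_client is a hypothetical stand-in for AsyncOpenAI; in the real
# settings.py the fallback branch instead passes
# httpx.AsyncClient(http2=False, timeout=...) as http_client.
def build_async_client(use_http2: bool, make_client):
    if use_http2:
        # Default construction; HTTP/2 is negotiated by the transport.
        return make_client()
    # Explicit HTTP/1.1 fallback transport.
    return make_client(transport="http/1.1")

# Dummy factory that just records its keyword arguments:
print(build_async_client(True, lambda **kw: kw))   # {}
print(build_async_client(False, lambda **kw: kw))  # {'transport': 'http/1.1'}
```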
@github-actions github-actions Bot added backend 🔷 Issues related to backend services (OpenSearch, Langflow, APIs) docker labels Mar 24, 2026
@github-actions github-actions Bot added the bug 🔴 Something isn't working. label Mar 24, 2026
@github-actions github-actions Bot added the lgtm label Mar 24, 2026
Contributor

Copilot AI left a comment


Pull request overview

This PR removes the agentd integration and litellm from the dependency set / Docker images, and updates the runtime dependency list accordingly (notably adding openai, pyyaml, and tiktoken).

Changes:

  • Remove agentd imports/decorators and MCP patching integration.
  • Ensure litellm is uninstalled in Langflow Docker images.
  • Update dependency manifests (pyproject.toml, uv.lock) to reflect the new dependency set.

Reviewed changes

Copilot reviewed 5 out of 6 changed files in this pull request and generated 4 comments.

Summary per file:

  • Dockerfile.langflow: Uninstalls litellm from the base Langflow image layer.
  • Dockerfile.langflow.dev: Adds a uv pip uninstall litellm step after syncing deps.
  • pyproject.toml: Drops agentd and adds direct deps (openai, pyyaml, tiktoken).
  • src/config/settings.py: Removes patch_openai_with_mcp usage and instantiates AsyncOpenAI directly.
  • src/services/search_service.py: Removes the agentd tool decorator/import.
  • uv.lock: Updated to remove agentd/litellm (and related transitive deps) and include the new direct deps.


Comment thread Dockerfile.langflow.dev
  # Return to app directory and install the project
  WORKDIR /app
  RUN uv sync --frozen --no-dev --no-editable --extra postgresql
+ RUN uv pip uninstall litellm

Copilot AI Mar 24, 2026


uv pip uninstall litellm will run interactively and prompt for confirmation in a Docker build, which can hang/fail the build. Use the non-interactive flag (e.g., -y/--yes), and consider making the step tolerant when the package is already absent (e.g., avoid failing the layer if litellm is not installed).

Suggested change
- RUN uv pip uninstall litellm
+ RUN uv pip uninstall -y litellm || true

Comment thread src/config/settings.py
Comment on lines 538 to +546
  if use_http2:
-     self._patched_async_client = patch_openai_with_mcp(AsyncOpenAI())
+     self._patched_async_client = AsyncOpenAI()
      logger.info("OpenAI client initialized with HTTP/2")
  else:
      http_client = httpx.AsyncClient(
          http2=False,
          timeout=httpx.Timeout(60.0, connect=10.0)
      )
-     self._patched_async_client = patch_openai_with_mcp(
-         AsyncOpenAI(http_client=http_client)
-     )
+     self._patched_async_client = AsyncOpenAI(http_client=http_client)

Copilot AI Mar 24, 2026


After removing patch_openai_with_mcp(...), clients.patched_*_client now returns a vanilla AsyncOpenAI. Other parts of the codebase still format model names with provider prefixes (e.g., ollama/..., watsonx/...) expecting LiteLLM-style routing; those requests will fail against the OpenAI API. Either reintroduce a routing layer (or provider-specific clients) or update the embedding/search code to only send OpenAI-compatible model names here.

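One mitigation in the direction the comment suggests (sending only OpenAI-compatible model names) can be sketched as a small guard; the prefix list and function name below are hypothetical illustrations, not code from this PR.

```python
# Illustrative guard, not from this PR: with LiteLLM routing removed, reject
# provider-prefixed model names early instead of letting the OpenAI API
# return an opaque error. The prefix list is a hypothetical example.
ROUTED_PREFIXES = ("ollama/", "watsonx/")

def assert_openai_compatible(model: str) -> str:
    """Raise if the model name still carries a LiteLLM-style provider prefix."""
    for prefix in ROUTED_PREFIXES:
        if model.startswith(prefix):
            raise ValueError(
                f"{model!r} expects LiteLLM-style routing ({prefix!r}); "
                "a plain AsyncOpenAI client cannot serve it"
            )
    return model

print(assert_openai_compatible("gpt-4o-mini"))  # gpt-4o-mini
```

The alternative the reviewer names, reintroducing a routing layer or provider-specific clients, would keep the prefixed names working instead of rejecting them.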
Comment thread src/config/settings.py
Comment on lines +539 to +546
+     self._patched_async_client = AsyncOpenAI()
      logger.info("OpenAI client initialized with HTTP/2")
  else:
      http_client = httpx.AsyncClient(
          http2=False,
          timeout=httpx.Timeout(60.0, connect=10.0)
      )
-     self._patched_async_client = patch_openai_with_mcp(
-         AsyncOpenAI(http_client=http_client)
-     )
+     self._patched_async_client = AsyncOpenAI(http_client=http_client)

Copilot AI Mar 24, 2026


The surrounding docstring/comments still describe this client as “patched with LiteLLM support” and mention LiteLLM routing, but the implementation now constructs a plain AsyncOpenAI. Please update the docstring/comments (and potentially the patched_* naming) so they match the new behavior to avoid misleading future changes.

Comment thread pyproject.toml
Comment on lines +44 to +47
"zxcvbn>=4.5.0",
"openai>=1.0.0",
"pyyaml>=6.0",
"tiktoken>=0.7.0",

Copilot AI Mar 24, 2026


PR description focuses on removing litellm from Docker images, but this PR also removes agentd usage and adds new runtime dependencies (openai, pyyaml, tiktoken). Please update the PR description to reflect these additional changes so reviewers understand the full scope and impact.

@edwinjosechittilappilly edwinjosechittilappilly merged commit 54809a8 into release-0.4.0 Mar 25, 2026
14 checks passed
@github-actions github-actions Bot deleted the fix-litellm branch March 25, 2026 17:10
