fix(llm): Add LLM_STREAMING_MODE option for custom endpoints#1

Merged
trevormells merged 1 commit into trevormells:mmigrate_to_litellm from rothnic:fix/litellm-custom-endpoints
Mar 3, 2026

Conversation

@rothnic

@rothnic rothnic commented Mar 2, 2026

Adds an LLM_STREAMING_MODE environment variable to enable streaming mode for LLM proxies that require it. When enabled, LiteLLM is called with stream=True and the streamed chunks are accumulated into a plain text response.

I discovered the need for this while testing the LiteLLM migration against an OpenAI-compatible proxy (9router), which lets me use my ChatGPT Plus, Kimi for coding, and Copilot subscriptions through one endpoint with fallbacks.

@trevormells trevormells merged commit 6f6c844 into trevormells:mmigrate_to_litellm Mar 3, 2026