
add use_streaming config to use chat completion streaming #41

Open

CasparChou wants to merge 1 commit into umputun:master from joinserve:feat/streaming-support

Conversation

@CasparChou CasparChou commented Apr 19, 2026

Changes Summary

📝 Overview

Adds a use_streaming configuration option to enable chat completion streaming mode. This is required for LLM providers (e.g., a ChatGPT subscription accessed via a litellm proxy) where the non-streaming API is unavailable.

🔧 Changes

Modified Files: 3 | Additions: +152 | Deletions: -3

1. pkg/config/config.go

  • Added UseStreaming boolean field to LLMConfig (default: false)

2. pkg/llm/classifier.go

  • New createChatCompletion() method: routes to streaming or non-streaming based on config
  • Streaming mode: accumulates streamed chunks into standard API response format
  • Updated the 3 existing API call sites to use the new method

3. pkg/llm/classifier_test.go

  • Added TestClassifier_Streaming() - validates streaming functionality
  • Added TestClassifier_Streaming_DefaultOff() - ensures backward compatibility (disabled by default)

@CasparChou CasparChou closed this Apr 19, 2026
@CasparChou
Author

hi @umputun, I accidentally set the wrong base branch when opening the PR after forking. I also wanted to check whether you're interested in this feature: it allows using a litellm proxy to connect to the ChatGPT subscription plan, a cost-saving approach that won't incur additional token-based fees.

@CasparChou CasparChou reopened this Apr 22, 2026