
server: add --log-output option #19807

Closed
tarruda wants to merge 1 commit into ggml-org:master from tarruda:implement-log-output-option

Conversation


@tarruda tarruda commented Feb 22, 2026

This option enables logging of the LLM output to stdout, which makes it more convenient to inspect what is being generated on the server side. I'm aware of --verbose, but it mixes the LLM output with a lot of other stats.

Here's an example of it being used with lm-evaluation-harness:

[Screenshot: llama-server console showing LLM output logged while running lm-evaluation-harness]

Contributor

ngxson commented Feb 22, 2026

IMO this flag serves too narrow a use case, and I don't like adding it. We already have too many rarely-used / poorly-documented arguments.

This is better implemented in a proxy sitting between llama-server and your application, which gives you total control over what gets logged.
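As a rough illustration of this suggestion, here is a minimal logging proxy sketch using only the Python standard library. It assumes llama-server is listening on 127.0.0.1:8080 and the OpenAI-compatible streaming format (SSE `data:` lines with text under `choices[0].delta.content`); the port numbers and error handling are placeholder choices, not anything from this PR.

```python
# Hypothetical logging proxy between a client and llama-server (sketch).
# Assumption: llama-server listens on 127.0.0.1:8080; the proxy on 127.0.0.1:8081.
# Requests are forwarded unchanged; generated text from streamed responses is
# echoed to stdout so it can be inspected on the server side.
import json
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

UPSTREAM = "http://127.0.0.1:8080"  # llama-server address (assumption)


def extract_text(sse_line: bytes) -> str:
    """Pull the generated text out of one SSE 'data: {...}' line, if any."""
    if not sse_line.startswith(b"data: "):
        return ""
    payload = sse_line[len(b"data: "):].strip()
    if payload == b"[DONE]":
        return ""
    try:
        obj = json.loads(payload)
    except ValueError:
        return ""
    # OpenAI-compatible streams carry text under choices[0].delta.content.
    for choice in obj.get("choices", []):
        text = (choice.get("delta") or {}).get("content")
        if text:
            return text
    return ""


class LoggingProxy(BaseHTTPRequestHandler):
    def do_POST(self):
        body = self.rfile.read(int(self.headers.get("Content-Length", 0)))
        req = urllib.request.Request(
            UPSTREAM + self.path, data=body,
            headers={"Content-Type": "application/json"})
        with urllib.request.urlopen(req) as upstream:
            self.send_response(upstream.status)
            self.send_header("Content-Type",
                             upstream.headers.get("Content-Type", ""))
            self.end_headers()
            for line in upstream:  # relay the stream line by line
                self.wfile.write(line)
                print(extract_text(line), end="", flush=True)  # log LLM output


if __name__ == "__main__":
    HTTPServer(("127.0.0.1", 8081), LoggingProxy).serve_forever()
```

Pointing the application (e.g. lm-evaluation-harness) at port 8081 instead of 8080 would then print the generated text without touching llama-server itself.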


2 participants