
Revert to OAI-compatible args #20213

Merged

pwilkin merged 2 commits into ggml-org:master from pwilkin:oai-compat-args
Mar 8, 2026

Conversation

pwilkin (Member) commented Mar 7, 2026

Reverts the output of function-call arguments to the OpenAI-compatible format, i.e. arguments serialized as a JSON string rather than a raw JSON object. Supersedes #20202
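For context, the OpenAI API returns function-call arguments as a JSON-encoded string, while the bug this PR reverts emitted them as a raw JSON object. A minimal Python sketch of the difference (example payload only, not actual llama.cpp output):

```python
import json

# Tool call with "arguments" as a raw JSON object -- the non-compatible
# shape this PR reverts away from (illustrative payload).
tool_call = {
    "id": "call_0",
    "type": "function",
    "function": {
        "name": "get_weather",
        "arguments": {"city": "Berlin", "unit": "celsius"},
    },
}

def to_oai_compatible(call: dict) -> dict:
    """Serialize function arguments to a JSON string, as the OpenAI API does."""
    fn = call["function"]
    args = fn["arguments"]
    if not isinstance(args, str):
        fn = {**fn, "arguments": json.dumps(args)}
    return {**call, "function": fn}

oai_call = to_oai_compatible(tool_call)
print(type(oai_call["function"]["arguments"]).__name__)  # str
print(oai_call["function"]["arguments"])
```

OpenAI-style clients call `json.loads` on `function.arguments` themselves, so a raw object breaks any client that assumes the documented string shape.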

@pwilkin pwilkin requested a review from aldehir March 7, 2026 20:12
pwilkin (Member, Author) commented Mar 7, 2026

Fixes #20198

@aldehir aldehir closed this Mar 7, 2026
@aldehir aldehir reopened this Mar 7, 2026
aldehir (Contributor) commented Mar 7, 2026

What the heck... My keyboard went crazy, probably because my cat walked over it earlier.

aldehir (Contributor) commented Mar 7, 2026

Need to apply workaround::func_args_not_string(), otherwise templates fail on tool_call.arguments | items.
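The failure mode here is that some chat templates iterate over the arguments with Jinja's items filter, which needs a mapping, not a string. A rough Python analogue of what such a workaround has to do (the real workaround::func_args_not_string lives in llama.cpp's C++ code; this is only a sketch):

```python
import json

def func_args_not_string(message: dict) -> dict:
    """Sketch of the workaround: templates that iterate over
    tool_call.arguments | items expect a mapping, so parse any
    string-encoded arguments back into a dict before rendering."""
    for call in message.get("tool_calls", []):
        args = call["function"]["arguments"]
        if isinstance(args, str):
            call["function"]["arguments"] = json.loads(args)
    return message

msg = {
    "role": "assistant",
    "tool_calls": [
        {"function": {"name": "get_weather",
                      "arguments": '{"city": "Berlin"}'}}
    ],
}
func_args_not_string(msg)
print(msg["tool_calls"][0]["function"]["arguments"])  # {'city': 'Berlin'}
```

So the string form goes out over the wire for OAI compatibility, but gets parsed back into an object just before template rendering.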

@github-actions github-actions Bot added the testing Everything test related label Mar 7, 2026
tarruda commented Mar 7, 2026

@pwilkin should I create a separate issue for #20198 (comment)? BTW, not having structured JSON output working means that tool calls are more brittle, since the LLM can potentially output tool calls that don't match the tool's parameter JSON schema.

pwilkin (Member, Author) commented Mar 8, 2026

@tarruda nah, schema for tools is enforced separately. Yeah, please make a separate issue for structured output.

pwilkin (Member, Author) commented Mar 8, 2026

@aldehir done

tarruda commented Mar 8, 2026

@pwilkin done: #20221

I ran git bisect and confirmed it was introduced by the Autoparser PR.

pwilkin (Member, Author) commented Mar 8, 2026

@tarruda yeah, I know, people already mentioned it in the autoparser thread and I made a mental note to open a PR for it but somehow forgot :/

@pwilkin pwilkin merged commit b283f6d into ggml-org:master Mar 8, 2026
76 of 78 checks passed
alvis233 commented Mar 9, 2026

Huge thanks to pwilkin and aldehir 🙏🏻

bartowski1182 pushed a commit to bartowski1182/llama.cpp that referenced this pull request Mar 10, 2026

* Revert to OAI-compatible args
* Apply workaround::func_args_not_string

Ethan-a2 pushed a commit to Ethan-a2/llama.cpp that referenced this pull request Mar 20, 2026

* Revert to OAI-compatible args
* Apply workaround::func_args_not_string

Seunghhon pushed a commit to Seunghhon/llama.cpp that referenced this pull request Apr 26, 2026

* Revert to OAI-compatible args
* Apply workaround::func_args_not_string

rsenthilkumar6 pushed a commit to rsenthilkumar6/llama.cpp that referenced this pull request May 1, 2026

* Revert to OAI-compatible args
* Apply workaround::func_args_not_string

Labels

testing Everything test related

Projects

None yet

Development

Successfully merging this pull request may close these issues.

Eval bug: llama-server tool_calls returns arguments as JSON object instead of string, breaking OpenAI compatibility

4 participants