
llama: -fa 1/0/-1 aliases for -fa on/off/auto (#15746)

Merged
JohannesGaessler merged 1 commit into ggml-org:master from JohannesGaessler:llama-fa-alias
Sep 2, 2025

Conversation

@JohannesGaessler
Contributor

Out of habit from llama-bench I'm trying to set FlashAttention via e.g. -fa 0. This PR adds numeric aliases for on/off/auto.
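The change maps the numeric spellings onto the existing on/off/auto modes during argument parsing. A minimal sketch of that mapping, assuming illustrative names (`fa_mode`, `parse_fa`) rather than the actual identifiers in common/arg.cpp:

```cpp
#include <stdexcept>
#include <string>

// Sketch of the alias mapping this PR describes: "1", "0", and "-1"
// become synonyms for on, off, and auto respectively. The enum and
// function names here are hypothetical, not llama.cpp's actual API.
enum class fa_mode { OFF, ON, AUTO };

fa_mode parse_fa(const std::string & value) {
    if (value == "on"   || value == "enabled"  || value == "1") {
        return fa_mode::ON;
    }
    if (value == "off"  || value == "disabled" || value == "0") {
        return fa_mode::OFF;
    }
    if (value == "auto" || value == "-1") {
        return fa_mode::AUTO;
    }
    throw std::invalid_argument("invalid value for -fa: " + value);
}
```

With this, `-fa 0` behaves like `-fa off`, matching the llama-bench habit the author describes.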

Comment thread common/arg.cpp Outdated
Member


Suggested change:

```diff
-if (value == "on" || value == "enabled" || "1") {
+if (value == "on" || value == "enabled" || value == "1") {
```

The bare literal `"1"` decays to a non-null pointer, which is always truthy, so the original condition matched every input. Same for the rest.
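The bug the review flags is easy to reproduce in isolation. A minimal demonstration (the helper names `buggy_check` and `fixed_check` are illustrative, not code from the PR):

```cpp
#include <string>

// A bare string literal in a boolean context converts to a non-null
// const char*, which is always true, so the buggy condition accepts
// every input.
bool buggy_check(const std::string & value) {
    return value == "on" || value == "enabled" || "1"; // always true
}

// The reviewer's suggested fix compares the value against "1" instead.
bool fixed_check(const std::string & value) {
    return value == "on" || value == "enabled" || value == "1";
}
```

Most compilers warn about this pattern (e.g. GCC/Clang's `-Waddress`-style diagnostics), which is how mistakes like this are usually caught.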

Contributor Author


Thank you.

@JohannesGaessler JohannesGaessler merged commit c466abe into ggml-org:master Sep 2, 2025
48 checks passed
walidbr pushed a commit to walidbr/llama.cpp that referenced this pull request Sep 7, 2025
Nexesenex added a commit to Nexesenex/croco.cpp that referenced this pull request Oct 7, 2025
Nexesenex added a commit to Nexesenex/croco.cpp that referenced this pull request Oct 26, 2025
Seunghhon pushed a commit to Seunghhon/llama.cpp that referenced this pull request Apr 26, 2026

2 participants