server : do not create checkpoints right after mtmd chunks#20232

Merged
ggerganov merged 1 commit into master from gg/server-fix-mtmd-ckpt
Mar 8, 2026
Conversation

Member

@ggerganov ggerganov commented Mar 8, 2026

fix #20222

The checkpoint logic requires at least one text token to be present at the end of the checkpoint.

@ggerganov ggerganov marked this pull request as ready for review March 8, 2026 20:16
@ggerganov ggerganov requested a review from ngxson as a code owner March 8, 2026 20:16
@ggerganov ggerganov merged commit d417bc4 into master Mar 8, 2026
76 of 78 checks passed
@ggerganov ggerganov deleted the gg/server-fix-mtmd-ckpt branch March 8, 2026 20:16
bartowski1182 pushed a commit to bartowski1182/llama.cpp that referenced this pull request Mar 10, 2026
Ethan-a2 pushed a commit to Ethan-a2/llama.cpp that referenced this pull request Mar 20, 2026
Seunghhon pushed a commit to Seunghhon/llama.cpp that referenced this pull request Apr 26, 2026
rsenthilkumar6 pushed a commit to rsenthilkumar6/llama.cpp that referenced this pull request May 1, 2026

Development

Successfully merging this pull request may close these issues.

Eval bug: "Chunk not found" crash with hybrid attention models (Qwen3.5) under parallel load