Conversation

@ckl117 (Collaborator) commented Nov 7, 2025

Motivation

Mapping max_logprobs=-1 to vocab_size causes the tokenizer to decode illegal tokens (token IDs that exceed the tokenizer length), since the model's vocab_size can be larger than the tokenizer's vocabulary.
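For context, a minimal sketch of the failure mode; the names and sizes below are assumed for illustration and are not taken from this PR:

```python
# Hypothetical sizes for illustration only: the model vocabulary is often
# padded past the tokenizer's true vocabulary (e.g. for alignment).
vocab_size = 32064      # padded model vocabulary (assumed)
ori_vocab_size = 32000  # tokenizer's original vocabulary (assumed)

# Token IDs in [ori_vocab_size, vocab_size) have logits but no tokenizer
# entry, so ranking logprobs over vocab_size can surface IDs that the
# tokenizer cannot decode.
illegal_ids = list(range(ori_vocab_size, vocab_size))
```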

Modifications

max_logprobs=-1 now maps to ori_vocab_size instead of vocab_size.
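A minimal sketch of the intended mapping, assuming an illustrative helper name (resolve_max_logprobs); this is not the PR's actual code:

```python
def resolve_max_logprobs(max_logprobs: int, ori_vocab_size: int) -> int:
    """Map the sentinel -1 to the tokenizer's original vocabulary size.

    Hypothetical helper: capping at ori_vocab_size keeps every returned
    logprob token ID decodable by the tokenizer.
    """
    if max_logprobs == -1:
        return ori_vocab_size
    return max_logprobs
```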

Usage or Command

Unchanged; max_logprobs=-1 now maps to ori_vocab_size.

Accuracy Tests

max_logprobs=-1 maps to ori_vocab_size.

Checklist

  • Add at least one tag in the PR title.
    • Tag list: [[FDConfig],[APIServer],[Engine], [Scheduler], [PD Disaggregation], [Executor], [Graph Optimization], [Speculative Decoding], [RL], [Models], [Quantization], [Loader], [OP], [KVCache], [DataProcessor], [BugFix], [Docs], [CI], [Optimization], [Feature], [Benchmark], [Others], [XPU], [HPU], [GCU], [DCU], [Iluvatar], [Metax]]
    • You can add new tags based on the PR content, but the semantics must be clear.
  • Format your code and run pre-commit before committing.
  • Add unit tests; if none are added, please explain why in this PR.
  • Provide accuracy results.
  • If this PR targets a release branch, make sure it has already been submitted to the develop branch, then cherry-pick it to the release branch with the [Cherry-Pick] PR tag.

@paddle-bot bot commented Nov 7, 2025

Thanks for your contribution!

@ckl117 ckl117 changed the title [Others] max_lgprobes=-1 maps to ori_vobal_size [Others] max_lgprobes=-1 maps to ori_vocab_size Nov 7, 2025
@ckl117 ckl117 changed the title [Others] max_lgprobes=-1 maps to ori_vocab_size [BugFix] max_lgprobes=-1 maps to ori_vocab_size Nov 7, 2025
@Jiang-Jia-Jun Jiang-Jia-Jun merged commit 80aedb8 into PaddlePaddle:develop Nov 7, 2025
13 of 15 checks passed
juncaipeng pushed a commit to juncaipeng/FastDeploy that referenced this pull request Nov 10, 2025
* -1 ori_vobal_size

* check

* check

* check

* revert config.py