🚨 Remove Group Beam Search decoding strategy #40495
manueldeprada merged 6 commits into huggingface:main from
Conversation
```python
"exponential_decay_length_penalty": None,
"suppress_tokens": None,
"begin_suppress_tokens": None,
# Deprecated arguments (moved to the Hub). TODO joao, manuel: remove in v4.62.0
```
Note that if we remove this, initializing config arguments on the model will stop working, as in
transformers/tests/generation/test_utils.py
Lines 2837 to 2848 in 52aaa3f
```python
"hf-internal-testing/tiny-random-bart",
eos_token_id=None,
)
generation_config = GenerationConfig(
```
I moved the generation args to a GenerationConfig instead of the model config; see the previous comment.
The docs for this PR live here. All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update.
[For maintainers] Suggested jobs to run (before merge): run-slow: csm, dia, janus, musicgen, musicgen_melody, recurrent_gemma, rwkv
build-doc
gante left a comment:
(from a quick glance, the changes are the same as in the CS PR, which I already approved)
Removes the Group Beam Search generation strategy from the codebase and directs users to the transformers-community/group-beam-search repository. It has been emitting a deprecation warning for a few releases; now trust_remote_code=True is required to run group beam search. Depends on #40480
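For context on what is being moved out: group (diverse) beam search splits the beams into groups and, within a decoding step, penalizes tokens that earlier groups have already picked, so the groups produce dissimilar continuations. Below is a minimal self-contained sketch of that idea, not the transformers implementation; `toy_logprobs` is a made-up stand-in for a real model's next-token distribution, and all parameter names are illustrative.

```python
from collections import Counter


def toy_logprobs(seq, vocab_size):
    # Hypothetical toy "model": prefers the token one above the last token.
    last = seq[-1]
    return [-abs((last + 1) - v) for v in range(vocab_size)]


def group_beam_search(start, steps, vocab_size=6, num_groups=2,
                      beams_per_group=2, diversity_penalty=1.5):
    # Each group starts from the same single-token prefix.
    groups = [[(0.0, [start])] for _ in range(num_groups)]
    for _ in range(steps):
        chosen_this_step = Counter()  # tokens picked by earlier groups at this step
        new_groups = []
        for group in groups:
            candidates = []
            for score, seq in group:
                logps = toy_logprobs(seq, vocab_size)
                for v in range(vocab_size):
                    # Diversity penalty: discourage tokens already chosen
                    # by previously processed groups in this step.
                    penalty = diversity_penalty * chosen_this_step[v]
                    candidates.append((score + logps[v] - penalty, seq + [v]))
            candidates.sort(key=lambda c: -c[0])
            best = candidates[:beams_per_group]
            for _, seq in best:
                chosen_this_step[seq[-1]] += 1
            new_groups.append(best)
        groups = new_groups
    # Flatten: beams of all groups, group by group.
    return [seq for group in groups for _, seq in group]
```

With the default toy settings, the top beam of the second group picks a different first token than the top beam of the first group, because the penalty makes the greedy choice less attractive once another group has claimed it. In the library, the equivalent behavior is now hosted on the Hub and loaded with trust_remote_code=True rather than shipped in core.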