
FIX Adoption prompt: New way to obtain position embeddings #2276

Merged
BenjaminBossan merged 1 commit into huggingface:main from BenjaminBossan:fix-adoption-prompt-rotary-emb
Dec 13, 2024

Conversation

@BenjaminBossan
Member

This PR resolves the adoption prompt tests that were failing in the CI run that installs transformers from source.

In this transformers PR:

huggingface/transformers#34858

the module.rotary_emb attribute has been removed, which the adoption prompt implementation previously assumed was present. Instead, the position_embeddings are now computed ahead of time and can be taken directly from the kwargs.
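For illustration, a minimal sketch of the pattern the fix follows (the helper name and exact kwargs handling here are assumptions for illustration, not PEFT's verbatim code):

```python
# Sketch: prefer the (cos, sin) tuple that newer transformers versions
# pass down in the attention kwargs; fall back to the per-attention
# rotary_emb module that older transformers versions still expose.

def get_position_embeddings(module, value_states, position_ids, **kwargs):
    if "position_embeddings" in kwargs:
        # transformers >= huggingface/transformers#34858: already computed
        cos, sin = kwargs["position_embeddings"]
    else:
        # older transformers: compute from the attention module's rotary_emb
        cos, sin = module.rotary_emb(value_states, position_ids)
    return cos, sin
```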

@HuggingFaceDocBuilderDev

The docs for this PR live here. All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update.

@BenjaminBossan
Member Author

@Cyrilvallez The refactoring of rotary embeddings broke a function in PEFT that calculates the query states. This PR should hopefully fix that. It would be great if you could check if my changes make sense.

@Cyrilvallez
Member

Makes sense to me! If needed, the rotary_emb module can be retrieved from the Model module, not the Attention module anymore!
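In other words (a sketch assuming the current Llama layout in transformers; the checkpoint name is only an example, and attribute paths can differ per architecture):

```python
from transformers import AutoModelForCausalLM

model = AutoModelForCausalLM.from_pretrained("meta-llama/Llama-2-7b-hf")

# After huggingface/transformers#34858 the rotary embedding lives once on
# the base model instead of once per attention layer:
rotary_emb = model.model.rotary_emb
# The old per-layer attribute is gone:
# model.model.layers[0].self_attn.rotary_emb  # AttributeError on new versions
```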

@Cyrilvallez
Member

But cos and sin are guaranteed to be present during the model's forward anyway 👌
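That is, any code running during the model's forward can unpack the tuple from kwargs directly (a sketch; apply_rotary_pos_emb is transformers' Llama helper, while the wrapper function itself is hypothetical):

```python
from transformers.models.llama.modeling_llama import apply_rotary_pos_emb

def rotate_query_states(query_states, key_states, **kwargs):
    # Guaranteed to be present in the attention kwargs during forward.
    cos, sin = kwargs["position_embeddings"]
    query_states, key_states = apply_rotary_pos_emb(
        query_states, key_states, cos, sin
    )
    return query_states, key_states
```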

BenjaminBossan merged commit ae55fdc into huggingface:main on Dec 13, 2024
BenjaminBossan deleted the fix-adoption-prompt-rotary-emb branch on December 13, 2024 17:19
Guy-Bilitski pushed a commit to Guy-Bilitski/peft that referenced this pull request May 13, 2025
cyyever pushed a commit to cyyever/peft that referenced this pull request Sep 4, 2025
* Refactor reward processing in OnlineDPOTrainer
* Refactor completion decoding and reward processing
* remove strip
* remove warning
* Add reward_tokenizer to training script
* Add reward_tokenizer and reward_processing_class to OnlineDPOTrainer test
* propagate to xpo and nash
* style
* reduce memory requirement with inference_mode
* fix tests
* pairrm judge llmblender
* setUpClass(cls)
* Add setUpClass method to TestJudges class
* truncation left for reward tokenizer
* don't logcompletion without eval dataset
* only eval when possible