FIX Bug with PeftConfig.from_pretrained #2397
Merged
BenjaminBossan merged 1 commit into huggingface:main on Mar 4, 2025
Conversation
In huggingface#2038, we added a change to PEFT to make PEFT configs forward compatible. To recap: when we add a new config value, say `foo`, to `LoraConfig`, users of older PEFT versions would normally get an error when trying to load it, because `LoraConfig` would not accept a `foo` argument. Now, we remove the unknown argument and just give a warning.

In general, this worked well, but there was a bug when using `PeftConfig.from_pretrained` instead of the more specific `LoraConfig.from_pretrained` etc. In that case, we would check the known arguments of the `PeftConfig` type, which are only a few. This means that we would ignore parameters like the rank `r` for LoRA.

With this PR, that bug is fixed. Since we know the specific PEFT config, we can use it instead of the `PeftConfig` super type to determine the unknown parameters. Therefore, `PeftConfig.from_pretrained` works the same as `LoraConfig.from_pretrained` etc.

Note that when a user calls `PeftModel.from_pretrained`, it uses the more specific PEFT config under the hood, i.e. `LoraConfig` etc., so the described bug would not occur there. It is thus very unlikely that this bug affected many (or any) users in the wild.
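The gist of the fix can be illustrated with a small sketch. This is not PEFT's actual implementation; the class and function names below are invented for illustration, and only the idea (resolving the concrete config class before filtering unknown keys) mirrors what the PR does:

```python
# Minimal sketch of the idea behind the fix -- not PEFT's actual code.
from dataclasses import dataclass, fields
import warnings

@dataclass
class PeftConfigBase:  # stand-in for peft.PeftConfig
    peft_type: str = ""
    task_type: str = ""

@dataclass
class LoraConfigSketch(PeftConfigBase):  # stand-in for peft.LoraConfig
    r: int = 8
    lora_alpha: int = 8

CONFIG_MAPPING = {"LORA": LoraConfigSketch}  # stand-in for PEFT's config registry

def from_pretrained_sketch(loaded: dict):
    # Resolve the concrete config class from the saved peft_type ...
    config_cls = CONFIG_MAPPING[loaded["peft_type"]]
    # ... and determine the known arguments from *that* class. Before the
    # fix, the known arguments were taken from the base class, so
    # LoRA-specific keys like "r" were silently dropped.
    known = {f.name for f in fields(config_cls)}
    unknown = sorted(k for k in loaded if k not in known)
    if unknown:
        # Forward compatibility: warn about keys from newer PEFT versions.
        warnings.warn(f"Ignoring unknown config keys: {unknown}")
    return config_cls(**{k: v for k, v in loaded.items() if k in known})

cfg = from_pretrained_sketch({"peft_type": "LORA", "r": 16, "arg_from_the_future": 1})
print(cfg.r)  # 16 -- the rank is no longer ignored
```

In user code, the practical consequence (per the PR description) is that both entry points now behave the same; the adapter path below is a placeholder:

```python
from peft import PeftConfig, LoraConfig

# "path/to/adapter" stands for a directory containing a saved LoRA adapter.
cfg_generic = PeftConfig.from_pretrained("path/to/adapter")
cfg_specific = LoraConfig.from_pretrained("path/to/adapter")
assert cfg_generic.r == cfg_specific.r  # the LoRA rank survives both paths
```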
The docs for this PR live here. All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update.
githubnemo approved these changes on Mar 4, 2025
Guy-Bilitski pushed a commit to Guy-Bilitski/peft that referenced this pull request on May 13, 2025
cyyever pushed a commit to cyyever/peft that referenced this pull request on Sep 4, 2025
* fix slow CI
* fix dpo
* formatting
* Apply suggestions from code review
* `setup_chat_format` may add a pad token

Co-authored-by: Quentin Gallouédec <45557362+qgallouedec@users.noreply.github.com>
Co-authored-by: Quentin Gallouédec <quentin.gallouedec@huggingface.co>