
FIX Bug with PeftConfig.from_pretrained#2397

Merged
BenjaminBossan merged 1 commit into huggingface:main from
BenjaminBossan:fix-loading-from-peft-config-forward-compatibility
Mar 4, 2025

Conversation

@BenjaminBossan
Member

In #2038, we added a change to PEFT to make PEFT configs forward compatible. To recap: when we add a new config value, say foo, to LoraConfig, users of older PEFT versions would normally get an error when trying to load such a config, because their LoraConfig does not accept a foo argument. Now, we remove the unknown argument and only give a warning.
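
To illustrate the mechanism, here is a minimal sketch (not PEFT's actual implementation; the helper name filter_unknown_kwargs is made up for illustration) of dropping unknown config keys with a warning instead of raising a TypeError:

```python
import dataclasses
import warnings


def filter_unknown_kwargs(config_cls, kwargs):
    """Drop keys that the locally installed config class does not define, with a warning."""
    known = {f.name for f in dataclasses.fields(config_cls)}
    unknown = sorted(set(kwargs) - known)
    if unknown:
        warnings.warn(f"Ignoring unknown arguments for {config_cls.__name__}: {unknown}")
    return {k: v for k, v in kwargs.items() if k in known}
```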

In general, this worked well, but there was a bug when using PeftConfig.from_pretrained instead of the more specific LoraConfig.from_pretrained etc. In that case, we would check the known arguments against the generic PeftConfig type, which has only a few fields. As a result, we would silently drop parameters like the rank r for LoRA.

With this PR, that bug is fixed. Since we know the specific PEFT config class, we can use it instead of the PeftConfig super type to determine which parameters are unknown. Therefore, PeftConfig.from_pretrained will work the same as LoraConfig.from_pretrained etc.
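
As a quick sanity check of the intended behavior (a hedged example; "path/to/lora-adapter" is a placeholder for a real saved LoRA adapter directory or Hub repo):

```python
from peft import LoraConfig, PeftConfig

# With the fix, the generic entry point should resolve to the same concrete
# LoraConfig and keep LoRA-specific fields such as the rank r.
cfg_generic = PeftConfig.from_pretrained("path/to/lora-adapter")
cfg_specific = LoraConfig.from_pretrained("path/to/lora-adapter")
assert isinstance(cfg_generic, LoraConfig)
assert cfg_generic.r == cfg_specific.r
```

Internally, the concrete config class is resolved from the peft_type field stored in the saved adapter config, so the unknown-argument check runs against the right set of fields.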

Note that when a user calls PeftModel.from_pretrained, it uses the more specific PEFT config, i.e. LoraConfig etc., under the hood. Therefore, the described bug would not occur there. It is thus very unlikely that this bug affected many (or any) users in the wild.

@HuggingFaceDocBuilderDev

The docs for this PR live here. All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update.

BenjaminBossan merged commit d5f5e35 into huggingface:main on Mar 4, 2025
BenjaminBossan deleted the fix-loading-from-peft-config-forward-compatibility branch on March 4, 2025, 16:16
Guy-Bilitski pushed a commit to Guy-Bilitski/peft that referenced this pull request May 13, 2025
cyyever pushed a commit to cyyever/peft that referenced this pull request Sep 4, 2025