
Fixes the bug with not being able to handle LoRAs targeting different sets of modules_to_save #2423

Closed

saeid93 wants to merge 1 commit into huggingface:main from saeid93:different-modulestosave-bug-fix

Conversation

@saeid93 (Contributor) commented Mar 11, 2025

Fixes #2422

@BenjaminBossan (Member)

Thanks for the bug report and for providing a fix. I think this is a regression introduced by #2376, since your example works when I check out PEFT v0.14.0.

Before proceeding, could you please create a unit test to catch the bug? It could be based on your original example and be placed here. Just note that you should not re-use the same base model for get_peft_model but create a new one each time, since the base model is mutated, which can have unintended side effects.
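For illustration, a minimal sketch of what such a regression test could look like, assuming a toy two-head model; the module names (lin, classifier0, classifier1) and the test body are illustrative, not the actual test added to PEFT:

    import torch.nn as nn
    from peft import LoraConfig, get_peft_model

    class ToyModel(nn.Module):
        def __init__(self):
            super().__init__()
            self.lin = nn.Linear(10, 10)
            self.classifier0 = nn.Linear(10, 2)
            self.classifier1 = nn.Linear(10, 2)

    def test_load_adapter_with_different_modules_to_save(tmp_path):
        # Each adapter targets the same LoRA module but saves a different head.
        config0 = LoraConfig(target_modules=["lin"], modules_to_save=["classifier0"])
        config1 = LoraConfig(target_modules=["lin"], modules_to_save=["classifier1"])

        # Create a fresh base model for every get_peft_model call, since
        # get_peft_model mutates the base model in place.
        get_peft_model(ToyModel(), config0).save_pretrained(str(tmp_path / "adapter0"))
        get_peft_model(ToyModel(), config1).save_pretrained(str(tmp_path / "adapter1"))

        model = get_peft_model(ToyModel(), config0)
        # Before the fix, loading a second adapter with a different
        # modules_to_save set failed (see #2422).
        model.load_adapter(str(tmp_path / "adapter1"), adapter_name="second")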

@githubnemo (Collaborator)

Hey @saeid93, thanks for your fix. I spent some time working on that issue. Good catch on the disabled adapter! I think I've found a solution that isn't limited to ModulesToSaveWrapper and also supports other AuxiliaryTrainingWrapper subclasses such as TrainableTokensWrapper. It'd be great if you could take a look at #2430 as well :)
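For reference, a hedged sketch of the generalization being described (this is not the actual code from #2430): instead of special-casing ModulesToSaveWrapper, dispatch on the shared base class so that any auxiliary wrapper is handled uniformly.

    from peft.utils.other import AuxiliaryTrainingWrapper

    def enable_all_aux_wrappers(model):
        # Hypothetical helper, not part of PEFT: re-enable every auxiliary
        # wrapper (ModulesToSaveWrapper, TrainableTokensWrapper, ...).
        for module in model.modules():
            if isinstance(module, AuxiliaryTrainingWrapper):
                module.enable_adapters(True)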

@saeid93 (Contributor, Author) commented Mar 14, 2025

Hi @BenjaminBossan and @githubnemo,

Thank you both for your feedback. I'll add the test and check #2430 when I get a chance. Just a note on your comment: this is partially related to the recent changes in #2376, but even in older versions, although no error is raised, the result is wrong, because the LoRAs get disabled when iterating over a non-existent modules_to_save entry at

    module.enable_adapters(False)

Even in previous versions I had to access the modules_to_save layer manually, as below, to enable the LoRA in my code:

    model.classifiers.enable_adapters(True)

In this PR I just added a new line to ensure the LoRA is always enabled, as @githubnemo also mentioned.
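To make the failure mode above concrete, here is a hedged sketch reusing the ToyModel from the earlier test sketch; the adapter names and the exact internals are illustrative:

    from peft import LoraConfig, get_peft_model

    # Each adapter names a different modules_to_save entry.
    cfg_a = LoraConfig(target_modules=["lin"], modules_to_save=["classifier0"])
    cfg_b = LoraConfig(target_modules=["lin"], modules_to_save=["classifier1"])

    model = get_peft_model(ToyModel(), cfg_a)
    model.add_adapter("b", cfg_b)

    # As described above, switching adapters used to call
    # module.enable_adapters(False) on any wrapper whose modules_to_save
    # had no entry for the active adapter, silently disabling its LoRA.
    model.set_adapter("b")

    # Manual workaround used before this fix: re-enable the wrapper explicitly.
    model.classifier0.enable_adapters(True)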

@githubnemo (Collaborator)

> Hi @BenjaminBossan and @githubnemo,
>
> Thank you both for your feedback. I'll add the test and check #2430 when I get a chance.

Before doing that, maybe take a look at PR #2430 first to avoid doing more work than necessary; I think it fixes both of the issues you mentioned.

@githubnemo (Collaborator)

We're going ahead with #2430, which supersedes this one, so I'm closing this PR. Thank you again for your contribution :)

If you have any additional comments or think that we've missed something, do not hesitate to say so.

githubnemo closed this on Mar 18, 2025


Merging this pull request may close the following issue:

load_adapter Fails When modules_to_save Are Different for Each Adapter (#2422)