Fixes the bug with not being able to handle LoRAs targeting different sets of `modules_to_save` #2423
saeid93 wants to merge 1 commit into huggingface:main from
Conversation
Thanks for the bug report and for providing a fix. I think this is a regression introduced by #2376, since your example works when I check out PEFT v0.14.0. Before proceeding, could you please create a unit test to catch the bug? It could be based on your original example and be placed here. Just note that you should not re-use the same base model for …
Hey @saeid93, thanks for your fix. I spent some time working on that issue. Good catch on the disabled adapter! I think that I've found a fix that doesn't limit the fix to …
Hi @BenjaminBossan and @githubnemo, thank you both for your feedback. I just added a new line (line 781 in 8edaae9), `model.classifiers.enable_adapters(True)`, to ensure the LoRA is always enabled, as @githubnemo also mentioned.
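To make the failure mode concrete, here is a minimal, self-contained sketch of the idea behind the fix. This is not PEFT's actual implementation; the class name and the use of plain callables are hypothetical stand-ins for PEFT's `ModulesToSaveWrapper` and its `enable_adapters` flag. The point is that an adapter copy registered while adapters are disabled stays silently unused until something re-enables adapters:

```python
class ModulesToSaveWrapperSketch:
    """Toy stand-in for PEFT's ModulesToSaveWrapper: keeps one trainable
    copy of a module per adapter, plus a flag controlling whether any
    adapter copy is used at all."""

    def __init__(self, original):
        self.original = original          # frozen base module (a plain callable here)
        self.modules_to_save = {}         # adapter_name -> trainable copy
        self.active_adapter = None
        self._disable_adapters = False

    def update(self, adapter_name, copy):
        # Register a new adapter copy and make it active. Note: this does
        # NOT touch _disable_adapters, which is the crux of the bug.
        self.modules_to_save[adapter_name] = copy
        self.active_adapter = adapter_name

    def enable_adapters(self, enabled):
        self._disable_adapters = not enabled

    def forward(self, x):
        # Route through the active adapter copy only when adapters are
        # enabled AND a copy exists for the active adapter.
        if self._disable_adapters or self.active_adapter not in self.modules_to_save:
            return self.original(x)
        return self.modules_to_save[self.active_adapter](x)


wrapper = ModulesToSaveWrapperSketch(original=lambda x: x)
wrapper.update("default", lambda x: x + 1)

wrapper.enable_adapters(False)             # adapters disabled, e.g. inside a disable-adapter context
wrapper.update("other", lambda x: x + 2)   # new adapter added while disabled

print(wrapper.forward(10))                 # falls back to the base module

wrapper.enable_adapters(True)              # the one-line re-enable discussed above
print(wrapper.forward(10))                 # now routes through the "other" copy
```

Under these assumptions, the second `update` alone is not enough: without the explicit `enable_adapters(True)`, the wrapper keeps returning the base module's output even though a fresh adapter copy is active.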
Before doing that, maybe take a look at PR #2430 first to avoid doing more work than necessary; I think it fixes both of the issues you mentioned.
We're going ahead with #2430, which supersedes this, so I'm closing this PR. Thank you again for this PR :) If you have any additional comments or think that we've missed something, do not hesitate to say so.
Fixes #2422