Fixed multirank + multialpha for sequential LoRAs, added correct support of LoRA-C3Lier conversion #937

Merged
pacman100 merged 4 commits into huggingface:smangrul/lora-supoort-multirank-multialpha from kovalexal:lora-supoort-multirank-multialpha
Sep 18, 2023

Conversation

@kovalexal
Contributor

@pacman100 Hi!

As discussed in #873, here are the modifications I've made to fix the issue I mentioned.
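For context, a minimal sketch of what multirank / multialpha usage looks like, assuming the `rank_pattern` / `alpha_pattern` config fields from the parent PR #873 (the model name and target modules here are arbitrary examples, not part of this PR):

```python
from peft import LoraConfig, get_peft_model
from transformers import AutoModelForCausalLM

# Default rank/alpha for all targeted modules, with per-module
# overrides via pattern-matched dictionaries.
config = LoraConfig(
    r=8,
    lora_alpha=8,
    target_modules=["q_proj", "v_proj", "out_proj"],
    rank_pattern={"out_proj": 16},   # out_proj gets rank 16 instead of 8
    alpha_pattern={"out_proj": 32},  # and alpha 32 instead of 8
)

model = AutoModelForCausalLM.from_pretrained("facebook/opt-125m")
model = get_peft_model(model, config)
```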

Also, I would like to add proper support for a special type of LoRA for Stable Diffusion, LoRA-C3Lier, which is simply a LoRA with a different rank and alpha for conv2d layers, so I refactored the conversion script.
The new conversion script works better: it automatically detects whether the provided LoRA contains conv2d (1x1, 3x3) layers and supplies custom ranks / alphas for those layers if needed; the detection idea is sketched below.
This script can easily be modified for proper support of SDXL LoRAs, so I was going to add that in a separate PR if you are comfortable with it.
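Roughly, the detection relies on the kohya_ss weight layout: conv LoRA down-projection weights are 4-dimensional tensors, while linear ones are 2-dimensional. A minimal sketch (the helper name is illustrative, not the script's actual code):

```python
import torch

def detect_conv_lora(state_dict):
    """Return (has_conv, conv_rank): whether a kohya_ss state dict
    contains conv2d LoRA layers, and the rank used for them."""
    for key, tensor in state_dict.items():
        # kohya_ss down-projection weights end with "lora_down.weight";
        # conv layers store them with shape (rank, in_channels, kh, kw).
        if key.endswith("lora_down.weight") and tensor.ndim == 4:
            return True, tensor.shape[0]
    return False, None
```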

@HuggingFaceDocBuilderDev

The docs for this PR live here. All of your documentation changes will be reflected on that endpoint.

@BenjaminBossan
Member

Thanks @kovalexal. Could you please run `make style`?

@kovalexal
Contributor Author

@BenjaminBossan Hi!

Sorry about that, I've fixed the styling.

@pacman100
Contributor

Thank you @kovalexal for all the work on supporting multirank and multialpha LoRA layers. I left a comment.

@kovalexal
Contributor Author

@pacman100 it seems strange that this test is failing on Windows. Which diffusers version is used for running the tests?
As far as I can see, diffusers have changed the signature of LoRACompatibleConv: earlier it accepted only a single argument, but now they've added a second argument, scale. The test error is probably related to that; one defensive way to cope with it is sketched below.
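For illustration, a hypothetical compatibility shim (not PEFT's actual fix) that works on both sides of such a signature change:

```python
import inspect

def forward_lora_conv(conv_module, hidden_states, scale=1.0):
    # Pass `scale` only when this diffusers version accepts it, so the
    # call works both before and after the LoRACompatibleConv change.
    if "scale" in inspect.signature(conv_module.forward).parameters:
        return conv_module(hidden_states, scale)
    return conv_module(hidden_states)
```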

@BenjaminBossan
Member

BenjaminBossan commented Sep 18, 2023

That could be the explanation. We're working on a proper fix; in the meantime we have pinned the diffusers version in #936. So this branch probably needs to merge main for the test to pass.

Edit: By the way, the failing test is probably unrelated to Windows; it just happened that Windows was the first CI run to fail, which cancelled all the other runs.

@kovalexal
Contributor Author

@BenjaminBossan ok, got it.

Should I merge main into this branch, or will we do it in the parent PR #873?

@BenjaminBossan
Member

If @pacman100 is fine with the PR as is, I'd suggest we merge it and do a full, deep review of his PR once it's finished.

@pacman100
Contributor

Hello, thank you @kovalexal 🤗! Merging it.

@pacman100 pacman100 merged commit 2433abb into huggingface:smangrul/lora-supoort-multirank-multialpha Sep 18, 2023
pacman100 added a commit that referenced this pull request Sep 22, 2023
* support multiple ranks and alphas

* Update lora.py

* Update lora.py

* commit suggestions

Co-Authored-By: Benjamin Bossan <BenjaminBossan@users.noreply.github.com>

* address comments

Co-Authored-By: Benjamin Bossan <BenjaminBossan@users.noreply.github.com>

* Fixed multirank + multialpha for sequential LoRAs, added correct support of LoRA-C3Lier conversion (#937)

* Fixed multirank multialpha for sequential loras, added tests, fixed docs

* Refactored kohya_ss conversion script for proper support of LoRA-C3Lier

* Fixed styling

* Removed old comment from docstring

* shift `scale_layer`/`unscale_layer` to `LoraLayer` class to support all the child classes

* support multiple active adapters

* add `active_adapters` property

Co-Authored-By: Benjamin Bossan <BenjaminBossan@users.noreply.github.com>

* fix bug related to active adapter of `ModulesToSaveWrapper`

* revert the change wrt active_adapter assignment

Co-Authored-By: Younes Belkada <49240599+younesbelkada@users.noreply.github.com>

* Apply suggestions from code review

Co-authored-by: Younes Belkada <49240599+younesbelkada@users.noreply.github.com>

* Apply suggestions from code review

Co-authored-by: Younes Belkada <49240599+younesbelkada@users.noreply.github.com>

* addressing comments

* address comments

* address comment

---------

Co-authored-by: Benjamin Bossan <BenjaminBossan@users.noreply.github.com>
Co-authored-by: Alexander Kovalchuk <kovalexal@gmail.com>
Co-authored-by: Younes Belkada <49240599+younesbelkada@users.noreply.github.com>
Guy-Bilitski pushed a commit to Guy-Bilitski/peft that referenced this pull request May 13, 2025