
support fuse/unfuse multiple lora#5129

Closed
AnyISalIn wants to merge 1 commit intohuggingface:mainfrom
AnyISalIn:main

Conversation

@AnyISalIn (Contributor) commented Sep 21, 2023

What does this PR do?

This pull request adds support for fusing and unfusing multiple LoRAs. It introduces an optional lora_name parameter in the fuse_lora/unfuse_lora functions. Here is an example:

(The pipeline variable is assumed to be a diffusers pipeline created beforehand.)

import torch

prompt = "masterpiece, best quality, 1girl, at dusk"
negative_prompt = ("(low quality, worst quality:1.4), (bad anatomy), (inaccurate limb:1.2), "
                   "bad composition, inaccurate eyes, extra digit, fewer digits, (extra arms:1.2), large breasts")

images_list = []

def generate_image(pipeline, images_list):
    images_list.extend(pipeline(
        prompt=prompt,
        negative_prompt=negative_prompt,
        width=512,
        height=768,
        num_inference_steps=15,
        num_images_per_prompt=1,
        generator=torch.manual_seed(0),
    ).images)

# Fuse the first LoRA and generate.
pipeline.load_lora_weights(".", weight_name="light_and_shadow.safetensors")
pipeline.fuse_lora(lora_scale=0.5, lora_name="light_and_shadow")
generate_image(pipeline, images_list)

# Fuse a second LoRA on top and generate.
pipeline.load_lora_weights(".", weight_name="more_details.safetensors")
pipeline.fuse_lora(lora_scale=0.5, lora_name="more_details")
generate_image(pipeline, images_list)

# Unfuse a LoRA by name and generate again.
pipeline.unfuse_lora(lora_name="more_details")
generate_image(pipeline, images_list)

NOTE: If you don't specify a lora_name in fuse_lora, the name is set to "default". If you don't specify one in unfuse_lora, the most recently fused LoRA is unfused.
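The naming semantics described in the note behave like a stack. A minimal pure-Python sketch of that behavior follows; the FusedLoraStack class and its method names are hypothetical illustrations of this PR's semantics, not part of the diffusers API:

```python
class FusedLoraStack:
    """Toy model of the fuse/unfuse naming semantics described in the note above."""

    def __init__(self):
        self._fused = []  # LoRA names, in the order they were fused

    def fuse(self, lora_name=None):
        # An unnamed fuse falls back to the "default" name.
        self._fused.append(lora_name or "default")

    def unfuse(self, lora_name=None):
        # Without a name, unfuse the most recently fused LoRA;
        # with a name, unfuse that specific LoRA.
        if lora_name is None:
            return self._fused.pop()
        self._fused.remove(lora_name)
        return lora_name


stack = FusedLoraStack()
stack.fuse("light_and_shadow")
stack.fuse("more_details")
stack.unfuse()           # removes "more_details", the last one fused
print(stack._fused)      # -> ['light_and_shadow']
```

This mirrors the example above: after fusing two LoRAs, calling unfuse without a name removes only the most recent one, leaving the first still fused.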

Fixes #5032 (https://github.com/huggingface/diffusers/issues/5032)

Who can review?

Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.

Signed-off-by: AnyISalIn <anyisalin@gmail.com>
@patrickvonplaten (Contributor) commented:

Hey @AnyISalIn,

Sorry, we're in the process of replacing all of the LoRA logic with PEFT, see:

@AnyISalIn (Contributor, Author) commented:

@patrickvonplaten awesome! Is Kohya LoRA still supported?

@patrickvonplaten (Contributor) commented:

> @patrickvonplaten awesome! Is Kohya LoRA still supported?

Yes 100%

@AnyISalIn AnyISalIn closed this Sep 28, 2023
@leopedroso45 commented:

> Hey @AnyISalIn,
>
> Sorry, we're in the process of replacing all of the LoRA logic with PEFT, see:

Does that mean we have support for multiple LoRAs? I couldn't find docs for that.



Development

Successfully merging this pull request may close these issues.

How to unfuse_lora only the first one after I have added multiple lora?
