How can I change alpha of LoRA? #2117

@hysts

Description

I'm not that familiar with the implementation details of LoRA, and I'm wondering how I can change alpha at inference time.
I found a field scale here:


so I thought I could change it by implementing something like tune_lora_scale from the original implementation:
https://github.com/cloneofsimo/lora/blob/e19f6ae22c3fa4d6da252c5b7443fc0942088cd7/lora_diffusion/lora.py#L597-L600
but it doesn't actually seem to be used; the processor's `__call__` signature is:

    def __call__(self, attn: CrossAttention, hidden_states, encoder_hidden_states=None, attention_mask=None, scale=1.0):

It looks like scale has to be passed as an argument to this module, but how can I do that?
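For context on what that `scale` argument does, here is a minimal NumPy sketch (hypothetical names, not the diffusers implementation): the low-rank update `B @ A` is multiplied by `scale` before being added to the frozen base projection, so `scale=0` recovers the base model and intermediate values interpolate linearly.

```python
import numpy as np

rng = np.random.default_rng(0)

d, r = 8, 2                      # feature dim, LoRA rank
W = rng.normal(size=(d, d))      # frozen base weight
A = rng.normal(size=(r, d))      # LoRA down-projection
B = rng.normal(size=(d, r))      # LoRA up-projection
x = rng.normal(size=(d,))

def forward(x, scale=1.0):
    # scale multiplies only the LoRA delta, not the base output
    return W @ x + scale * (B @ (A @ x))

base = forward(x, scale=0.0)     # base model only
full = forward(x, scale=1.0)     # base + full LoRA update
half = forward(x, scale=0.5)     # halfway between the two

# the output is linear in scale, so scale=0.5 is the midpoint
assert np.allclose(half, 0.5 * (base + full))
```

This is why changing the scale at inference time only requires threading the value down to wherever the LoRA delta is added, rather than modifying any weights.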
