I'm not that familiar with the implementation details of LoRA, and I'm wondering how I can change alpha at inference time.

I found a `scale` field here:

```python
self.scale = 1.0
```

so I thought I could change it by implementing something like `tune_lora_scale` from the original implementation: https://github.com/cloneofsimo/lora/blob/e19f6ae22c3fa4d6da252c5b7443fc0942088cd7/lora_diffusion/lora.py#L597-L600

But that field doesn't actually seem to be used; instead, the processor's `__call__` takes `scale` as a per-call argument:

```python
self, attn: CrossAttention, hidden_states, encoder_hidden_states=None, attention_mask=None, scale=1.0
```

It looks like `scale` has to be passed as an argument to this module, but how can I do that?
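To illustrate what the signature above implies, here is a minimal, self-contained sketch (not the diffusers implementation; all class names are hypothetical) of the pattern where `scale` is a call-time argument rather than stored state: the LoRA delta is multiplied by `scale` before being added to the base layer's output, so the strength can be varied per call without touching any module fields.

```python
# Hypothetical sketch of the "scale as a call argument" pattern used by
# LoRA attention processors. Scalars stand in for the real projections.

class BaseLayer:
    """Stands in for the frozen pretrained projection."""
    def __init__(self, weight):
        self.weight = weight

    def __call__(self, x):
        return self.weight * x


class LoRALayer:
    """Stands in for the low-rank down/up projection pair."""
    def __init__(self, down, up):
        self.down = down
        self.up = up

    def __call__(self, x):
        return self.up * (self.down * x)


class LoRAProcessor:
    """Mimics the processor pattern: `scale` is accepted per call,
    so it can be changed freely at inference time."""
    def __init__(self, base, lora):
        self.base = base
        self.lora = lora

    def __call__(self, x, scale=1.0):
        # output = base(x) + scale * lora_delta(x)
        return self.base(x) + scale * self.lora(x)


proc = LoRAProcessor(BaseLayer(2.0), LoRALayer(0.5, 4.0))
print(proc(3.0, scale=0.0))   # scale=0 disables the LoRA entirely: 6.0
print(proc(3.0, scale=1.0))   # full LoRA contribution: 6.0 + 6.0 = 12.0
print(proc(3.0, scale=0.5))   # half-strength LoRA: 6.0 + 3.0 = 9.0
```

In diffusers itself, pipelines forward extra attention arguments via `cross_attention_kwargs`, so (assuming a version where the processors accept `scale`) something like `pipe(prompt, cross_attention_kwargs={"scale": 0.5})` should reach this parameter.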