
[lora] fix zimage lora conversion to support for more lora. #13209

Merged: sayakpaul merged 2 commits into main from fix-zimage-lora on Mar 5, 2026

Conversation

@sayakpaul (Member)

What does this PR do?

Fixes #13203

Additionally, fixes how "alpha"s are handled in the diffusers format path.

@sayakpaul sayakpaul requested review from asomoza and dg845 March 4, 2026 10:58
@sayakpaul (Member, Author)

@christopher5106 does this PR work for you?


if has_non_diffusers_lora_id:

def get_alpha_scales(down_weight, alpha_key):
sayakpaul (Member, Author):
Just moving it out of the if block since it can be generally used.
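To illustrate why `get_alpha_scales` is generally useful, here is a minimal sketch of what such a helper typically computes: the effective LoRA scale is `alpha / rank`, with `rank` read from the down (A) matrix. Splitting the scale evenly via square roots is an illustrative choice here, not necessarily the exact split diffusers uses, and the signature is simplified (the alpha value is passed directly rather than popped from the state dict by key).

```python
import math

import numpy as np


def get_alpha_scales(down_weight, alpha):
    """Turn a checkpoint's LoRA ``alpha`` into per-matrix scale factors.

    The effective LoRA scale is alpha / rank, where rank is the first
    dimension of the down (A) matrix. The scale is split evenly between
    the down and up matrices so that their product recovers alpha / rank.
    Sketch only -- the real conversion helper may distribute the scale
    differently.
    """
    rank = down_weight.shape[0]
    scale = alpha / rank
    scale_down = math.sqrt(scale)
    scale_up = math.sqrt(scale)
    return scale_down, scale_up


# Usage: a rank-4 down matrix with alpha=8 gives a combined scale of 2.0.
down = np.zeros((4, 32))
scale_down, scale_up = get_alpha_scales(down, alpha=8.0)
```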

lora_dot_down_key = ".lora.down.weight"
lora_dot_up_key = ".lora.up.weight"
has_lora_dot_format = any(lora_dot_down_key in k for k in state_dict)

if has_lora_dot_format:
sayakpaul (Member, Author):
Main addition to support this LoRA checkpoint structure.
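The new format uses `.lora.down.weight` / `.lora.up.weight` suffixes instead of the peft-style `.lora_A.weight` / `.lora_B.weight` names. A hypothetical sketch of the remapping (names illustrative, not the exact diffusers conversion code):

```python
def convert_lora_dot_keys(state_dict):
    """Remap ``.lora.down.weight`` / ``.lora.up.weight`` keys to the
    peft-style ``.lora_A.weight`` / ``.lora_B.weight`` names.

    Sketch of the kind of key remapping this PR enables; the real
    converter also renames module prefixes and handles alphas.
    """
    converted = {}
    for key, value in state_dict.items():
        new_key = key.replace(".lora.down.weight", ".lora_A.weight")
        new_key = new_key.replace(".lora.up.weight", ".lora_B.weight")
        converted[new_key] = value
    return converted
```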

@christopher5106 (Contributor)

yes, works for me

@asomoza (Member) left a comment:

thanks, LGTM

@asomoza (Member)

asomoza commented Mar 4, 2026

Just because I was curious, I stripped all the redundant keys and the LoRA went down from 761MB to 261MB... why do people do this?
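One simple way to strip such redundancy is to drop keys whose tensor is byte-identical to one already kept (duplicated q/k/v entries alongside a fused copy would need fancier handling). This is an illustrative sketch, not the procedure asomoza actually used:

```python
import hashlib

import numpy as np


def strip_duplicate_tensors(state_dict):
    """Return a copy of ``state_dict`` without byte-identical duplicate
    tensors, keeping the first key encountered for each unique tensor.

    Sketch only: real deduplication of a LoRA checkpoint would also need
    to ensure the loader does not require the dropped keys.
    """
    seen = set()
    kept = {}
    for key, tensor in state_dict.items():
        arr = np.ascontiguousarray(tensor)
        # fingerprint by shape + content hash so equal-content tensors
        # with different shapes are not conflated
        fingerprint = (arr.shape, hashlib.sha256(arr.tobytes()).hexdigest())
        if fingerprint in seen:
            continue  # identical tensor already stored under another key
        seen.add(fingerprint)
        kept[key] = tensor
    return kept
```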

@christopher5106 (Contributor)

Strange, let me ask if someone knows.

You managed to divide the size by more than 2 by deduplicating the q/k/v keys, their fused versions, and the common LoRA down part, right?

@asomoza (Member)

asomoza commented Mar 4, 2026

> You managed to divide the size by more than 2 by deduplicating the q/k/v keys, their fused versions, and the common LoRA down part, right?

@christopher5106 yeah, if you want I can upload the weights.

Also, the LoRA quality is nowhere near the example's, but I didn't lower the scale.

[images: original lora (test_zimage_lora) vs. stripped lora (test_zimage_lora_stripped)]

@christopher5106 (Contributor)

I'm fine, thanks, but I'm still wondering where it comes from.

In the past, I saw that you managed mixtures of LoRAs for flux1. But I believe some users do manual hacks to merge multiple LoRAs into one, without knowing that half of their keys will be dropped, and to me it makes little sense to support these edge cases.

@dg845 (Collaborator) left a comment:

Thanks!

@sayakpaul sayakpaul merged commit 3902145 into main Mar 5, 2026
11 of 12 checks passed
@sayakpaul sayakpaul deleted the fix-zimage-lora branch March 5, 2026 02:54
@sayakpaul (Member, Author)

Failing test is unrelated.



Development

Successfully merging this pull request may close these issues.

Zimage lora support issue

4 participants