
Fixing Kohya loras loading: Flux.1-dev loras with TE ("lora_te1_" prefix) #13188

Merged
sayakpaul merged 2 commits into huggingface:main from scenario-labs:fix_lora_flux1 on Feb 27, 2026
Conversation

@christopher5106
Contributor

@christopher5106 commented Feb 26, 2026

Text encoder LoRA layers are dropped for some LoRAs, such as this one.
A log message confirms it:

> No LoRA keys associated to CLIPTextModel found with the prefix='text_encoder'. This is safe to ignore if LoRA state dict didn't originally have any CLIPTextModel related params. You can also try specifying prefix=None to resolve the warning. Otherwise, open an issue if you think it's unexpected: https://github.com/huggingface/diffusers/issues/new

At minimum, this PR brings more consistency: wherever the `lora_te_` prefix is handled, the `lora_te1_` prefix should be handled as well.
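To illustrate the idea, here is a minimal sketch (not the actual diffusers code; the function name and exact key shapes are illustrative) of the kind of prefix remapping the PR extends: Kohya/sd-scripts state dicts name text encoder LoRA keys `lora_te_...` (or `lora_te1_...` for pipelines like FLUX whose CLIP encoder is numbered), while diffusers expects a `text_encoder` prefix. Code paths that only recognize `lora_te_` silently drop the `lora_te1_` keys.

```python
def remap_kohya_te_keys(state_dict):
    """Map Kohya text-encoder LoRA key prefixes to a diffusers-style
    "text_encoder." prefix.

    Handles both "lora_te_" and "lora_te1_"; a remapper that only
    checks "lora_te_" would leave "lora_te1_" keys untouched, and
    downstream loading would then drop them.
    """
    remapped = {}
    for key, value in state_dict.items():
        # Check the longer prefix first so "lora_te1_" is not
        # mis-parsed as "lora_te_" followed by "1_".
        for kohya_prefix in ("lora_te1_", "lora_te_"):
            if key.startswith(kohya_prefix):
                remapped["text_encoder." + key[len(kohya_prefix):]] = value
                break
        else:
            # Non-text-encoder keys (e.g. "lora_unet_...") pass through.
            remapped[key] = value
    return remapped
```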

Closes #12053

Reproducers:

FLUX.1 Dev

```python
import torch
from diffusers import FluxPipeline

pipe = FluxPipeline.from_pretrained(
    "black-forest-labs/FLUX.1-dev", torch_dtype=torch.bfloat16
)

pipe.load_lora_weights(
    "scenario-labs/kohya-sd-scripts-loras", weight_name="flux_lora.safetensors",
)
```

FLUX.1 Kontext

```python
import torch
from diffusers import FluxKontextPipeline

pipe = FluxKontextPipeline.from_pretrained(
    "black-forest-labs/FLUX.1-Kontext-dev", torch_dtype=torch.bfloat16
)

pipe.load_lora_weights(
    "scenario-labs/big-head-kontext-lora", weight_name="flux_kontext_lora.safetensors",
)
```

These LoRAs were trained with the Kohya/sd-scripts framework.
With this PR's fix, the warning message disappears and the text encoder layers are no longer dropped.
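If you want to check whether a given Kohya LoRA file contains text encoder layers at all (i.e. whether the warning above actually indicates dropped layers), a small helper over the state dict's key names suffices. This is a hypothetical convenience function, not part of diffusers; Kohya uses `lora_te1_`/`lora_te2_` for pipelines with two text encoders, and with safetensors files you can obtain the key list cheaply via `safetensors.safe_open(path, framework="pt").keys()` without loading the tensors.

```python
# Kohya/sd-scripts text-encoder key prefixes: "lora_te_" for a single
# text encoder, "lora_te1_"/"lora_te2_" when there are two of them.
TE_PREFIXES = ("lora_te_", "lora_te1_", "lora_te2_")


def kohya_te_key_count(keys):
    """Return how many keys carry a Kohya text-encoder prefix.

    A result of 0 means the LoRA has no text encoder layers and the
    diffusers warning is safe to ignore; a nonzero result means those
    layers exist and would be dropped by a loader that misses them.
    """
    return sum(1 for key in keys if key.startswith(TE_PREFIXES))
```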


@HuggingFaceDocBuilderDev

The docs for this PR live here. All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update.

@sayakpaul merged commit 5910a1c into huggingface:main on Feb 27, 2026
10 of 11 checks passed
@christopher5106 deleted the fix_lora_flux1 branch on March 5, 2026


Development

Successfully merging this pull request may close these issues.

Flux1.Dev Kohya Loras text encoder layers no more supported
