
Chroma #945

Merged

dxqb merged 22 commits into Nerogar:master from dxqb:chroma on Aug 31, 2025
Conversation

@dxqb (Collaborator) commented Aug 20, 2025

Use and test:

```shell
git clone https://github.com/dxqb/OneTrainer -b chroma
./update.sh
```

Select the #chroma preset.

Select a LoRA layer filter (not "full"); otherwise distilled_guidance_layer is trained, which is not recommended.
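As an illustration of what such a layer filter does (this is a hypothetical sketch, not OneTrainer's actual filtering code; the function and module names are made up), the idea is simply to exclude any module whose name belongs to the distilled guidance layer:

```python
# Hypothetical sketch: pick LoRA target modules by name, skipping the
# distilled guidance layer, since training it is not recommended.
def filter_lora_layers(module_names):
    """Return the module names eligible for LoRA training."""
    return [
        name for name in module_names
        if "distilled_guidance_layer" not in name
    ]

modules = [
    "transformer_blocks.0.attn",
    "distilled_guidance_layer.layers.0",
    "single_transformer_blocks.0.attn",
]
print(filter_lora_layers(modules))
# → ['transformer_blocks.0.attn', 'single_transformer_blocks.0.attn']
```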

code comments:

  • Chroma is different enough from Flux that I made it a separate model instead of changing the Flux model code
  • this results in a lot of code duplication, but the same duplication already exists between other models, so I decided against touching that in this PR. More code sharing between models can be done in a separate PR

open points:

| model  | batch size | resolution | speed     |
|--------|------------|------------|-----------|
| Chroma | 2          | 512        | 2 s/it    |
| Chroma | 2          | 1024       | 10.3 s/it |
| Flux   | 2          | 512        | 2.1 s/it  |
| Flux   | 2          | 1024       | 8.7 s/it  |
  • "Force Attention Mask" is not useful with Chroma; remove it
  • implement "Model loading from huggingface using parallel downloads" (#770) for Chroma
  • there are reports that embedding training fails on Chroma - this could be an issue with the code, or inherent to the model. Embedding vectors were checked during training and everything seems OK
  • loading Chroma from a single-file safetensors checkpoint fails, probably a diffusers bug
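To put the benchmark timings above in perspective, seconds/iteration converts to throughput as batch_size divided by s/it. A quick sketch using the reported numbers:

```python
# Convert the reported seconds/iteration into images/second (batch size 2).
timings_s_per_it = {
    "Chroma 512": 2.0,
    "Chroma 1024": 10.3,
    "Flux 512": 2.1,
    "Flux 1024": 8.7,
}
batch_size = 2

for name, s_per_it in timings_s_per_it.items():
    print(f"{name}: {batch_size / s_per_it:.2f} img/s")
```

This shows Chroma is marginally faster than Flux at 512 but noticeably slower at 1024 with these settings.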

Comment thread on modules/modelSaver/chroma/ChromaEmbeddingSaver.py (outdated):

@rktvr's comment was marked as resolved.

@dxqb (Collaborator, Author) commented Aug 26, 2025

> thank you for your work. is this stable enough to test/try training a lora with on chroma?

Several people have tested it successfully. All known open issues are listed in the task list above.

@dxqb dxqb marked this pull request as ready for review August 30, 2025 22:20
@dxqb dxqb merged commit 014c276 into Nerogar:master Aug 31, 2025
1 check passed

Development

Successfully merging this pull request may close these issues.

[Feat]: Add support for training Chroma Flux

2 participants