Add basic Ministral 3 3B support (Ernie Image)#436
deinferno wants to merge 2 commits into city96:main
Conversation
Co-authored-by: hpr <64584739+jarz76@users.noreply.github.com>
Do I need to download the mmproj file?
@city96 Request to merge this into the main branch, thanks~ |
Looks like there is no use for mmproj; the Ernie Image model family lacks editing support for now. In theory, if an "Ernie Image Edit" model comes out, it might be worth enabling mmproj loading now, but looking at the qwen3vl loader logic, the answer is no.
ops.py: it could be that it didn't work for me originally because I run an RTX 2080 Ti, but at least with this it works.
I don’t think this is related to model quantization. Ernie’s image quality itself is quite poor — overly sharp and giving a dirty visual impression. You can reduce the steps from 8 to 4, and note that Ernie only shows better aesthetic performance at resolutions within 1K. At 2K, its aesthetics are also quite bad. |




Allows loading the Ministral 3 3B model to run inference for Ernie Image and Ernie Image Turbo.
Used models:
- Unet Model GGUF
- CLIP Model Instruct GGUF
- CLIP Model Base GGUF (seems to work better than Instruct)
The default workflow provided in ComfyUI works; just swap out the Unet and Clip loaders for the GGUF loader nodes.
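For anyone wiring up their own tooling around these files, a loader typically starts by validating the fixed-size GGUF header before touching tensors. The sketch below writes a stub header and reads it back; it assumes the documented GGUF layout (4-byte `GGUF` magic, uint32 version, uint64 tensor count, uint64 metadata KV count, all little-endian) and is not the ComfyUI-GGUF node code itself.

```python
import struct

GGUF_MAGIC = b"GGUF"  # first four bytes of every GGUF file

def write_stub_gguf(path: str, version: int = 3) -> None:
    """Write only the fixed-size GGUF header, for demonstration."""
    with open(path, "wb") as f:
        f.write(GGUF_MAGIC)
        f.write(struct.pack("<I", version))  # uint32 format version
        f.write(struct.pack("<Q", 0))        # uint64 tensor count
        f.write(struct.pack("<Q", 0))        # uint64 metadata KV count

def check_gguf_header(path: str) -> int:
    """Return the GGUF version, raising if the magic bytes are wrong."""
    with open(path, "rb") as f:
        magic = f.read(4)
        if magic != GGUF_MAGIC:
            raise ValueError(f"not a GGUF file: magic={magic!r}")
        (version,) = struct.unpack("<I", f.read(4))
    return version

write_stub_gguf("stub.gguf")
print(check_gguf_header("stub.gguf"))  # 3
```

A real loader would go on to parse the metadata key/value pairs (e.g. the architecture string) to decide which model family it is dealing with.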