Is your feature request related to a problem? Please describe.
Currently, your documentation lists minicpm-v-2.6 with MiniCPMv26ChatHandler. Since MiniCPM-V 4.5 is out, could you please support it?
Describe the solution you'd like
MiniCPM-V 4.5 is supported in llama-cpp-python.
Describe alternatives you've considered
N/A
Additional context
llama.cpp already supports it.