Update AutoGuess template #1627
Conversation
fix rwkv bug
add vicuna, dots llm
Did you test the Vicuna one? I have not found a single model that matches it. If it's not a real template in use, we should not add it.
WizardLM-2 (https://huggingface.co/alpindale/WizardLM-2-8x22B) uses a variant of the Vicuna template; I updated the AutoGuess template to match.
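For reference, a minimal sketch of building a Vicuna-style prompt as described on the WizardLM-2 model card (the system line, `USER:`/`ASSISTANT:` markers, and `</s>` turn terminator). The helper name and the `turns` structure are hypothetical, just to illustrate the format:

```python
DEFAULT_SYSTEM = (
    "A chat between a curious user and an artificial intelligence assistant. "
    "The assistant gives helpful, detailed, and polite answers to the user's questions."
)

def build_vicuna_prompt(turns, system=DEFAULT_SYSTEM):
    # turns: list of (user, assistant) pairs; assistant may be None
    # for the final turn, which asks the model for a completion.
    parts = [system]
    for user, assistant in turns:
        parts.append(f" USER: {user} ASSISTANT:")
        if assistant is not None:
            parts.append(f" {assistant}</s>")
    return "".join(parts)
```

An AutoGuess-style detector would key off the `USER:` / `ASSISTANT:` markers here, which is exactly why a Vicuna variant is hard to distinguish from plain Vicuna.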
Also, what's wrong with the rwkv template? That one was from @henk717. Generally, for these weird and uncommon formats, I feel it's not ideal to clutter AutoGuess with something that literally nobody uses except in one tiny case. Vicuna is basically unused in the wild, and when necessary Alpaca outperforms it in all cases.
RWKV should remain as is. It was given to me directly by one of the RWKV developers, so only they should have the say on it. Just like with Mistral, I like to keep the official templates official.
The rwkv template in autoguess.json does not match the one in https://github.com/BlinkDL/ChatRWKV/blob/main/API_DEMO_CHAT.py. I will close this PR out since the added templates are for "uncommon" models, but I recommend fixing the rwkv template in a separate PR.
+1 on fixing the RWKV chat templates. This is what the ChatRWKV demo code looks like: `out = run_rnn("User: " + msg + "\n\nAssistant:")`
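The prompt construction from that demo line can be isolated as a small helper. This is just a sketch of the string formatting visible in the quoted snippet (the helper name is made up; the real demo inlines it into the `run_rnn` call):

```python
def build_rwkv_prompt(msg: str) -> str:
    # Mirrors the formatting in the ChatRWKV demo line quoted above:
    # a "User: " prefix, the message, then a blank line and "Assistant:"
    # with no trailing space, so the model's reply starts right after it.
    return "User: " + msg.strip() + "\n\nAssistant:"
```

The key details an autoguess entry would need to match are the `User:`/`Assistant:` role labels and the double newline between turns.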