This repository was archived by the owner on Jan 24, 2024. It is now read-only.
I'm running Basaran via Docker and I have now tried several different models, but every time, after the model has been downloaded and loaded, I get this error: The model weights are not tied. Please use the `tie_weights` method before using the `infer_auto_device` function
Am I missing something? I've tried multiple GPTQ models from TheBloke and even the official Llama2-13b model, but this error is thrown every single time regardless of the model, and it prevents me from using Basaran at all.
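
For context, this looks like the standard accelerate warning that appears when a device map is computed before the model's embedding weights are tied. Below is a minimal sketch of the pattern the message refers to, not Basaran's actual loading code; the model name and memory limits are placeholders. If the warning is coming from inside Basaran's own loading path, the equivalent call would presumably need to happen there.

```python
from accelerate import infer_auto_device_map, init_empty_weights
from transformers import AutoConfig, AutoModelForCausalLM

# Build an empty-weight model so the device map can be computed cheaply.
config = AutoConfig.from_pretrained("meta-llama/Llama-2-13b-hf")  # placeholder model
with init_empty_weights():
    model = AutoModelForCausalLM.from_config(config)

# Tying the input/output embeddings first is what the
# "The model weights are not tied" warning is asking for.
model.tie_weights()

# Placeholder memory limits; adjust to the actual GPU/CPU budget.
device_map = infer_auto_device_map(model, max_memory={0: "20GiB", "cpu": "48GiB"})
print(device_map)
```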