Tried multiple different models but get "The model weights are not tied..." error every time.. #266
Labels: question
Hi,
I'm running Basaran via Docker and have tried several different models at this point, but every time, after it has downloaded and loaded everything, I'm faced with this error:
The model weights are not tied. Please use the `tie_weights` method before using the `infer_auto_device` function
Am I missing something? I've tried multiple GPTQ models from TheBloke and even the official Llama2-13b model, but this error is thrown every single time regardless of the model, and it prevents me from using Basaran at all.
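For context, this warning comes from Hugging Face `accelerate` when a device map is computed before the model's input and output embeddings have been tied. Below is a minimal sketch of the call order `accelerate` expects, not what Basaran does internally; the model name and memory limits are placeholders:

```python
# Sketch only: the checkpoint name and max_memory values are placeholders.
from accelerate import infer_auto_device_map, init_empty_weights
from transformers import AutoConfig, AutoModelForCausalLM

config = AutoConfig.from_pretrained("meta-llama/Llama-2-13b-hf")

# Build the model skeleton without allocating real weights.
with init_empty_weights():
    model = AutoModelForCausalLM.from_config(config)

# Tie the input/output embedding weights first; computing the device map
# before this step is what triggers the "model weights are not tied" warning.
model.tie_weights()

# Only then compute the device map.
device_map = infer_auto_device_map(model, max_memory={0: "20GiB", "cpu": "48GiB"})
print(device_map)
```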
Any help would be appreciated. Thanks in advance.