Here is my output after executing:

(autogptq) root@XXX:/mnt/e/Downloads/AutoGPTQ-API# python blocking_api.py
Traceback (most recent call last):
File "/mnt/e/Downloads/AutoGPTQ-API/blocking_api.py", line 29, in<module>
model = AutoGPTQForCausalLM.from_quantized(model_name_or_path,
File "/root/miniconda3/envs/autogptq/lib/python3.10/site-packages/auto_gptq/modeling/auto.py", line 108, in from_quantized
return quant_func(
File "/root/miniconda3/envs/autogptq/lib/python3.10/site-packages/auto_gptq/modeling/_base.py", line 791, in from_quantized
raise FileNotFoundError(f"Could not find model in {model_name_or_path}")
FileNotFoundError: Could not find model in ../models/WizardCoder-15B-1.0-GPTQ
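For reference, the FileNotFoundError above is raised by auto_gptq's from_quantized when it cannot find a quantized checkpoint file inside the given directory, and the relative path ../models/WizardCoder-15B-1.0-GPTQ is resolved against the current working directory. A minimal diagnostic sketch (not part of the original report; the checkpoint extensions mentioned in the comments are assumptions about what AutoGPTQ commonly looks for):

```python
from pathlib import Path

# Resolve the relative path against the current working directory,
# which is where Python was launched (here: /mnt/e/Downloads/AutoGPTQ-API).
model_dir = Path("../models/WizardCoder-15B-1.0-GPTQ").resolve()
print("Resolved path:", model_dir, "| exists:", model_dir.is_dir())

if model_dir.is_dir():
    # List the files actually present; from_quantized needs a quantized
    # checkpoint (commonly *.safetensors, *.bin, or *.pt) whose name matches
    # the basename it expects.
    for f in sorted(model_dir.iterdir()):
        print(f.name)
```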
And here is my self-check output under WSL2 + conda + Python 3.10, following the README.md in the repository itself:
Hi @bonuschild,
This is a pretty old repository, and the way Auto-GPTQ loads models may have changed. Please refer to the similar issue in the Auto-GPTQ repository: AutoGPTQ/AutoGPTQ#133 (comment)
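For what it's worth, with older auto_gptq releases a common cause of this error was a checkpoint file name that does not match the expected model_basename. Below is a hedged sketch of an explicit call; the model_basename value and the use_safetensors flag are placeholders to adjust to the files actually present in the model directory:

```python
from auto_gptq import AutoGPTQForCausalLM

# Hedged sketch: pass the checkpoint file name (without extension) explicitly.
# "model" is an assumed placeholder; use the real *.safetensors name found in
# ../models/WizardCoder-15B-1.0-GPTQ.
model = AutoGPTQForCausalLM.from_quantized(
    "../models/WizardCoder-15B-1.0-GPTQ",
    model_basename="model",
    use_safetensors=True,
    device="cuda:0",
)
```

Newer auto_gptq releases generally detect the checkpoint name automatically, so the explicit model_basename is mainly relevant to older versions.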
The model actually exists, so why can't Auto-GPTQ find it?