Running convert fails with BadZipFile (Bad CRC-32) #4365
Comments
Same issue for all of my models; it worked before `git pull` + recompile.
This seems to have been fixed by the latest commit.
Not for me. Both `convert.py path/to/model` and `convert.py path/to/model.bin` still fail, with the last two lines being:
`zipfile.BadZipFile: Bad CRC-32 for file 'archive/data/13'`
I've got the same issue. Here's my test run after downloading the model from HF: `python3.12 llama.cpp/convert.py aiopsmodel-hfv2 --outfile llama-2-7b-aiopsfinetunedv2-q8_0-gguf --outtype q8_0`. It started with requirements.txt not installing; I had to replace the `~=` with `>=` to make it install.
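A hedged sketch of that requirements.txt workaround, assuming GNU sed and an illustrative repo path (on macOS/BSD sed, use `sed -i ''` instead):

```sh
# Loosen the compatible-release pins (~=) to minimum-version pins (>=)
# so pip can resolve the dependencies under newer Python versions.
sed -i 's/~=/>=/g' llama.cpp/requirements.txt
pip install -r llama.cpp/requirements.txt
```

Note that this installs versions the project may not have been tested with.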
This is due to `ImpImporter` being removed from `pkgutil` in Python 3.12 (see the Python thread where `ImpImporter` was deprecated), which breaks the dependency install. The zipfile issue also seems to be 3.12-specific. Switching to 3.10 in my environment fixed the issue for me.
Thanks @JaCraig - Can confirm that switching from 3.12 to 3.10 fixed the issue for me as well.
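For anyone trying the same downgrade, a minimal sketch, assuming Python 3.10 is already installed alongside 3.12 (paths and model name are illustrative):

```sh
# Create an isolated 3.10 environment so the system 3.12 stays untouched.
python3.10 -m venv .venv-310
source .venv-310/bin/activate
pip install -r llama.cpp/requirements.txt
python llama.cpp/convert.py path/to/model
```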
Facing the same issue. I don't want to downgrade my Python just for this; I think llama.cpp should support Python 3.12. The latest PyTorch version (2.2.1) now also supports Python 3.12 (pytorch/pytorch#110436 (comment)), so it shouldn't be a problem to support it.
Same issue here, on a Windows 10 PC (`python --version`: …). Everything in requirements.txt installed.
The same issue on macOS. llama.cpp version: …, Python: 3.10.
Thanks @JaCraig, solved my problem.
@MarcoLv412 Same error message here on Windows 11, Python 3.8. Try adding `--concurrency=1`, which works for me. There seem to be problems with multi-threaded zipfile reading.
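A usage sketch of that flag (the model path is illustrative):

```sh
# Force single-threaded tensor loading to avoid the concurrent
# zipfile reads that appear to trigger the Bad CRC-32 error.
python convert.py path/to/model --concurrency=1
```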
@mofosyne This is a bug, though it is Python-specific. Add it to the docs for now? Using safetensors is a valid workaround (see the sketch below).
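A minimal sketch of that safetensors route, assuming the checkpoint is a plain PyTorch state dict and that `torch` and `safetensors` are installed (file names are illustrative):

```sh
# Re-save the .bin checkpoint as .safetensors, then point convert.py
# at the resulting file instead of the pickle-based archive.
python - <<'EOF'
import torch
from safetensors.torch import save_file

state = torch.load("path/to/model.bin", map_location="cpu")
# save_file requires contiguous, non-shared tensors;
# models with tied weights may need extra handling.
save_file({k: v.contiguous() for k, v in state.items()},
          "path/to/model.safetensors")
EOF
```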
This issue was closed because it has been inactive for 14 days since being marked as stale. |
Is this still an issue, or has it been fixed, given that it's marked as closed? I'd rather stick with Python 3.12, which I already have on macOS, than switch to Python 3.10.
Prerequisites
Please answer the following questions for yourself before submitting an issue.
Expected Behavior
The conversion runs successfully.
Current Behavior
The conversion fails with an error like `zipfile.BadZipFile: Bad CRC-32 for file 'archive/data/13'`.
Environment and Context
Please provide detailed information about your computer setup. This is important in case the issue is not reproducible except for under certain specific conditions.
MacBook Pro, M1 Pro, macOS Sonoma
Python: 3.12.0
Make: GNU Make 3.81
I tried running it for both OpenLlama-3B and Llama-7B-chat; same error.
Steps to Reproduce
Please provide detailed steps for reproducing the issue. We are not sitting in front of your screen, so the more detail the better.
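The exact steps were not captured here; a hedged reconstruction from the thread (paths and model name are illustrative):

```sh
# Build llama.cpp, install the Python dependencies, then run the
# converter under Python 3.12; the last step fails with BadZipFile.
git pull && make
python3.12 -m pip install -r requirements.txt
python3.12 convert.py path/to/llama-7b-chat
```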
Failure Logs