Llama2 models not loading (Using main branch) #25388
Comments
I have the same issue with another model. I think this is urgent to fix.
Same here. Meanwhile, downgrading to transformers 4.31 solves the problem.
I found this issue in the dev branch of transformers-4.32.0.dev0. By the way, I made a PR that could solve it: #25389
Same here!
Hey everyone 👋 If you're hitting this exception, it means that there is something wrong with your model's config file 💔 Meanwhile, we are deciding internally how to massage this issue into a more user-friendly solution.
After the PR above gets merged, you will be able to do everything as before. The only difference is that you will see new warnings related to poorly set generation parameters.
System Info
Error when loading the model "meta-llama/Llama-2-7b-chat-hf" using the following code:
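A minimal sketch of such loading code, assuming the standard `AutoModelForCausalLM.from_pretrained` path (the reporter's original snippet may have differed):

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Minimal reproduction sketch: load the chat model from the Hub.
# Assumes you have accepted the Llama 2 license and are authenticated.
model_id = "meta-llama/Llama-2-7b-chat-hf"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)  # raises ValueError on main
```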
The load fails with a ValueError. This is because the method GenerationConfig.validate() raises a ValueError, and that error is not caught in the modeling_utils.py file.
One possible solution is to add ValueError to the except clause in that file:
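A sketch of that change, assuming the generation config is loaded inside a try/except in `PreTrainedModel.from_pretrained` (the surrounding code is simplified here, and the exception types caught on main may differ):

```python
# transformers/modeling_utils.py (inside PreTrainedModel.from_pretrained, simplified)
try:
    model.generation_config = GenerationConfig.from_pretrained(
        pretrained_model_name_or_path, **kwargs
    )
except (OSError, TypeError, ValueError):
    # ValueError added: GenerationConfig.validate() raises it when the
    # config contains invalid generation parameters; previously it was
    # uncaught here and aborted the entire model load.
    pass
```

With ValueError caught, a model whose generation config fails validation still loads, instead of crashing in from_pretrained.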
Who can help?
@gante
Information
Tasks
An officially supported task in the examples folder (such as GLUE/SQuAD, ...)
Reproduction
Using the main branch (installed from source), run the loading code shown above.
Expected behavior
To be able to load the model