Solve BUG: AttributeError: module transformers has no attribute LLaMATokenizer #64
Comments
I'm using transformers 4.27.1; is a different version needed?
I got the same error too; please suggest how to fix it.
68d640f7c368bcaaaecfc678f11908ebbd3d6176
We are getting this error too and would appreciate your help.
I didn't install transformers via pip. I downloaded transformers from the "llama_push" branch on GitHub and moved the downloaded files into my conda environment.
Similar to the previous answers, the following steps worked for me:
Yeah, you have to install from the Transformers GitHub repo. I had thought that since it was merged it would be in an updated pip package, but it's not yet.
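As a sketch of what "install from GitHub" means here: pip can install straight from the repository's main branch, which is how you pick up merged-but-unreleased changes like the LLaMA support discussed above.

```shell
# Install transformers from the main branch of the official GitHub repo,
# since the LLaMA classes were merged but not yet released on PyPI at the time.
pip install git+https://github.com/huggingface/transformers
```

A plain `pip install transformers` will instead give you the latest PyPI release, which is why the attribute was missing.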
@ruian0 Thanks for your idea. I fixed this bug, but then I ran into another one:
Another nice solution: |
transformers.LLaMATokenizer was renamed to transformers.LlamaTokenizer
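A minimal compatibility sketch of that rename (the helper name `resolve_llama_tokenizer` is my own, not part of the library): it returns whichever of the two class names the installed transformers version actually exposes, instead of hard-coding one spelling.

```python
import importlib.util


def resolve_llama_tokenizer():
    """Return the LLaMA tokenizer class under whichever name exists.

    "LLaMATokenizer" only existed on the pre-release llama branch;
    released versions use "LlamaTokenizer". Returns None if transformers
    is not installed or neither name is available.
    """
    if importlib.util.find_spec("transformers") is None:
        return None  # transformers not installed
    import transformers

    for name in ("LlamaTokenizer", "LLaMATokenizer"):
        cls = getattr(transformers, name, None)
        if cls is not None:
            return cls
    return None
```

With this, code written against the old dev-branch name keeps working after upgrading to a released version.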
I want to follow the guide below.