Yes, it can be made to work with some tedious modifications, but I recommend updating the README or adding some hints to help future users avoid these errors. After all, LLaMA is an important model in PPDL.
Thanks a lot!
Issue Type
Build/Install
Modules Involved
SPU runtime
Have you reproduced the bug with SPU HEAD?
Yes
Have you searched existing issues?
Yes
SPU Version
latest
OS Platform and Distribution
Ubuntu 22.04
Python Version
3.10
Compiler Version
GCC 11.3
Current Behavior?
EasyLM has been updated, so the steps in README.md under examples/python/ml/flax_llama7b are now out of date and no longer work as written.
Standalone code to reproduce the issue
Relevant log output
No response