
[Bug]: Could you update the configuration process for LLAMA-7B? EasyLM seems to be updated and the old configuration is not correct anymore. #929

Closed
Tianshi-Xu opened this issue Dec 9, 2024 · 3 comments

Comments

@Tianshi-Xu

Issue Type

Build/Install

Modules Involved

SPU runtime

Have you reproduced the bug with SPU HEAD?

Yes

Have you searched existing issues?

Yes

SPU Version

latest

OS Platform and Distribution

Ubuntu 22.04

Python Version

3.10

Compiler Version

GCC 11.3

Current Behavior?

EasyLM has been updated, and the steps in README.md under examples/python/ml/flax_llama7b no longer work.

Standalone code to reproduce the issue

Lots of errors

Relevant log output

No response

@Tianshi-Xu
Author

The main errors come from:

python convert_hf_to_easylm.py \
    --hf_model /home/code/models/Llama-7b \
    --output_file flax-llama7b-EasyLM.msgpack \
    --streaming false
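
For readers who hit the same wall before the README is fixed, one possible stopgap (a sketch only, not a fix verified in this thread or in #704) is to pin EasyLM to an older revision whose convert_hf_to_easylm.py still accepts the arguments documented in the SPU README. The commit placeholder below is an assumption, not a known-good hash:

# Workaround sketch (unverified): check out an older EasyLM revision whose
# convert_hf_to_easylm.py still matches the arguments in the SPU README.
# <compatible-commit> is a placeholder, not a known-good hash.
git clone https://github.com/young-geng/EasyLM.git
cd EasyLM
git checkout <compatible-commit>

# Then run the conversion exactly as documented in the SPU README.
python convert_hf_to_easylm.py \
    --hf_model /home/code/models/Llama-7b \
    --output_file flax-llama7b-EasyLM.msgpack \
    --streaming false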

@tpppppub
Collaborator

tpppppub commented Dec 9, 2024

It seems to be a duplicate of #704

@Tianshi-Xu
Author

> It seems to be a duplicate of #704

Yes, it can work with some tedious modifications, but I recommend updating the README or adding some hints to spare future users these errors. After all, LLaMA is an important model in PPDL.
Thanks a lot!
