Running the quickstart for model inference throws an error #337
Comments
This package won't install.
Same problem here on a Mac M1.
How can this be solved?
The quickstart passes trust_remote_code=True, use_flash_attention_2=True. I'm in a CPU-only environment, so I changed the call to AutoModelForCausalLM.from_pretrained(model_name_or_path, device_map=device_map, torch_dtype=torch.float16)
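The workaround in the comment above can be made conditional: only request FlashAttention 2 when the flash_attn package is actually importable, and fall back to the default attention implementation on CPU-only or Apple Silicon machines where flash_attn cannot be built. A minimal sketch (the build_load_kwargs helper is illustrative, not part of the project):

```python
import importlib.util


def flash_attn_available() -> bool:
    """Return True if the flash_attn package is importable."""
    return importlib.util.find_spec("flash_attn") is not None


def build_load_kwargs(device_map: str = "cpu") -> dict:
    """Build keyword arguments for AutoModelForCausalLM.from_pretrained().

    Only adds use_flash_attention_2 when flash_attn is installed;
    flash_attn requires a CUDA GPU and cannot be built on CPU-only
    hosts or Mac M1, which triggers the error reported in this issue.
    """
    kwargs = {
        "trust_remote_code": True,
        "device_map": device_map,
    }
    if flash_attn_available():
        kwargs["use_flash_attention_2"] = True
    return kwargs
```

The resulting kwargs can then be spliced into the quickstart call, e.g. AutoModelForCausalLM.from_pretrained(model_name_or_path, torch_dtype=torch.float16, **build_load_kwargs()).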
This modeling file requires the following packages that were not found in your environment: flash_attn. Run
pip install flash_attn