
CUDA 12.1 / Python 3.10 / PyTorch 2.5.1 — which build should I install? #1327

Open
wuxi-dixi opened this issue Nov 11, 2024 · 1 comment

Comments

@wuxi-dixi

There is no build matching CUDA 12.1.
I installed flash_attn-2.6.3+cu123torch2.4cxx11abiTRUE-cp310-cp310-linux_x86_64.whl,
but it fails with: ImportError: /root/anaconda3/envs/loramoe/lib/python3.10/site-packages/flash_attn_2_cuda.cpython-310-x86_64-linux-gnu.so: undefined symbol: _ZN3c105ErrorC2ENS_14SourceLocationENSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEEE
ImportError: FlashAttention-2 is not installed correctly. Please check the usage in https://github.com/Dao-AILab/flash-attention for more details.
Which version should I install? Do I need to reinstall a matching torch?
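For context: an undefined C++ symbol containing `__cxx11` like this usually indicates a C++11 ABI mismatch between the wheel (`cxx11abiTRUE`) and the installed PyTorch build, not just a CUDA minor-version gap. The prebuilt wheel filename encodes everything it must match. A minimal sketch of that naming pattern (the `wheel_name` helper is hypothetical, not part of flash-attention; the pattern is assumed from the release assets on https://github.com/Dao-AILab/flash-attention):

```python
# Hypothetical helper: format the flash-attn release wheel filename
# for a given environment (assumes the linux_x86_64 naming pattern
# used by the project's GitHub release assets stays stable).

def wheel_name(flash_ver, cuda, torch_ver, cxx11_abi, py_tag):
    """Build the expected flash-attn wheel filename."""
    abi = "TRUE" if cxx11_abi else "FALSE"
    return (
        f"flash_attn-{flash_ver}+cu{cuda}torch{torch_ver}"
        f"cxx11abi{abi}-{py_tag}-{py_tag}-linux_x86_64.whl"
    )

# The wheel from this issue: CUDA 12.3, torch 2.4, new C++11 ABI, CPython 3.10.
print(wheel_name("2.6.3", "123", "2.4", True, "cp310"))
# → flash_attn-2.6.3+cu123torch2.4cxx11abiTRUE-cp310-cp310-linux_x86_64.whl
```

At runtime you would read the real values from your torch install, e.g. `torch.version.cuda` and `torch._C._GLIBCXX_USE_CXX11_ABI`. If the latter is `False` (the common case for pip-installed PyTorch wheels), a `cxx11abiTRUE` wheel fails with exactly this kind of undefined-symbol error, so the `cxx11abiFALSE` asset matching your torch minor version is likely the one to try.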

@XCF-Mike


Did you find the correct version?
