Questions about using docker images #73
Comments
This seems to be an issue with the combination of WSL + NVIDIA driver + Docker. Can you confirm: can you run CUDA with torch outside Docker?
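For example, outside Docker you could run something like the following in the WSL2 shell (just a sketch; adjust for your own Python environment):

# Check that the driver sees the GPU, then that torch can use CUDA
nvidia-smi
python -c "import torch; print(torch.__version__, torch.cuda.is_available())"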
Thank you for your reply. The WSL I mentioned above is WSL2, and I have confirmed that CUDA works normally in WSL2 Ubuntu 22.04 without Docker.
Would you mind checking microsoft/WSL#5663? Also, some answers on StackExchange say the error is harmless: https://superuser.com/questions/1707681/wsl-libcuda-is-not-a-symbolic-link
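For reference, the workaround discussed in those threads amounts to recreating the symlink by hand (a sketch only; the path /usr/lib/wsl/lib and the libcuda.so.1.1 filename are assumptions and may differ between driver versions):

# In the WSL2 shell, point libcuda.so at the real driver library
cd /usr/lib/wsl/lib
sudo ln -sf libcuda.so.1.1 libcuda.so.1
sudo ln -sf libcuda.so.1 libcuda.so
sudo ldconfig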
I am closing this issue because it is no longer active. Please feel free to re-open it if the issue still exists!
In WSL, when installing with the recommended Docker image:
docker pull runzhongwang/thinkmatch:torch1.6.0-cuda10.1-cudnn7-pyg1.6.3-pygmtools0.5.1
torch.cuda is unavailable, and there are warnings that libcuda.so is not a symbolic link.

I also tried other image versions, and the same problems occurred.
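For reference, a minimal way to reproduce the check inside the container (a sketch; it assumes the NVIDIA Container Toolkit is installed so that --gpus all works, and uses the same image tag as above):

# Run the ThinkMatch image with GPU access and test torch.cuda
docker run --rm --gpus all runzhongwang/thinkmatch:torch1.6.0-cuda10.1-cudnn7-pyg1.6.3-pygmtools0.5.1 python -c "import torch; print(torch.cuda.is_available())"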