Hello everyone,
I would like to use LAVA-dl with my GPU. I installed it following the procedure on GitHub, but the torch version that gets installed is 2.0.0+cpu. The CUDA toolkit on my machine is 11.8:
Cuda compilation tools, release 11.8, V11.8.89
Build cuda_11.8.r11.8/compiler.31833905_0.
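For reference, this is the quick check I run inside the venv to see which torch build is actually active (just a sanity-check sketch):

```python
# Sanity check of the torch build inside the venv (sketch).
import torch

print(torch.__version__)          # e.g. "2.0.0+cpu" vs "2.0.0+cu118"
print(torch.version.cuda)         # CUDA version torch was built against, or None for the CPU wheel
print(torch.cuda.is_available())  # False with the +cpu wheel, even if a GPU is present
```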
When I run the Oxford Dataset tutorial, it fails with the following error:
"No CUDA runtime is found, using CUDA_HOME='C:\Program Files\NVIDIA GPU Computing Toolkit\CUDA\v11.8'
Traceback (most recent call last):
File "*\lava-nc\TUTO_SNN_SLAYER.py", line 73, in <module>
net = Network().to(device)
File "*\lava-nc\lava-dl\.venv\lib\site-packages\torch\nn\modules\module.py", line 1145, in to
return self._apply(convert)
File "*\lava-nc\lava-dl\.venv\lib\site-packages\torch\nn\modules\module.py", line 797, in _apply
module._apply(fn)
File "*\lava-nc\lava-dl\.venv\lib\site-packages\torch\nn\modules\module.py", line 797, in _apply
module._apply(fn)
File "*\lava-nc\lava-dl\.venv\lib\site-packages\torch\nn\modules\module.py", line 797, in _apply
module._apply(fn)
File "*\lava-nc\lava-dl\.venv\lib\site-packages\torch\nn\modules\module.py", line 820, in _apply
param_applied = fn(param)
File "*\lava-nc\lava-dl\.venv\lib\site-packages\torch\nn\modules\module.py", line 1143, in convert
return t.to(device, dtype if t.is_floating_point() or t.is_complex() else None, non_blocking)
File "*\lava-nc\lava-dl\.venv\lib\site-packages\torch\cuda\__init__.py", line 239, in _lazy_init
raise AssertionError("Torch not compiled with CUDA enabled")
AssertionError: Torch not compiled with CUDA enabled"
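As far as I can tell, the crash happens when the model is moved to a CUDA device while the installed torch is the CPU-only build. The device selection I would expect looks roughly like this (a sketch on my side; Network stands for the tutorial's SLAYER model class, and the exact lines in TUTO_SNN_SLAYER.py may differ):

```python
import torch

# Pick the GPU only when the installed torch build actually supports CUDA;
# with the +cpu wheel this falls back to the CPU instead of raising the
# "Torch not compiled with CUDA enabled" assertion inside .to(device).
device = torch.device('cuda' if torch.cuda.is_available() else 'cpu')
# net = Network().to(device)  # Network is the tutorial's model class (line 73 above)
```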
I have since uninstalled torch in the venv and installed torch 2.0.0+cu118. With that build the code recognizes the GPU during execution, but a "where cl" error appears, and the pytest error appears as well.
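My assumption is that the "where cl" message comes from torch's just-in-time C++/CUDA extension build, which on Windows needs the MSVC compiler cl.exe on the PATH. This is how I check for it from Python (sketch):

```python
import shutil

# cl.exe is the MSVC compiler that ninja invokes when torch builds CUDA extensions on Windows.
# None here likely means the shell is not an "x64 Native Tools" developer prompt,
# so cl.exe is not on the PATH.
print(shutil.which('cl'))
```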
So I'm stuck and I don't know what to do.
Best