Enhancement/select device #488
Changes from 21 commits
@@ -0,0 +1,26 @@
import torch
import unittest
import os
import heat as ht

# Select the device placement for the test suite via the HEAT_USE_DEVICE
# environment variable:
#   cpu  - run everything on the CPU (default)
#   gpu  - run everything on the GPU
#   lcpu - set Heat's default device to GPU, but allocate test tensors on the CPU
#   lgpu - set Heat's default device to CPU, but allocate test tensors on the GPU
envar = os.getenv("HEAT_USE_DEVICE", "cpu")

if envar == "cpu":
    ht.use_device("cpu")
    torch_device = ht.cpu.torch_device
    heat_device = None
elif envar == "gpu" and torch.cuda.is_available():
    ht.use_device("gpu")
    torch.cuda.set_device(torch.device(ht.gpu.torch_device))
    torch_device = ht.gpu.torch_device
    heat_device = None
elif envar == "lcpu" and torch.cuda.is_available():
    ht.use_device("gpu")
    torch.cuda.set_device(torch.device(ht.gpu.torch_device))
    torch_device = ht.cpu.torch_device
    heat_device = ht.cpu
elif envar == "lgpu" and torch.cuda.is_available():
    ht.use_device("cpu")
    torch.cuda.set_device(torch.device(ht.gpu.torch_device))
    torch_device = ht.gpu.torch_device
    heat_device = ht.gpu
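The branch logic above can be sketched as a standalone function, which is also relevant to the review discussion below about turning it into a library function. This is an illustrative rewrite, not part of Heat's API: the function name `select_device` and the returned string triple are assumptions, and the real code assigns module-level variables instead of returning values.

```python
def select_device(envar, cuda_available):
    """Map a HEAT_USE_DEVICE value to (default_device, torch_device, local_device).

    Mirrors the branch structure of the test setup above; the names and the
    returned strings are illustrative only. Returns (None, None, None) when
    the request cannot be satisfied, e.g. a GPU mode without CUDA.
    """
    if envar == "cpu":
        return ("cpu", "cpu", None)   # run everything on the CPU
    if envar == "gpu" and cuda_available:
        return ("gpu", "cuda", None)  # run everything on the GPU
    if envar == "lcpu" and cuda_available:
        return ("gpu", "cpu", "cpu")  # default GPU, test tensors on the CPU
    if envar == "lgpu" and cuda_available:
        return ("cpu", "cuda", "gpu")  # default CPU, test tensors on the GPU
    return (None, None, None)  # unhandled combination: caller decides the fallback

print(select_device("cpu", False))   # ('cpu', 'cpu', None)
print(select_device("lgpu", True))   # ('cpu', 'cuda', 'gpu')
```

Writing it this way also makes explicit a case the original leaves implicit: if `HEAT_USE_DEVICE` requests a GPU mode but CUDA is unavailable, none of the branches fire and `torch_device`/`heat_device` are never assigned.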
Review comment: Can we make this a function instead, and then put it in the device.py file within core?

Reply: I don't think it should be a library function. It is only needed in the tests.

Reply: I understand, but it goes against the programming model that we have everywhere else. And although this is only used in the unit tests at the moment, it may be useful as a tool for other developers if it is a library function.

Review comment (on a later file whose diff is not shown): Can you expand on this? If someone were to read this, they likely wouldn't understand what it means. It's okay to use two sentences here.