WIP: [Torch] Renamed Torch to TorchCUDA, and added a CPU-only Torch #4477
Conversation
Force-pushed from 0cd8cdb to 52d868e.
Force-pushed from 1802af7 to 89ed68a.
Force-pushed from 89ed68a to 2fc4de7.
Does this need to be a different package? Or can we turn this into a variant?
Force-pushed from d241894 to 45ef494.
If CPU, CUDA, and ROCm builds all eventually become available, wouldn't the different sets of dependencies necessitate different packages?
You still have the same issue that if Torch.jl includes
Right, I see. Is there some discussion or registry of defined/reserved platform tags? E.g. is "cuda" reserved for CUDA, and similarly "amd_rocm", "intel_oneapi", "apple_coreml", ... for the other vendors?
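For context on the "variant" idea discussed above: platform tags in Julia's `Base.BinaryPlatforms` can carry extra keys such as `cuda`. Below is a minimal sketch of my own (not code from this PR) showing how a free-form tag extends a platform and its triplet; the tag name and value are illustrative assumptions.

```julia
# Minimal sketch (illustration only, not from this PR): attaching a free-form
# "cuda" tag to a platform so CPU and CUDA artifacts could coexist as variants.
using Base.BinaryPlatforms

# Platform accepts arbitrary extra tags as keyword arguments.
cpu_platform  = Platform("x86_64", "linux"; libc = "glibc")
cuda_platform = Platform("x86_64", "linux"; libc = "glibc", cuda = "11.3")

# The extra tag becomes part of the platform triplet, which is what lets the
# artifact-selection machinery distinguish the two variants at install time.
println(triplet(cpu_platform))   # plain x86_64 Linux triplet
println(triplet(cuda_platform))  # same triplet extended with the cuda tag
```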
Force-pushed from f3eebc1 to 3b803f1.
Commit "…ch CUDA version": this reverts commit 2c7c97e.
Superseded by #4554.
Currently a WIP. Aims at resolving FluxML/Torch.jl#20.
The approach is similar to the one used for ONNXRuntime: separate binaries for CPU-only, CUDA, etc. (CPU: #4369, CUDA: #4386).
Relates to: #1529
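To illustrate the separate-packages approach described above, here is a minimal sketch of how a downstream package such as Torch.jl could pick between a CPU-only JLL and a CUDA-enabled JLL at load time. The names `Torch_jll`/`TorchCUDA_jll`, the `libtorch` product, and the use of `CUDA.functional()` as the probe are my assumptions for illustration, not something this PR defines.

```julia
# Hypothetical loader sketch (assumed names: Torch_jll / TorchCUDA_jll and
# their `libtorch` product; CUDA.functional() is used as the capability probe).
module TorchBackend

using CUDA  # used only to probe for a usable CUDA driver/toolkit

if CUDA.functional()
    # A working CUDA setup was found: load the CUDA-enabled binaries.
    using TorchCUDA_jll
    const libtorch = TorchCUDA_jll.libtorch
else
    # No usable GPU: fall back to the CPU-only binaries.
    using Torch_jll
    const libtorch = Torch_jll.libtorch
end

end # module
```

An alternative, raised in the thread, would be a single recipe whose platforms carry a `cuda` tag, with the right artifact chosen by the platform-selection machinery rather than by loader code like the above.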