Can multiple GPUs be utilized with the PyTorch DataParallel class?
Hello,
I would like to run the 3D_GMIC model with a batch size greater than 1 by using the PyTorch DataParallel class across multiple GPUs. However, I noticed there are assertions at the beginning of the forward function that require a batch size of exactly 1. If anyone has successfully used a larger batch size, any guidance would be greatly appreciated!
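As a minimal sketch of the situation (using a toy model as a stand-in for 3D_GMIC, since I am not modifying the actual repository code here): `nn.DataParallel` splits the input along dimension 0 across replicas, so a global batch equal to the number of GPUs would give each replica's forward call a batch of 1, which might satisfy the assertion without changing the model itself. This is an assumption about how the assertion interacts with DataParallel, not a confirmed fix:

```python
import torch
import torch.nn as nn

class ToyModel(nn.Module):
    """Hypothetical stand-in for 3D_GMIC: forward asserts a batch of 1,
    mimicking the assertions described in the issue."""
    def forward(self, x):
        assert x.shape[0] == 1, "this model expects batch size 1 per forward call"
        # reduce spatial/channel dims to a single value per sample
        return x.mean(dim=(1, 2, 3), keepdim=True)

model = ToyModel()
if torch.cuda.device_count() > 1:
    # DataParallel scatters dim 0 across replicas, so a global batch equal
    # to the GPU count gives each replica a batch of 1 and the assert holds.
    model = nn.DataParallel(model.cuda())
    x = torch.randn(torch.cuda.device_count(), 1, 8, 8, device="cuda")
else:
    # single device: keep the global batch at 1 so the assertion passes
    x = torch.randn(1, 1, 8, 8)

out = model(x)
print(out.shape[0] == x.shape[0])
```

Note that this only raises the *global* batch to the GPU count; each replica still processes one sample at a time, so it does not remove the per-sample restriction inside the forward function.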