When loading the `out.yml` to create an Inferencer object, the device stored in the output variables is used for predictions. However, if training was done on a GPU and the model is later loaded on a system where only a CPU is available, an error is raised. The Inferencer object should automatically detect whether a GPU is available.
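A minimal sketch of the suggested fallback behavior, assuming a PyTorch-based model. The `resolve_device` helper and the `model.pt` path are illustrative, not part of the project's actual API; the idea is simply to ignore the device recorded at training time when it is not present on the current machine:

```python
import torch
from typing import Optional

def resolve_device(requested: Optional[str] = None) -> torch.device:
    """Return the requested device, falling back to CPU if CUDA is unavailable."""
    if requested is None or requested.startswith("cuda"):
        if torch.cuda.is_available():
            return torch.device(requested or "cuda")
        return torch.device("cpu")
    return torch.device(requested)

# Hypothetical loading step: map the checkpoint onto whatever device is
# actually present instead of trusting the device stored in out.yml.
device = resolve_device("cuda")  # silently falls back to CPU on CPU-only systems
state = torch.load("model.pt", map_location=device)
```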