
Inferencer does not set correct device #73

Closed
SebieF opened this issue Mar 27, 2023 · 0 comments · Fixed by #80
Comments

@SebieF (Collaborator)

SebieF commented Mar 27, 2023

When the out.yml is loaded to create an Inferencer object, the device stored in the output variables is used for predictions. However, if training was done on a GPU and the model is later loaded on a system where only a CPU is available, an error is raised. The Inferencer object should instead detect automatically whether a GPU is available.
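A minimal sketch of the suggested fallback, assuming the project runs on PyTorch; the helper name `resolve_device` and its signature are illustrative, not the actual Inferencer API:

```python
from typing import Optional

import torch


def resolve_device(requested_device: Optional[str] = None) -> torch.device:
    # Hypothetical helper: `requested_device` would be the device string
    # read from out.yml (e.g. "cuda:0" after a GPU training run).
    if requested_device and "cuda" in requested_device:
        if torch.cuda.is_available():
            return torch.device(requested_device)
        # Trained on a GPU, but none is available here: fall back to CPU
        # instead of raising an error.
        return torch.device("cpu")
    return torch.device(requested_device or "cpu")
```

Checkpoint loading would likely also need `map_location=device` passed to `torch.load(...)`, so that weights saved on a GPU can still be deserialized on a CPU-only machine.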

@SebieF SebieF added the bug Something isn't working label Mar 27, 2023
@SebieF SebieF added this to the Version 1.0.0 milestone Apr 14, 2023
@SebieF SebieF self-assigned this Apr 14, 2023
@SebieF SebieF mentioned this issue May 30, 2023
@SebieF SebieF closed this as completed in #80 Jun 3, 2023