Unable to init a TorchInferencer for EfficientAD model #2155
-
Hello everyone! I recently came across this library and trained an EfficientAD model by following the code snippets provided in the repository README, but I am having trouble initializing a TorchInferencer with the trained model.
Error:
The TorchInferencer requires a .pt or .pth model, while training produces a .ckpt file, so I loaded the .ckpt file with torch and re-saved it as a .pt file.
The inferencer looks for the "metadata" and "model" keys in the .pt file; since those keys are not present, loading fails.
I then tried saving the state_dict under the key "model" in a new checkpoint file, but that didn't work either: I still get the error about the missing "metadata" key.
I did see a few comments discussing this issue, but no solution was provided.
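To show what I mean, here is a rough illustration of the key mismatch (dummy dicts only; the real files hold model weights and export metadata, this just mirrors the top-level structure implied by the error):

```python
# What a Lightning training run writes into the .ckpt (weights under "state_dict"):
lightning_ckpt = {"state_dict": {}}

# What the TorchInferencer error suggests the .pt file should contain:
exported_pt = {"model": object(), "metadata": {}}

# Re-saving the .ckpt contents as a .pt file does not add the expected keys:
missing = [k for k in ("model", "metadata") if k not in lightning_ckpt]
print(missing)  # → ['model', 'metadata']
```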
-
You have to export the model after training, like:
engine.export(model=model, export_type=ExportType.TORCH, export_root=os.path.join(root_folder, "export"))
-
Since many users run into this issue, maybe we could add a clearer error message for this case that explains what is wrong and what to do.
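Such a check could be sketched like this (a hypothetical helper, not the actual anomalib code; the key names come from the error discussed above):

```python
def validate_torch_checkpoint(checkpoint: dict) -> None:
    """Fail early with an actionable message when a raw Lightning .ckpt is passed."""
    missing = [k for k in ("model", "metadata") if k not in checkpoint]
    if missing:
        raise ValueError(
            f"Checkpoint is missing keys {missing}. This looks like a raw Lightning "
            ".ckpt file; export the trained model first, e.g. "
            "engine.export(model=model, export_type=ExportType.TORCH, ...), "
            "and point the inferencer at the exported .pt file."
        )

# A dict holding only "state_dict" (a renamed .ckpt) is rejected with guidance:
try:
    validate_torch_checkpoint({"state_dict": {}})
except ValueError as err:
    print(err)
```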