run a model (e.g. padim) with export to onnx enabled #502
Applying the OpenVINO optimization will also generate an ONNX model along the way. It is enabled in the config:

    openvino:
      apply: true

I am creating PR #509 to make this export more generic. You could also pull from the top of that branch.
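For context, the snippet above sits under an `optimization` section of the model's config file. A sketch of the surrounding structure, assuming the config layout of anomalib at the time of this issue (key names may differ in your version):

```yaml
# Fragment of a model config (e.g. for Padim); only the export-related
# section is shown. Setting apply: true triggers the OpenVINO export,
# which produces an ONNX model as an intermediate artifact.
optimization:
  openvino:
    apply: true
```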
Up-to-date instructions for exporting a trained model can be found on this page in our documentation. I'm closing this issue now, but please feel free to re-open if you have any additional questions.
Hello @djdameln, how do you convert the algorithm (e.g. Padim) to ONNX for model deployment?