-
Hi, I am trying to set up mmdeploy on my Jetson AGX Xavier. Everything went fine, including the conversion step. But when I try to run inference with the command
I have already tried reinstalling and using the Python API, but the problem persists. Do you have any idea what the problem is and how to fix it?
Replies: 1 comment
-
Please read https://github.com/open-mmlab/mmdeploy/blob/master/docs/en/get_started.md

`build/bin/object_detection` is implemented on the MMDeploy Inference SDK. The SDK performs model inference using not only the backend engine file but also meta info.

How to get meta info? Pass `--dump-info` when converting an OpenMMLab model with `tools/deploy.py`. The generated meta info, as well as the backend engine file, is placed in the directory specified by `--work-dir`. Together they make up what we call an MMDeploy Model.
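A minimal sketch of that conversion step, assuming a TensorRT detection deploy config; the config, checkpoint, and image paths below are illustrative placeholders, not taken from the original post:

```shell
# Convert an OpenMMLab detector and dump the SDK meta info.
# Replace the config/checkpoint/image paths with your own; the deploy
# config shown is only an example from the mmdeploy config tree.
python tools/deploy.py \
    configs/mmdet/detection/detection_tensorrt_dynamic-320x320-1344x1344.py \
    path/to/model_config.py \
    path/to/checkpoint.pth \
    path/to/test_image.jpg \
    --work-dir mmdeploy_model \
    --device cuda \
    --dump-info
```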
After exporting the MMDeploy Model, pass the path specified by `--work-dir` to `build/bin/object_detection`.
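With `--dump-info`, the work dir should contain JSON meta-info files next to the backend engine file. Here is a small sanity-check sketch you can run before passing the directory to the SDK demo; the file names `deploy.json`, `pipeline.json`, and `detail.json` are an assumption based on the mmdeploy docs:

```python
from pathlib import Path

# Meta-info files that `--dump-info` is expected to generate alongside
# the backend engine file (an assumption based on the mmdeploy docs).
META_FILES = ("deploy.json", "pipeline.json", "detail.json")

def is_mmdeploy_model_dir(work_dir):
    """Return True if `work_dir` looks like a complete MMDeploy Model,
    i.e. it contains all the meta-info JSON files the SDK needs."""
    work_dir = Path(work_dir)
    return all((work_dir / name).is_file() for name in META_FILES)
```

If this returns False, the SDK demo will fail to load the model; re-run the conversion with `--dump-info`.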