Replies: 7 comments 10 replies
-
@grimoire So just to understand: the plan is to have a tool that can build MMDeploy + SDK for various platforms and with various backends, to allow pre-built releases of MMDeploy that can be easily downloaded by the user?

Initially, I thought a single build script would be used, similar to how it is done with ppl.cv. This might be the easiest solution, but it requires the user to build locally, which is what I ended up doing myself. If we want to ship pre-built packages, my main concern is how to handle building for other platforms, e.g. aarch64, as I assume we would cross-compile the packages on a non-aarch64 build server? I have previously tried cross-compiling ppl.cv, MMDeploy and the dependencies, but I did not manage to get it working properly because of the way the dependencies are detected. Instead, I wrote a bash script that builds and installs all dependencies (including spdlog and ppl.cv) and the MMDeploy SDK on the Jetson itself. Then I was able to cross-compile my custom inference tools on my dev machine using the Jetson rootfs containing the pre-built MMDeploy SDK libs.

By the way, how do we handle misc. dependencies that are not installed by the MMDeploy pre-built package, e.g. mmcv-full and other Python packages that mmdeploy needs? Can they be installed automatically as part of the installation of the MMDeploy wheel?
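For what it's worth, the on-device build route can be scripted fairly compactly. Below is a minimal sketch, assuming ppl.cv and mmdeploy are already cloned under a working directory on the Jetson; the CMake options, paths and job count are assumptions and may need adjusting per JetPack version (spdlog and other deps are omitted here).

```python
#!/usr/bin/env python3
"""Rough on-device build driver for Jetson (a sketch, not official MMDeploy
tooling): build ppl.cv first, then configure and build the MMDeploy SDK
against it. Paths and CMake options are assumptions."""
import subprocess
from pathlib import Path

WORK = Path.home() / "workspace"  # assumed: ppl.cv and mmdeploy are cloned here

def run(cmd, cwd):
    print("+", " ".join(map(str, cmd)))
    subprocess.run(list(map(str, cmd)), cwd=str(cwd), check=True)

def build_pplcv():
    src = WORK / "ppl.cv"
    run(["./build.sh", "cuda"], cwd=src)       # ppl.cv's own CUDA build script
    return src / "cuda-build" / "install"      # install prefix produced by build.sh

def build_sdk(pplcv_install):
    build_dir = WORK / "mmdeploy" / "build"
    build_dir.mkdir(parents=True, exist_ok=True)
    run(["cmake", "..",
         "-DMMDEPLOY_BUILD_SDK=ON",            # build the C/C++ SDK
         "-DMMDEPLOY_TARGET_DEVICES=cuda",     # target device
         "-DMMDEPLOY_TARGET_BACKENDS=trt",     # TensorRT backend on Jetson
         f"-Dpplcv_DIR={pplcv_install}/lib/cmake/ppl"],
        cwd=build_dir)
    run(["cmake", "--build", ".", "--", "-j4"], cwd=build_dir)
    run(["cmake", "--install", "."], cwd=build_dir)

if __name__ == "__main__":
    build_sdk(build_pplcv())
```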
-
Yes. We will install the dependencies we need when the mmdeploy wheel package is installed.
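If it helps, declaring them in the wheel's metadata would let pip resolve them automatically. A rough sketch (the package list and versions here are assumptions, not the final dependency set):

```python
# setup.py (sketch) -- runtime deps are declared so `pip install mmdeploy`
# pulls them in automatically. The listed packages/versions are assumptions.
from setuptools import setup, find_packages

setup(
    name="mmdeploy",
    version="0.0.0",                        # placeholder
    packages=find_packages(),
    package_data={"mmdeploy": ["lib/*"]},   # ship pre-built native libs inside the wheel
    install_requires=[
        "mmcv-full>=1.4.0",                 # may still need a -f find-links URL for the CUDA build
        "onnx>=1.8.0",
        "numpy",
    ],
)
```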
-
We need to build it on an aarch64 machine and then release it, just as users of some other pre-built packages need to choose the right platform when downloading.
-
@tehkillerbee
-
Yes, because some users run into trouble when they deploy to certain devices. At the moment they need to install all mmdeploy dependencies step by step according to the docs, which is a little hard for some users. I think these pre-built packages will help more users get started with mmdeploy more smoothly.
-
Hi @grimoire, I want to pick
-
@grimoire I could probably assist with the build script, since I have already worked on something like that.
-
We plan to create some pre-built packages to ease the installation of MMDeploy. Here are some details:

**Environment and Platform**

(matrix of supported environment × backend combinations)

We will support these combinations. More systems and backends would be added in the future (if needed).
**Plan**

Model Converter and SDK would be built as two separate packages.

**Model Converter**

Most of the code in the model converter is Python, so we plan to align the pre-built package with PyTorch.

- Decoupling `export IR` and `convert backend model` is a good choice, since users might want to deploy their model on an edge device (Jetson, for example) and it is not easy to install some third-party libraries on these devices. Instead, we can export ONNX on a PC and convert the TensorRT model on the Jetson device with NO third-party dependency.
- The built libraries would be placed in `mmdeploy/lib` so they can be packed in the same wheel with the Python code.
- The tools would be moved under the `mmdeploy` package so we can use them like `python -m mmdeploy deploy ...` (a sketch of such an entry point follows this list).
- The wheel could then be installed with `pip install mmdeploy=={version} -f https://<url_to_wheels>`.
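As referenced above, here is a sketch of what the `python -m mmdeploy ...` entry point could look like; the subcommand names and the assumption that the tools live under `mmdeploy.tools` are hypothetical:

```python
# mmdeploy/__main__.py (sketch) -- enables `python -m mmdeploy <subcommand> ...`.
# The module paths below are hypothetical; the existing scripts under tools/
# would first have to move into the mmdeploy package.
import runpy
import sys

COMMANDS = {
    "deploy": "mmdeploy.tools.deploy",
    "onnx2tensorrt": "mmdeploy.tools.onnx2tensorrt",
}

def main():
    if len(sys.argv) < 2 or sys.argv[1] not in COMMANDS:
        print("usage: python -m mmdeploy {%s} [args...]" % "|".join(COMMANDS))
        return 1
    module = COMMANDS[sys.argv[1]]
    sys.argv = sys.argv[1:]                        # let the tool parse its own argv
    runpy.run_module(module, run_name="__main__")  # run the tool as a script
    return 0

if __name__ == "__main__":
    sys.exit(main())
```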
**SDK**

The behavior of the SDK looks like an inference engine, so the package would be aligned with the TensorRT pre-built package.

- A `tar` or `zip` package, which includes headers, libraries, the Python API, and anything else that might be necessary. It would be published after each release of MMDeploy.
- The Python API (`mmdeploy_python`) will be another wheel, placed in `<package_dir>/python`, just like TensorRT. Users can optionally install it (a usage sketch follows this list).
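For context, using that wheel might look roughly like this; the `Detector` constructor and return values are assumed from the demo scripts and may differ between releases, so treat this as an assumption:

```python
# Sketch of using the SDK Python API (mmdeploy_python). The Detector signature
# shown here is assumed from the demo scripts and may change across versions.
import cv2
from mmdeploy_python import Detector

img = cv2.imread("demo.jpg")                               # BGR numpy array
detector = Detector("/path/to/sdk_model_dir", "cuda", 0)   # model dir, device name, device id
bboxes, labels, _ = detector(img)
for (left, top, right, bottom, score), label in zip(bboxes, labels):
    if score > 0.3:
        print(label, (left, top, right, bottom), score)
```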
I have made a private repo (mmdeploy_builder.py) to do the model converter build. It is not well tested and is probably buggy. SDK build tools have not yet been developed.
There are two related PRs, which put the built libraries in `mmdeploy/lib` so we can pack them together with the Python code. I plan to close #273 since it is an old PR with a lot of conflicts with the current master; the work would be redone in #347.

There is still a lot to do:

- Decouple `export IR` and `convert backend model` (remove the third-party dependency and make sure we can convert a TensorRT model on the Jetson platform without installing PyTorch; see the sketch at the end of this comment).

@tehkillerbee @PeterH0323 Please feel free to choose the tasks (or is there any task I missed?). Let's work together to make MMDeploy better!
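On the last point: the device-side conversion could, in principle, rely only on the TensorRT Python package that ships with JetPack. A minimal sketch of that PyTorch-free step (file names are placeholders):

```python
# Sketch: build a TensorRT engine from an ONNX file exported on the PC, using
# only the tensorrt package bundled with JetPack -- no PyTorch required.
import tensorrt as trt

LOGGER = trt.Logger(trt.Logger.WARNING)

def onnx_to_engine(onnx_path, engine_path, workspace_gb=2):
    builder = trt.Builder(LOGGER)
    network = builder.create_network(
        1 << int(trt.NetworkDefinitionCreationFlag.EXPLICIT_BATCH))
    parser = trt.OnnxParser(network, LOGGER)
    with open(onnx_path, "rb") as f:
        if not parser.parse(f.read()):
            raise RuntimeError("\n".join(
                str(parser.get_error(i)) for i in range(parser.num_errors)))
    config = builder.create_builder_config()
    config.max_workspace_size = workspace_gb << 30  # deprecated on TRT >= 8.4
    # On TensorRT < 8.0, use builder.build_engine(network, config).serialize().
    serialized = builder.build_serialized_network(network, config)
    with open(engine_path, "wb") as f:
        f.write(serialized)

onnx_to_engine("end2end.onnx", "end2end.engine")
```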