Commit

Merge pull request #241 from DefTruth/dev
fix(RVM): fixed the TNN preprocess for RVM (#240)
DefTruth authored Mar 17, 2022
2 parents bbe6732 + 641e56b commit 54a4369
Showing 31 changed files with 54 additions and 55 deletions.
47 changes: 22 additions & 25 deletions README.md
@@ -46,7 +46,7 @@

* **Simple and User-friendly.** Simple and consistent syntax like **lite::cv::Type::Class**, see [examples](#lite.ai.toolkit-Examples-for-Lite.AI.ToolKit).
* **Minimum Dependencies.** Only **OpenCV** and **ONNXRuntime** are required by default, see [build](#lite.ai.toolkit-Build-Lite.AI.ToolKit).
* **Lots of Algorithm Modules.** Contains 10+ modules with **[80+](https://github.com/DefTruth/lite.ai.toolkit/tree/main/docs/hub/lite.ai.toolkit.hub.onnx.md)** AI models and **[500+](https://github.com/DefTruth/lite.ai.toolkit/tree/main/docs/hub/lite.ai.toolkit.hub.onnx.md)** weights now.

## Citations 🎉🎉

@@ -60,8 +60,15 @@ Consider to cite it as follows if you use **Lite.Ai.ToolKit** in your projects.
year={2021}
}
```
## About Training 🤓👀
A high-level training and evaluation toolkit for face landmarks detection is available at [torchlm](https://github.com/DefTruth/torchlm).

## Downloads & RoadMap ✅

<div id="lite.ai.toolkit-RoadMap"></div>

![](docs/resources/lite.ai.toolkit-roadmap-v0.1.png)

## Downloads ✅
Some prebuilt lite.ai.toolkit libs for MacOS(x64) and Linux(x64) are available; you can download them from the release links. Prebuilt libs for Windows(x64) and Android are coming soon. Please see [issues#48](https://github.com/DefTruth/lite.ai.toolkit/issues/48) for more details of the prebuilt plan, and refer to [releases](https://github.com/DefTruth/lite.ai.toolkit/releases) for more available prebuilt libs.

* [x] [lite0.1.1-osx10.15.x-ocv4.5.2-ffmpeg4.2.2-onnxruntime1.8.1.zip](https://github.com/DefTruth/lite.ai.toolkit/releases/download/v0.1.1/lite0.1.1-osx10.15.x-ocv4.5.2-ffmpeg4.2.2-onnxruntime1.8.1.zip)
@@ -90,7 +97,6 @@ add_executable(lite_yolov5 examples/test_lite_yolov5.cpp)
target_link_libraries(lite_yolov5 ${TOOLKIT_LIBS} ${OpenCV_LIBS})
```


## Contents 📖💡
* [Core Features](#lite.ai.toolkit-Core-Features)
* [Quick Start](#lite.ai.toolkit-Quick-Start)
@@ -128,20 +134,13 @@ static void test_default()
delete yolov5;
}
```
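
Since the hunk above only shows the tail of `test_default()`, here is a minimal sketch of the full quick-start flow. It assumes the `lite::cv::detection::YoloV5` class, a `detect(mat, boxes)` method, and the `lite::utils::draw_boxes_inplace` helper consistent with the snippets elsewhere in this README; the paths are placeholders.

```c++
#include "lite/lite.h"

static void test_default()
{
  // hypothetical paths; adjust to your local layout
  std::string onnx_path = "../hub/onnx/cv/yolov5s.onnx";
  std::string test_img_path = "../examples/lite/resources/test_lite_yolov5_1.jpg";
  std::string save_img_path = "../logs/test_lite_yolov5_1.jpg";

  auto *yolov5 = new lite::cv::detection::YoloV5(onnx_path);

  std::vector<lite::types::Boxf> detected_boxes;
  cv::Mat img_bgr = cv::imread(test_img_path);
  yolov5->detect(img_bgr, detected_boxes);                   // run detection
  lite::utils::draw_boxes_inplace(img_bgr, detected_boxes);  // draw results in place
  cv::imwrite(save_img_path, img_bgr);

  delete yolov5;
}
```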

The output is:
<!----
<div align='center'>
<img src='logs/test_lite_yolov5_1.jpg' height="256px">
<img src='logs/test_lite_yolov5_2.jpg' height="256px">
</div>

## 2. RoadMap 👏👋
<div id="lite.ai.toolkit-RoadMap"></div>

![](docs/resources/lite.ai.toolkit-roadmap-v0.1.png)

<img src="docs/resources/scrfd-mgmatting-nanodetplus.jpg" height="250px" width="750px" >
</div>
---->

## 3. Important Updates!!
## 2. Important Updates 🆕
<div id="lite.ai.toolkit-Important-Updates"></div>

|Date|Model|C++|Paper|Code|Awesome|Type|
@@ -156,10 +155,8 @@ The output is:
|【2021/09/20】|[RobustVideoMatting](https://github.com/PeterL1n/RobustVideoMatting)|[[link](https://github.com/DefTruth/lite.ai.toolkit/blob/main/examples/lite/cv/test_lite_rvm.cpp)]|[[WACV 2022](https://arxiv.org/abs/2108.11515)]|[[code](https://github.com/PeterL1n/RobustVideoMatting)]|![](https://img.shields.io/github/stars/PeterL1n/RobustVideoMatting.svg?style=social)| matting |
|【2021/09/02】|[YOLOP](https://github.com/hustvl/YOLOP)|[[link](https://github.com/DefTruth/lite.ai.toolkit/blob/main/examples/lite/cv/test_lite_yolop.cpp)]|[[arXiv 2021](https://arxiv.org/abs/2108.11250)]|[[code](https://github.com/hustvl/YOLOP)]|![](https://img.shields.io/github/stars/hustvl/YOLOP.svg?style=social)| detection |

![](docs/resources/scrfd-mgmatting-nanodetplus.jpg)


## 4. Supported Models Matrix
## 3. Supported Models Matrix
<div id="lite.ai.toolkit-Supported-Models-Matrix"></div>

* / = not supported now.
@@ -245,7 +242,7 @@ The output is:
|[SubPixelCNN](https://github.com/niazwazir/SUB_PIXEL_CNN)|234K|*resolution*|[demo](https://github.com/DefTruth/lite.ai.toolkit/blob/main/examples/lite/cv/test_lite_subpixel_cnn.cpp)||| / |||✔️|✔️||


## 5. Build Docs.
## 4. Build Docs.
<div id="lite.ai.toolkit-Build-MacOS"></div>
<div id="lite.ai.toolkit-Build-Lite.AI.ToolKit"></div>

@@ -424,7 +421,7 @@ To link `lite.ai.toolkit` shared lib. You need to make sure that `OpenCV` and `o
</details>


## 6. Model Zoo.
## 5. Model Zoo.

<div id="lite.ai.toolkit-Model-Zoo"></div>
<div id="lite.ai.toolkit-2"></div>
@@ -546,7 +543,7 @@ auto *yolox = new lite::cv::detection::YoloX("yolox_nano.onnx"); // 3.5Mb only
</details>


## 7. Examples.
## 6. Examples.

<div id="lite.ai.toolkit-Examples-for-Lite.AI.ToolKit"></div>

@@ -1069,14 +1066,14 @@ More classes for style transfer (neural style transfer, others)
auto *transfer = new lite::cv::style::FastStyleTransfer(onnx_path); // 6.4Mb only
```
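
For context, a hedged usage sketch for the style-transfer class above, assuming a `detect(mat, style_content)` method and a `lite::types::StyleContent` result struct (with `mat` and `flag` fields) analogous to the other modules in this toolkit; the model and image paths are placeholders.

```c++
#include "lite/lite.h"

static void test_fast_style_transfer()
{
  // hypothetical paths; adjust to your local layout
  std::string onnx_path = "../hub/onnx/cv/style-candy-8.onnx";
  std::string test_img_path = "../examples/lite/resources/test_lite_fast_style_transfer.jpg";
  std::string save_img_path = "../logs/test_lite_fast_style_transfer_candy.jpg";

  auto *transfer = new lite::cv::style::FastStyleTransfer(onnx_path); // 6.4Mb only

  lite::types::StyleContent style_content;
  cv::Mat img_bgr = cv::imread(test_img_path);
  transfer->detect(img_bgr, style_content);                  // run style transfer

  if (style_content.flag) cv::imwrite(save_img_path, style_content.mat);
  delete transfer;
}
```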

## 8. License.
## 7. License.

<div id="lite.ai.toolkit-License"></div>

The code of [Lite.Ai.ToolKit](#lite.ai.toolkit-Introduction) is released under the GPL-3.0 License.


## 9. References.
## 8. References.

<div id="lite.ai.toolkit-References"></div>

@@ -1123,7 +1120,7 @@ Many thanks to these following projects. All the Lite.AI.ToolKit's models are so
</details>


## 10. Compilation Options.
## 9. Compilation Options.

In addition, [MNN](https://github.com/alibaba/MNN), [NCNN](https://github.com/Tencent/ncnn) and [TNN](https://github.com/Tencent/TNN) support for some models will be added in the future, but due to operator compatibility and some other reasons, it is impossible to ensure that every model supported by [ONNXRuntime C++](https://github.com/microsoft/onnxruntime) can also run through [MNN](https://github.com/alibaba/MNN), [NCNN](https://github.com/Tencent/ncnn) and [TNN](https://github.com/Tencent/TNN). So, if you want to use all the models supported by this repo and don't mind a performance gap of *1~2ms*, just use [ONNXRuntime](https://github.com/microsoft/onnxruntime) as the default inference engine for this repo. However, you can follow the steps below if you want to build with [MNN](https://github.com/alibaba/MNN), [NCNN](https://github.com/Tencent/ncnn) or [TNN](https://github.com/Tencent/TNN) support.

@@ -1143,7 +1140,7 @@ auto *nanodet = new lite::mnn::cv::detection::NanoDet(mnn_path);
auto *nanodet = new lite::tnn::cv::detection::NanoDet(proto_path, model_path);
auto *nanodet = new lite::ncnn::cv::detection::NanoDet(param_path, bin_path);
```
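
A hedged sketch of driving one of the backend-specific classes above, assuming the MNN/TNN/NCNN variants expose the same `detect(mat, boxes)` interface as the default ONNXRuntime classes (only the constructor arguments differ) and that the toolkit was built with MNN support enabled as described above; the paths are placeholders.

```c++
#include "lite/lite.h"

static void test_nanodet_mnn()
{
  // hypothetical model/image paths; requires a build with MNN support enabled
  std::string mnn_path = "../hub/mnn/cv/nanodet_m.mnn";
  std::string test_img_path = "../examples/lite/resources/test_lite_nanodet.jpg";
  std::string save_img_path = "../logs/test_lite_nanodet_mnn.jpg";

  auto *nanodet = new lite::mnn::cv::detection::NanoDet(mnn_path);

  std::vector<lite::types::Boxf> detected_boxes;
  cv::Mat img_bgr = cv::imread(test_img_path);
  nanodet->detect(img_bgr, detected_boxes);                  // same call as the default version
  lite::utils::draw_boxes_inplace(img_bgr, detected_boxes);
  cv::imwrite(save_img_path, img_bgr);

  delete nanodet;
}
```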
## 11. Contribute
## 10. Contribute
<div id="lite.ai.toolkit-Contribute"></div>

How to add your own models and become a contributor? For specific steps, please refer to [CONTRIBUTING.zh.md](https://github.com/DefTruth/lite.ai.toolkit/issues/191). If you like this project, please ❤️ consider giving it a ⭐️🌟 star; that is the simplest way to support me.
38 changes: 17 additions & 21 deletions README.zh.md
@@ -63,7 +63,12 @@
}
```

## Download Prebuilt Libs ✅
## About Training 🤓👀
A training and evaluation toolkit for face landmarks detection has been open-sourced and can be installed with a single pip command; it is available at [torchlm](https://github.com/DefTruth/torchlm).

## Prebuilt Libs & RoadMap ✅
![](docs/resources/lite.ai.toolkit-roadmap-v0.1.png)

Currently, some prebuilt lite.ai.toolkit dynamic libraries for MacOS(x64) and Linux(x64) are available and can be downloaded directly from the links below. Prebuilt libs for Windows(x64) and Android will also be released soon. Please see [issues#48](https://github.com/DefTruth/lite.ai.toolkit/issues/48) for more details, and check [releases](https://github.com/DefTruth/lite.ai.toolkit/releases) for more downloadable prebuilt libs.

* [x] [lite0.1.1-osx10.15.x-ocv4.5.2-ffmpeg4.2.2-onnxruntime1.8.1.zip](https://github.com/DefTruth/lite.ai.toolkit/releases/download/v0.1.1/lite0.1.1-osx10.15.x-ocv4.5.2-ffmpeg4.2.2-onnxruntime1.8.1.zip)
@@ -130,17 +135,8 @@ static void test_default()
}
```

The output is:
<div align='center'>
  <img src='logs/test_lite_yolov5_1.jpg' height="256px">
  <img src='logs/test_lite_yolov5_2.jpg' height="256px">
</div>

## 2. RoadMap 👏👋
![](docs/resources/lite.ai.toolkit-roadmap-v0.1.png)


## 3. Important Updates !!
## 2. Important Updates 🆕
<div id="lite.ai.toolkit-Important-Updates"></div>

|Date|Model|C++|Paper|Code|Awesome|Type|
@@ -155,11 +151,11 @@ static void test_default()
|【2021/09/20】|[RobustVideoMatting](https://github.com/PeterL1n/RobustVideoMatting)|[[link](https://github.com/DefTruth/lite.ai.toolkit/blob/main/examples/lite/cv/test_lite_rvm.cpp)]|[[WACV 2022](https://arxiv.org/abs/2108.11515)]|[[code](https://github.com/PeterL1n/RobustVideoMatting)]|![](https://img.shields.io/github/stars/PeterL1n/RobustVideoMatting.svg?style=social)| matting |
|【2021/09/02】|[YOLOP](https://github.com/hustvl/YOLOP)|[[link](https://github.com/DefTruth/lite.ai.toolkit/blob/main/examples/lite/cv/test_lite_yolop.cpp)]|[[arXiv 2021](https://arxiv.org/abs/2108.11250)]|[[code](https://github.com/hustvl/YOLOP)]|![](https://img.shields.io/github/stars/hustvl/YOLOP.svg?style=social)| detection |


<!---
![](docs/resources/scrfd-mgmatting-nanodetplus.jpg)
--->


## 4. Supported Models Matrix
## 3. Supported Models Matrix
<div id="lite.ai.toolkit-Supported-Models-Matrix"></div>

* / = not supported now.
@@ -246,7 +242,7 @@ static void test_default()



## 5. Build Docs
## 4. Build Docs
<div id="lite.ai.toolkit-Build-MacOS"></div>
<div id="lite.ai.toolkit-Build-Lite.AI.ToolKit"></div>

@@ -423,7 +419,7 @@ Default Version Detected Boxes Num: 5
</details>


## 6. Model Zoo
## 5. Model Zoo
<div id="lite.ai.toolkit-2"></div>
<div id="lite.ai.toolkit-Model-Zoo"></div>

@@ -543,7 +539,7 @@ auto *yolox = new lite::cv::detection::YoloX("yolox_nano.onnx"); // 3.5Mb only
</details>


## 7. Examples
## 6. Examples

<div id="lite.ai.toolkit-Examples-for-Lite.AI.ToolKit"></div>

@@ -1065,14 +1061,14 @@ static void test_default()
auto *transfer = new lite::cv::style::FastStyleTransfer(onnx_path); // 6.4Mb only
```

## 8. License
## 7. License

<div id="lite.ai.toolkit-License"></div>

The code of [Lite.Ai.ToolKit](#lite.ai.toolkit-Introduction) is released under the GPL-3.0 License.


## 9. References
## 8. References

<div id="lite.ai.toolkit-References"></div>

@@ -1119,7 +1115,7 @@ auto *transfer = new lite::cv::style::FastStyleTransfer(onnx_path); // 6.4Mb onl
</details>


## 10. Compilation Options
## 9. Compilation Options
Support for some models via [MNN](https://github.com/alibaba/MNN), [NCNN](https://github.com/Tencent/ncnn) and [TNN](https://github.com/Tencent/TNN) will be added in the future, but due to operator compatibility and other reasons, there is no guarantee that every model supported by [ONNXRuntime C++](https://github.com/microsoft/onnxruntime) will run under [MNN](https://github.com/alibaba/MNN), [NCNN](https://github.com/Tencent/ncnn) or [TNN](https://github.com/Tencent/TNN). So, if you want to use all the models supported by this project and do not mind a *1~2ms* performance gap, please use the ONNXRuntime implementation; [ONNXRuntime](https://github.com/microsoft/onnxruntime) is the default inference engine of this repo. However, if you do want to build the Lite.Ai.ToolKit dynamic library with [MNN](https://github.com/alibaba/MNN), [NCNN](https://github.com/Tencent/ncnn) or [TNN](https://github.com/Tencent/TNN) support, you can follow the steps below.

*`build.sh`中添加`DENABLE_MNN=ON``DENABLE_NCNN=ON``DENABLE_TNN=ON`,比如
Expand All @@ -1139,7 +1135,7 @@ auto *nanodet = new lite::tnn::cv::detection::NanoDet(proto_path, model_path);
auto *nanodet = new lite::ncnn::cv::detection::NanoDet(param_path, bin_path);
```

## 11. How to Add Your Own Models
## 10. How to Add Your Own Models
<div id="lite.ai.toolkit-Contribute"></div>

How to add your own models and become a contributor? For specific steps, please refer to [CONTRIBUTING.zh.md](https://github.com/DefTruth/lite.ai.toolkit/issues/191). If you like this project, please ❤️ consider giving it a ⭐️🌟 star; that is the simplest way to support it.
2 changes: 1 addition & 1 deletion examples/lite/cv/test_lite_rvm.cpp
@@ -100,7 +100,7 @@ static void test_lite()
test_default();
test_onnxruntime();
test_mnn();
test_ncnn();
// test_ncnn();
test_tnn();
}

18 changes: 12 additions & 6 deletions lite/tnn/cv/tnn_rvm.cpp
@@ -178,17 +178,18 @@ void TNNRobustVideoMatting::initialize_context()
context_is_initialized = true;
}

void TNNRobustVideoMatting::transform(const cv::Mat &mat)
void TNNRobustVideoMatting::transform(const cv::Mat &mat_rs)
{
cv::Mat canvas;
cv::resize(mat, canvas, cv::Size(input_width, input_height));
cv::cvtColor(canvas, canvas, cv::COLOR_BGR2RGB);
// cv::Mat canvas;
// cv::resize(mat, canvas, cv::Size(input_width, input_height));
// cv::cvtColor(canvas, canvas, cv::COLOR_BGR2RGB);
// reference: https://github.com/DefTruth/lite.ai.toolkit/issues/240
// push into src_mat
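// note (assumption): tnn::Mat appears to reference the external buffer rather than copy it,
// so the resized & color-converted mat must be created by the caller (detect) and must
// outlive this call; building it in a local canvas here is what issue #240 reported.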
src_mat = std::make_shared<tnn::Mat>(
input_device_type,
tnn::N8UC3,
input_shapes.at("src"),
(void *) canvas.data
(void *) mat_rs.data
);
if (!src_mat->GetData())
{
@@ -206,7 +207,12 @@ void TNNRobustVideoMatting::detect(const cv::Mat &mat, types::MattingContent &co
if (!context_is_initialized) return;

// 1. make input tensor
this->transform(mat);
cv::Mat mat_rs;
// resize mat outside 'transform' to prevent memory overflow
// reference: https://github.com/DefTruth/lite.ai.toolkit/issues/240
cv::resize(mat, mat_rs, cv::Size(input_width, input_height));
cv::cvtColor(mat_rs, mat_rs, cv::COLOR_BGR2RGB);
this->transform(mat_rs);
// 2. set input_mat
tnn::MatConvertParam src_cvt_param, ctx_cvt_param;
src_cvt_param.scale = scale_vals;
2 changes: 1 addition & 1 deletion lite/tnn/cv/tnn_rvm.h
@@ -86,7 +86,7 @@ namespace tnncv
void print_debug_string(); // debug information

private:
void transform(const cv::Mat &mat); //
void transform(const cv::Mat &mat_rs); //

void initialize_instance(); // init net & instance

Empty file removed pylitex/pylitex/utils/__init__.py
Empty file.
File renamed without changes. (23 files)
2 changes: 1 addition & 1 deletion pylitex/setup.py → python/setup.py
@@ -13,7 +13,7 @@ def get_long_description():


setuptools.setup(
name="pylitex",
name="litex",
version="0.0.1",
author="DefTruth",
author_email="[email protected]",
File renamed without changes.
