[TOC]
Please prepare the following materials:

Index | Material | Example |
---|---|---|
1 | Hardware, BSP and cross-compilation toolchain | K210 BSP, xpack-riscv-none-embed-gcc-8.3.0-1.2 |
2 | RT-AK | RT-AK |
3 | AI model | ./rt_ai_tools/Models/mnist.tflite |
4 | K210 official tools | NNCase, KFlash |
- Hardware: Kendryte KD233 or BOYAYB-DKA01, or any other K210-based embedded development board. Or contact us: [email protected]
- BSP: see the website
- Cross-compilation toolchain (Windows): xpack-riscv-none-embed-gcc-8.3.0-1.2-win32-x64.zip, version v8.3.0-1.2
Please clone RT-AK:

```shell
$ git clone https://github.com/RT-Thread/RT-AK.git edge-ai
```
The K210 official tool supports three kinds of AI model: TFLite, Caffe, and ONNX. Transform the AI model into a kmodel; the example model is located at RT-AK/rt_ai_tools/Models/mnist.tflite.
- NNCase: NNCase is located at RT-AK/rt_ai_tools/platforms/plugin_k210/k_tools. You can also download it from GitHub.
Please run aitools, located at edge-ai/RT-AK/tools.
During the RT-AK run:

- The K210 plug-in will be pulled from GitHub into RT-AK/rt_ai_tools/platforms automatically
- The AI model will be integrated into the BSP; the application code for the AI model is not included, but an example is presented at the end of this document
- In RT-AK/rt_ai_tools/platforms/plugin_k210, `<model_name>.kmodel` and `convert_report.txt` will be generated:
  - `<model_name>.kmodel`: the AI model converted for the K210
  - `convert_report.txt`: the log of the AI model transformation
```shell
# Basic command
python aitools.py --project=<your_project_path> --model=<your_model_path> --model_name=<your_model_name> --platform=k210 --clear

# Example
D:\Project\edge-ai\RT-AK\rt_ai_tools>python aitools.py --project=D:\Project\K210_Demo\k210_rthread_bsp --model=.\Models\mnist.tflite --model_name=mnist --platform=k210 --embed_gcc=D:\Project\k210_third_tools\xpack-riscv-none-embed-gcc-8.3.0-1.2\bin --dataset=.\platforms\plugin_k210\datasets\mnist_datasets
```
Examples of successful RT-AK runs:

```shell
# No model quantization; set --inference_type=float
$ python aitools.py --project=<your_project_path> --model=<your_model_path> --platform=k210 --inference_type=float

# No model quantization; also set the path of the cross-compilation toolchain
$ python aitools.py --project=<your_project_path> --model=<your_model_path> --platform=k210 --embed_gcc=<your_RISCV-GNU-Compiler_path> --inference_type=float

# uint8 model quantization, accelerated by the KPU; the dataset consists of images
$ python aitools.py --project=<your_project_path> --model=<your_model_path> --platform=k210 --embed_gcc=<your_RISCV-GNU-Compiler_path> --dataset=<your_val_dataset>

# uint8 model quantization, accelerated by the KPU; the dataset is not images
$ python aitools.py --project=<your_project_path> --model=<your_model_path> --platform=k210 --embed_gcc=<your_RISCV-GNU-Compiler_path> --dataset=<your_val_dataset> --dataset_format=raw

# Example
$ python aitools.py --project="D:\Project\k210_val" --model="./Models/facelandmark.tflite" --model_name=facelandmark --platform=k210 --embed_gcc="D:\Project\k210_third_tools\xpack-riscv-none-embed-gcc-8.3.0-1.2\bin" --dataset="./platforms/plugin_k210/datasets/images"
```
Parameter | Description |
---|---|
--project | OS + BSP project folder; must be specified by the user |
--model | AI model path. Default is ./Models/keras_mnist.h5 |
--model_name | Name of the AI model after transformation. Default is network |
--platform | Target platform, such as stm32 or k210. Default is example |
--embed_gcc | Path of the cross-compilation toolchain |
--dataset | Dataset used for AI model quantization |
Compilation:

```shell
scons -j 6
```

`rtthread.bin` will be generated in the project folder.
Download:

Using K-Flash, you can download `rtthread.bin` onto the test board.

We have provided an open-source test project, which you can download from here.
System initialization:
- System clock initialization
Loading and running the AI model:
- Registering the AI model
- Finding the AI model
- Initializing the AI model
- Running inference
- Getting the results
```c
// main.c

/* Set the CPU clock */
sysctl_clock_enable(SYSCTL_CLOCK_AI);                      // system clock initialization
...
/* AI model inference */
mymodel = rt_ai_find(MY_MODEL_NAME);                       // find the AI model
if (rt_ai_init(mymodel, (rt_ai_buffer_t *)IMG9_CHW) != 0)  // initialize the AI model
...
if (rt_ai_run(mymodel, ai_done, NULL) != 0)                // run inference
...
output = (float *)rt_ai_output(mymodel, 0);                // get the results

/* Find the class with the highest score */
for (int i = 0; i < 10; i++)
{
    // printf("pred: %d, scores: %.2f%%\n", i, output[i] * 100);
    if (output[i] > scores && output[i] > 0.2)
    {
        prediction = i;
        scores = output[i];
    }
}
```
How to change the input image:

In the applications folder, you only need to modify line 18 and line 51.

The MNIST demo can be downloaded from GitHub.