Tree-like Decision Distillation (CVPR 2021)
Note: This paper introduces a Tree-like Decision Distillation (TDD) strategy that enables student models to learn the teacher's hierarchical decision-making process through hierarchical decision constraints. Example code is in the folder examples/distillation/tdd.
Example configs are in the folder examples/kd/TDD/config.
Training a network begins with loading a config file; choose different config files to train different networks. The config files for training teacher networks are in the folder valina, and the config files for training student networks are in the folder tree.
cd examples/distillation/tdd
sh train_teacher.sh            # 1. train the teacher network
sh feature_preprocessing.sh    # 2. extract and preprocess teacher features
sh train_tdd.sh                # 3. train the student network with TDD
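The scripts above cover the full pipeline; the distillation objective they optimize combines soft-label knowledge distillation with constraints derived from the teacher's intermediate decisions. As a rough illustration only, the sketch below pairs standard temperature-scaled KD with a layer-wise feature-alignment term whose per-layer weights could encode coarse-to-fine decision constraints. The function names (`kd_loss`, `tdd_loss`), the MSE alignment form, and the `layer_weights`/`alpha` parameters are assumptions for illustration, not the paper's actual loss.

```python
import numpy as np

def softmax(x, T=1.0):
    # Temperature-scaled softmax, numerically stabilized.
    z = x / T
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def kd_loss(student_logits, teacher_logits, T=4.0):
    # Standard soft-label KL divergence, scaled by T^2 as in Hinton-style KD.
    p = softmax(teacher_logits, T)
    q = softmax(student_logits, T)
    kl = (p * (np.log(p + 1e-12) - np.log(q + 1e-12))).sum(axis=-1).mean()
    return float(kl * T * T)

def tdd_loss(student_feats, teacher_feats, student_logits, teacher_logits,
             layer_weights=None, alpha=1.0):
    # Hypothetical TDD-style objective: KD on logits plus per-layer feature
    # alignment; layer_weights is a placeholder for weighting earlier
    # (coarser) vs. later (finer) decision layers.
    if layer_weights is None:
        layer_weights = [1.0] * len(student_feats)
    align = sum(w * float(np.mean((s - t) ** 2))
                for w, s, t in zip(layer_weights, student_feats, teacher_feats))
    return kd_loss(student_logits, teacher_logits) + alpha * align
```

With identical student and teacher features and logits, both terms vanish and the loss is zero; shifting all logits by a constant leaves the softmax (and hence the KD term) unchanged.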
@inproceedings{song2021tree,
  title={Tree-like decision distillation},
  author={Song, Jie and Zhang, Haofei and Wang, Xinchao and Xue, Mengqi and Chen, Ying and Sun, Li and Tao, Dacheng and Song, Mingli},
  booktitle={Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition},
  pages={13488--13497},
  year={2021}
}