
Tree-like Decision Distillation (CVPR 2021)

Note: This paper introduces a Tree-like Decision Distillation (TDD) strategy that enables student models to learn the teacher's hierarchical decision-making process through layer-wise decision constraints. Example code is in the folder examples/distillation/tdd.
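To make the idea concrete, here is a minimal sketch, under our own assumptions, of what a layer-wise decision constraint can look like: at each matched layer, a per-layer "decision" is formed by an auxiliary classifier, and the student's decision distribution is pulled toward the teacher's with a softened KL divergence. This is an illustration, not the authors' implementation; the auxiliary heads (`student_heads`, `teacher_heads`) and the temperature `T` are hypothetical.

```python
# Hedged sketch of a layer-wise decision constraint (not the paper's exact loss).
import torch
import torch.nn.functional as F

def tree_decision_loss(student_feats, teacher_feats, student_heads, teacher_heads, T=4.0):
    """student_feats / teacher_feats: lists of 4D intermediate feature maps,
    ordered shallow-to-deep (coarse decisions first, fine decisions last).
    *_heads: hypothetical auxiliary classifiers (e.g. nn.Linear), one per layer."""
    loss = 0.0
    for s_f, t_f, s_head, t_head in zip(student_feats, teacher_feats,
                                        student_heads, teacher_heads):
        # Global-average-pool each feature map, then form per-layer decisions.
        s_logits = s_head(s_f.mean(dim=(2, 3)))
        with torch.no_grad():
            t_logits = t_head(t_f.mean(dim=(2, 3)))  # teacher is frozen
        # Soften with temperature T and match the decision distributions.
        loss = loss + F.kl_div(
            F.log_softmax(s_logits / T, dim=1),
            F.softmax(t_logits / T, dim=1),
            reduction="batchmean",
        ) * (T * T)
    return loss
```

Summing this layer-wise term with the usual task loss gives the student a coarse-to-fine trail of teacher decisions to follow, rather than only the final output.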

Training Configs

Example configs are in the folder examples/kd/TDD/config. Training begins by loading a config file; choose a different config file to train a different network. Configs for training teacher networks are in the folder valina; configs for training student networks are in the folder tree.
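As a rough illustration of the config-driven workflow, a run might load one of these files and read its fields before building the model. The loader, the file name, and the keys below are all hypothetical; the repo's actual schema may differ.

```python
# Hypothetical config loading; not the repo's actual loader or schema.
import yaml

def load_config(path):
    with open(path) as f:
        return yaml.safe_load(f)

# Pick a teacher config from valina/ or a student config from tree/.
cfg = load_config("examples/kd/TDD/config/valina/teacher.yaml")  # hypothetical file name
print(cfg.get("model"), cfg.get("epochs"))  # hypothetical keys
```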

Quick Start

1. Enter the folder

cd examples/distillation/tdd

2. Train the teacher model

sh train_teacher.sh

3. Feature preprocessing (a hedged sketch of this step follows the list)

sh feature_preprocessing.sh

4. Tree-like decision distillation

sh train_tdd.sh
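For step 3, a common pattern is to cache the frozen teacher's intermediate features to disk so the tree-like decision targets can be built before distillation. The sketch below shows one way to do that with forward hooks; the layer names, the output path, and the loader format are assumptions, not what the repo's script actually does.

```python
# Hedged sketch of feature preprocessing via forward hooks
# (an assumption about what feature_preprocessing.sh does, not its contents).
import torch

@torch.no_grad()
def cache_teacher_features(teacher, loader, layer_names, out_path="teacher_feats.pt"):
    teacher.eval()
    feats = {name: [] for name in layer_names}
    hooks = []
    for name in layer_names:
        module = dict(teacher.named_modules())[name]
        # Capture each layer's output on every forward pass.
        hooks.append(module.register_forward_hook(
            lambda m, i, o, key=name: feats[key].append(o.detach().cpu())))
    for images, _ in loader:  # assumes (image, label) batches
        teacher(images)
    for h in hooks:
        h.remove()
    torch.save({k: torch.cat(v) for k, v in feats.items()}, out_path)
```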

Citation

@inproceedings{song2021tree,
  title={Tree-like decision distillation},
  author={Song, Jie and Zhang, Haofei and Wang, Xinchao and Xue, Mengqi and Chen, Ying and Sun, Li and Tao, Dacheng and Song, Mingli},
  booktitle={Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition},
  pages={13488--13497},
  year={2021}
}