# PyTorch implementation of DistillGCN

Paper: Distilling Knowledge From Graph Convolutional Networks, CVPR'20

## Method Overview

## Dependencies

Main packages:

- PyTorch = 1.1.0
- DGL = 1.4.0

See the requirements file for more information about how to install the dependencies.
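
As a quick sanity check that the expected versions are installed, a minimal sketch (assuming both packages import cleanly in the current environment):

```python
# Illustrative sanity check of the installed PyTorch and DGL versions.
# The exact pinned versions come from the requirements file in this repository.
import torch
import dgl

print("PyTorch:", torch.__version__)
print("DGL:", dgl.__version__)
```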

## Training and evaluation

The main.py file contains the code for training the teacher model and for training the student model with the LSP (local structure preserving) module.
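
For reference, the core idea of the LSP module is to match, for every node, the distribution of kernel similarities between that node and its neighbors, computed from the teacher and student embeddings. The sketch below only illustrates that idea and is not the repository's implementation; the `edges` format, the RBF kernel choice, and all names are assumptions.

```python
# Illustrative sketch of a local-structure-preserving (LSP) distillation loss.
# Assumptions (not taken from this repo): node embeddings are dense tensors of
# shape [N, D], `edges` is a list of (center, neighbor) index pairs, and an RBF
# kernel measures similarity between a node and each of its neighbors.
import torch
import torch.nn.functional as F


def lsp_loss(student_h, teacher_h, edges, sigma=1.0):
    """KL divergence between teacher and student local-structure distributions."""
    src = torch.tensor([e[0] for e in edges])  # center node of each edge
    dst = torch.tensor([e[1] for e in edges])  # neighbor node of each edge

    def local_structure(h):
        # RBF similarity between each center node and its neighbors.
        dist2 = ((h[src] - h[dst]) ** 2).sum(dim=1)
        sim = torch.exp(-dist2 / (2 * sigma ** 2))
        # Softmax-normalize the similarities over each center node's neighborhood.
        probs = torch.zeros_like(sim)
        for node in src.unique():
            mask = src == node
            probs[mask] = F.softmax(sim[mask], dim=0)
        return probs

    p_teacher = local_structure(teacher_h)
    p_student = local_structure(student_h)
    # Pointwise KL terms between teacher and student distributions, averaged over edges.
    return (p_teacher * (torch.log(p_teacher + 1e-12)
                         - torch.log(p_student + 1e-12))).mean()
```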

Early stopping is used when training both the teacher model and the student model.
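
A minimal sketch of a patience-based early stopping criterion of the kind referred to here; the patience value and the validation metric are assumptions, not the repository's settings.

```python
# Illustrative patience-based early stopping helper (the repository's actual
# criterion and patience value may differ).
class EarlyStopper:
    def __init__(self, patience=50):
        self.patience = patience
        self.best_score = None
        self.counter = 0

    def step(self, val_score):
        """Return True when no improvement has been seen for `patience` calls."""
        if self.best_score is None or val_score > self.best_score:
            self.best_score = val_score
            self.counter = 0
        else:
            self.counter += 1
        return self.counter >= self.patience
```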

## Cite

```
@inproceedings{yang2020distilling,
  title={Distilling Knowledge From Graph Convolutional Networks},
  author={Yang, Yiding and Qiu, Jiayan and Song, Mingli and Tao, Dacheng and Wang, Xinchao},
  booktitle={Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition},
  pages={7074--7083},
  year={2020}
}
```

## License

DistillGCN is released under the MIT license. Please see the LICENSE file for more information.