Implementation of the paper "Ensemble Modeling with Contrastive Knowledge Distillation for Sequential Recommendation".
Train on the Beauty, Toys, or MovieLens-1M datasets:

python main.py --template train_bert --dataset_code beauty
python main.py --template train_bert --dataset_code toys
python main.py --template train_bert --dataset_code ml-1m
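Per the title, the method trains an ensemble of sequential recommenders and aligns them with a contrastive knowledge-distillation objective. As a rough, hypothetical sketch only (this is not the repository's code, and all function and variable names here are invented), an InfoNCE-style loss between two ensemble members' sequence representations could look like:

```python
import numpy as np

def contrastive_distill_loss(h_a, h_b, temperature=0.1):
    """Symmetric InfoNCE-style loss between two encoders' embeddings.

    h_a, h_b: [batch, dim] representations of the SAME batch of user
    sequences produced by two ensemble members; matching rows are
    treated as positive pairs, all other rows as in-batch negatives.
    (Illustrative sketch; not the paper's exact objective.)
    """
    # L2-normalize so the dot product is cosine similarity
    h_a = h_a / np.linalg.norm(h_a, axis=1, keepdims=True)
    h_b = h_b / np.linalg.norm(h_b, axis=1, keepdims=True)
    logits = h_a @ h_b.T / temperature  # [batch, batch] similarity matrix

    def xent_diag(l):
        # cross-entropy where the positive sits on each row's diagonal
        l = l - l.max(axis=1, keepdims=True)  # numerical stability
        log_p = l - np.log(np.exp(l).sum(axis=1, keepdims=True))
        return -np.mean(np.diag(log_p))

    # distill in both directions (a -> b and b -> a)
    return 0.5 * (xent_diag(logits) + xent_diag(logits.T))
```

In a PyTorch training loop this term would be added to each member's next-item prediction loss so the encoders regularize one another.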
The training pipeline is built on https://github.com/jaywonchung/BERT4Rec-VAE-Pytorch. We thank the contributors for their work.
Please cite our paper if you find our code useful:
@inproceedings{EMKD,
  author    = {Hanwen Du and
               Huanhuan Yuan and
               Pengpeng Zhao and
               Fuzhen Zhuang and
               Guanfeng Liu and
               Lei Zhao and
               Victor S. Sheng},
  title     = {Ensemble Modeling with Contrastive Knowledge Distillation for Sequential Recommendation},
  booktitle = {SIGIR},
  year      = {2023}
}