# Nostalgic Adam

Code and supplements for "Nostalgic Adam: Weighting more of the past gradients when designing the adaptive learning rate"

by Haiwen Huang, Chang Wang, and Bin Dong ([paper PDF](http://bicmr.pku.edu.cn/~dongbin/Publications/NosAdam.pdf))
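
The optimizer keeps Adam's first-moment estimate but replaces the exponential moving average of squared gradients with a weighted average whose weights decay slowly (hyper-harmonically in the NosAdam-HH variant), so older gradients retain more influence on the adaptive learning rate. Below is a minimal sketch of one such update, written from the paper's description; the hyperparameter names, the `1/sqrt(t)` step-size decay, and the state layout are illustrative and not taken from this repository's code:

```python
import torch

def nosadam_hh_step(param, grad, state, lr=1e-3, beta1=0.9, gamma=0.1, eps=1e-8):
    """One NosAdam-HH update on a single tensor (illustrative sketch, not the repo's code)."""
    t = state.get("t", 0) + 1
    b_t = t ** (-gamma)                               # hyper-harmonic weight for step t
    B_prev = state.get("B", 0.0)                      # sum of weights up to step t-1
    B_t = B_prev + b_t
    m = state.get("m", torch.zeros_like(param))
    v = state.get("v", torch.zeros_like(param))

    m = beta1 * m + (1 - beta1) * grad                # first moment, as in Adam
    v = (B_prev / B_t) * v + (b_t / B_t) * grad ** 2  # "nostalgic" second moment
    state.update(t=t, B=B_t, m=m, v=v)

    step_size = lr / (t ** 0.5)                       # decaying step size, as in the paper's analysis
    param.data.add_(-step_size * m / (v.sqrt() + eps))

# Toy usage: minimize a quadratic with a single parameter vector.
w, state = torch.zeros(3), {}
for _ in range(200):
    g = 2 * (w - torch.tensor([1.0, -2.0, 0.5]))      # gradient of ||w - target||^2
    nosadam_hh_step(w, g, state)
```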

Dependencies: Python >= 3.5, PyTorch >= 0.4.0
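
A hypothetical usage sketch, assuming the repository exposes a `torch.optim`-style optimizer class named `NosAdam` in a module `nosadam.py` (the import path, class name, and the `gamma` keyword are assumptions for illustration, not the repository's confirmed API):

```python
import torch
import torch.nn as nn

# Assumption: the repo provides a torch.optim-compatible NosAdam class in nosadam.py.
from nosadam import NosAdam

model = nn.Linear(10, 1)
criterion = nn.MSELoss()
optimizer = NosAdam(model.parameters(), lr=1e-3, gamma=0.1)  # gamma: hypothetical keyword

for _ in range(100):
    x, y = torch.randn(32, 10), torch.randn(32, 1)
    optimizer.zero_grad()
    loss = criterion(model(x), y)
    loss.backward()
    optimizer.step()
```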

An introduction to the paper in Chinese: https://zhuanlan.zhihu.com/p/65625686

If you find this code useful, please cite:

@inproceedings{ijcai2019-355,
  title     = {Nostalgic Adam: Weighting More of the Past Gradients When Designing the Adaptive Learning Rate},
  author    = {Huang, Haiwen and Wang, Chang and Dong, Bin},
  booktitle = {Proceedings of the Twenty-Eighth International Joint Conference on
               Artificial Intelligence, {IJCAI-19}},
  publisher = {International Joint Conferences on Artificial Intelligence Organization},             
  pages     = {2556--2562},
  year      = {2019},
  month     = {7},
  doi       = {10.24963/ijcai.2019/355},
  url       = {https://doi.org/10.24963/ijcai.2019/355},
}