Can ExpDecay of learning rate start at some intermediate step? #1815
Comments
This is exactly the kind of thing a library like ParameterSchedulers.jl was designed to handle. See some of the surrounding discussion at https://github.com/FluxML/Flux.jl/search?q=scheduler&type=issues.
Using ParameterSchedulers.jl this behavior can be obtained with a … That said, a keyword argument is more convenient for these kinds of common cases. Since the PR is already there and it is very simple, I'm OK with having it.
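A minimal sketch of how a warmup-then-decay schedule might look with ParameterSchedulers.jl, assuming its `Sequence` and `Exp` schedule types and the iteration interface (the comment above does not spell out which scheduler it meant, and constructor signatures vary between versions):

```julia
using ParameterSchedulers

# Assumed API: `Sequence(schedule => nsteps, ...)` chains schedules, a bare
# number acts as a constant schedule, and `Exp(start, decay)` multiplies the
# starting value by `decay` each step. Warmup length and decay factor are
# illustrative values, not from the discussion.
warmup = 10                                  # epochs at a constant LR
schedule = Sequence(1e-3 => warmup,          # constant 1e-3 for `warmup` epochs
                    Exp(1e-3, 0.9) => 90)    # then exponential decay

for (epoch, lr) in enumerate(schedule)
    epoch > 100 && break
    # set the optimiser's learning rate to `lr` and run one training epoch here
end
```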
1816: ExpDecay start step r=DhairyaLGandhi a=cossio

Adds an option to `ExpDecay` which specifies the step at which the exponential decay of the learning rate starts. Fixes #1815.

### PR Checklist
- [x] Tests are added
- [ ] Entry in NEWS.md
- [x] Documentation, if applicable
- [ ] API changes require approval from a committer (different from the author, if applicable)

Co-authored-by: cossio <[email protected]>
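A hedged usage sketch of the merged option, assuming it lands as a trailing `start` argument on `ExpDecay` (the exact position, name, and default are not quoted above, so check the merged docstring):

```julia
using Flux

# Assumed signature: ExpDecay(eta, decay, decay_step, clip, start).
# Keep the learning rate fixed for the first 5000 steps, then multiply it by
# 0.1 every 1000 steps, never dropping below the 1e-4 clip. ExpDecay is
# composed with another optimiser via Optimiser, as in the Flux docs.
decay = ExpDecay(0.001, 0.1, 1000, 1e-4, 5000)
opt = Flux.Optimise.Optimiser(decay, ADAM())
```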
I would like to start training a model with a constant learning rate (LR) first, and then after some epochs, start decaying the LR exponentially. I think this is not an uncommon practice.
It would be nice to add this option to `ExpDecay`.
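For reference, a hand-rolled sketch of the pattern the issue asks for, without either the new option or ParameterSchedulers.jl; the warmup length, decay factor, and choice of `ADAM` are illustrative, not from the issue:

```julia
using Flux

opt = ADAM(1e-3)                 # constant LR during warmup
warmup, decay = 20, 0.95         # illustrative values

for epoch in 1:100
    # Flux.train!(loss, Flux.params(model), data, opt)   # one epoch of training
    epoch > warmup && (opt.eta *= decay)   # start exponential decay after warmup
end
```

This relies on the optimiser's learning-rate field being mutable, which is why a built-in `start` step (or a scheduler library) is the tidier solution.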