Log LR using LearningRateMonitor even when LR Scheduler is not defined. #9786

Merged: SkafteNicki merged 19 commits into Lightning-AI:master from VirajBagal:feature/7468_log_initial_lr_without_scheduler on Oct 14, 2021.
Conversation
VirajBagal requested review from awaelchli, Borda, carmocca, justusschock, kaushikb11, rohitgr7, SeanNaren, tchaton and williamFalcon as code owners on October 1, 2021 at 10:27.
Codecov Report

@@            Coverage Diff            @@
##           master    #9786    +/-   ##
========================================
- Coverage      93%      89%       -4%
========================================
  Files         178      179        +1
  Lines       15652    15778      +126
========================================
- Hits        14508    14009      -499
- Misses       1144     1769      +625
ananthsub reviewed on Oct 1, 2021.
tchaton reviewed on Oct 12, 2021.
tchaton approved these changes on Oct 13, 2021:
LGTM!
awaelchli approved these changes on Oct 13, 2021.
SkafteNicki approved these changes on Oct 14, 2021.
rohitgr7 added a commit to Tshimanga/pytorch-lightning that referenced this pull request on Oct 18, 2021:

Log LR using LearningRateMonitor even when LR Scheduler is not defined. (Lightning-AI#9786)
* LR logging works even with no lr scheduler, wrote few extra tests as well
* updated changelog
* modified code as suggested by DeepSource
* added helper functions
* opt with no scheduler
* rename
* chlog
* update test
Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
Co-authored-by: rohitgr7 <[email protected]>
What does this PR do?
This PR extends the LearningRateMonitor callback to log the learning rate of each optimizer even when the user does not specify any LR schedulers for training. When no LR schedulers are configured, trainer.optimizers is used instead of trainer.lr_schedulers to derive the optimizer names, and the callback then iterates over the optimizers to log their learning rates (see the sketch below). Extra tests verify that the learning rates are logged under the correct names even when no schedulers are defined.
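A minimal sketch of that fallback logic, assuming a Trainer with `lr_schedulers` (list of scheduler configs) and `optimizers` (list of optimizers) attributes as in PyTorch Lightning 1.x; the helper name `_get_lr_stats` and the exact key format are illustrative, not the actual implementation:

```python
def _get_lr_stats(trainer):
    """Illustrative sketch: collect the current LR of every optimizer.

    If LR schedulers are configured, read the optimizers they wrap
    (the pre-existing behaviour); otherwise fall back to
    trainer.optimizers directly, which is what this PR adds.
    """
    if trainer.lr_schedulers:
        optimizers = [cfg["scheduler"].optimizer for cfg in trainer.lr_schedulers]
    else:
        optimizers = trainer.optimizers  # no schedulers: use the optimizers

    stats = {}
    for opt_idx, optimizer in enumerate(optimizers):
        name = f"lr-{type(optimizer).__name__}"
        if len(optimizers) > 1:
            name += f"-{opt_idx}"  # disambiguate multiple optimizers
        for pg_idx, group in enumerate(optimizer.param_groups):
            # one entry per parameter group; each group carries its own "lr"
            key = name if len(optimizer.param_groups) == 1 else f"{name}/pg{pg_idx + 1}"
            stats[key] = group["lr"]
    return stats
```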
Motivation
Currently, the LR monitor callback only takes effect if the user specifies LR schedulers for training. However, one might want to record the initial learning rate used for each of their optimizer's parameter groups even without any LR schedulers. This PR addresses that gap.
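As a usage illustration, the callback can now be attached even when `configure_optimizers` returns only an optimizer. `PlainModel` below is a hypothetical toy module; `Trainer` and `LearningRateMonitor` are the real PyTorch Lightning APIs:

```python
import torch
from torch import nn
import pytorch_lightning as pl


class PlainModel(pl.LightningModule):
    """Hypothetical toy model that returns an optimizer but no LR scheduler."""

    def __init__(self):
        super().__init__()
        self.layer = nn.Linear(32, 2)

    def training_step(self, batch, batch_idx):
        return self.layer(batch).sum()

    def configure_optimizers(self):
        # No scheduler returned: before this PR, LearningRateMonitor logged nothing.
        return torch.optim.Adam(self.parameters(), lr=1e-3)


trainer = pl.Trainer(
    max_epochs=1,
    callbacks=[pl.callbacks.LearningRateMonitor(logging_interval="epoch")],
)
# trainer.fit(PlainModel(), train_dataloaders=...)  # now logs the Adam optimizer's LR
```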
Fixes #7468
Does your PR introduce any breaking changes?
No