Extend the Learning Rate Monitor callback to log the initial lr for optimizers #7468
Labels
callback, feature, good first issue, help wanted
🚀 Feature
Log the initial learning rate of each optimizer
Motivation
Currently, the LR monitor callback only takes effect if the user specifies LR schedulers for training. However, one may want to record the initial learning rate used for their optimizer's parameter groups even without any LR schedulers.
Pitch
We could log this once at train start, following the logic here: https://github.com/PyTorchLightning/pytorch-lightning/blob/f6fe715e73cc87bfc106329f9103564097515379/pytorch_lightning/callbacks/lr_monitor.py#L138-L156
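For illustration, here is a minimal sketch of the idea as a standalone callback (rather than the actual change to `LearningRateMonitor`); the metric naming scheme `lr-<OptimizerName>-<idx>/pg<i>` is an assumption and would need to be aligned with the naming logic linked above:

```python
from pytorch_lightning.callbacks import Callback


class InitialLRLogger(Callback):
    """Sketch: log each optimizer's initial learning rate(s) once, at the start of training."""

    def on_train_start(self, trainer, pl_module):
        # Nothing to do if no logger is attached to the trainer.
        if trainer.logger is None:
            return

        stats = {}
        for opt_idx, optimizer in enumerate(trainer.optimizers):
            # Assumed naming scheme; the real fix should reuse the monitor's own name logic.
            name = f"lr-{optimizer.__class__.__name__}-{opt_idx}"
            for pg_idx, param_group in enumerate(optimizer.param_groups):
                # One entry per parameter group, suffixed only when there are several.
                suffix = f"/pg{pg_idx}" if len(optimizer.param_groups) > 1 else ""
                stats[f"{name}{suffix}"] = param_group["lr"]

        trainer.logger.log_metrics(stats, step=trainer.global_step)
```

Usage would be `Trainer(callbacks=[InitialLRLogger()])`; in the actual feature this would simply run inside the existing `LearningRateMonitor.on_train_start`, so the initial values are logged even when no scheduler is configured.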
cc @simran2905
Alternatives
Additional context