Fix bug when using IterBasedRunner with 'val' workflow #542
Conversation
Codecov Report
@@            Coverage Diff            @@
##           master     #542     +/-  ##
========================================
  Coverage   79.87%   79.88%
========================================
  Files         107      107
  Lines        6093     6095      +2
  Branches      987      988      +1
========================================
+ Hits         4867     4869      +2
  Misses       1094     1094
  Partials      132      132
mmcls/models/classifiers/base.py
  optimizer (:obj:`torch.optim.Optimizer` | dict): The optimizer of
      runner is passed to ``train_step()``. This argument is unused
      and reserved.
+ optimizer (:obj:`torch.optim.Optimizer` | dict | optional): The
Well, the right usage should be

- optimizer (:obj:`torch.optim.Optimizer` | dict | optional): The
+ optimizer (:obj:`torch.optim.Optimizer` | dict, optional): The

because `optional` is not a type, but a kind of annotation.
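The convention under discussion can be illustrated with a small, hypothetical function (not the actual `mmcls` code): in Google-style docstrings, `optional` is appended after the type with a comma to mark a parameter with a default value, rather than being joined to the type list with `|`.

```python
def train_step(data, optimizer=None, **kwargs):
    """Perform one training step.

    Args:
        data (dict): The output of the dataloader.
        optimizer (:obj:`torch.optim.Optimizer` | dict, optional): The
            optimizer of the runner. This argument is unused and reserved.

    Returns:
        dict: The input data, returned unchanged in this sketch.
    """
    # ``optimizer`` is unused and reserved, so ``optional`` describes
    # the default value rather than naming an extra accepted type.
    return data
```

Tools such as `sphinx.ext.napoleon` parse the `type, optional` form directly, which is why reviewers prefer it over `| optional`.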
LGTM
…n-mmlab#542)
* add kwargs and default of optimizer in train_step and val_step
* update docstring
* update docstring
* update optional annotation
Motivation
Fix a bug when using IterBasedRunner with the 'val' workflow; refer to #535.
Align `train_step` and `val_step` with EpochBasedRunner and IterBasedRunner in mmcv.
Modification
Add a default value `None` for `optimizer` in `train_step` and `val_step`.
Add `**kwargs` to both methods.
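The modification above can be sketched as follows. This is a minimal illustration, not the actual `mmcls` implementation: the class and forward pass are stand-ins, and the point is only the signature change, i.e. `train_step`/`val_step` accept `optimizer=None` and `**kwargs` so a runner that calls `val_step` without an optimizer argument (as in the 'val' workflow) no longer raises a `TypeError`.

```python
class BaseClassifierSketch:
    """Stand-in for a classifier, showing only the changed signatures."""

    def forward(self, img, **kwargs):
        # Placeholder forward pass: just report the batch size.
        return {"loss": 0.0, "num_samples": len(img)}

    def train_step(self, data, optimizer=None, **kwargs):
        # ``optimizer`` is unused and reserved; defaulting it to None lets
        # callers that do not pass an optimizer still invoke this method.
        losses = self.forward(**data)
        return dict(loss=losses["loss"], num_samples=losses["num_samples"])

    def val_step(self, data, optimizer=None, **kwargs):
        # Mirrors train_step, so the same call pattern works for both
        # training and validation workflows.
        return self.train_step(data, optimizer, **kwargs)


model = BaseClassifierSketch()
# A 'val'-workflow-style call: no optimizer is passed.
out = model.val_step({"img": [1, 2, 3]})
```

Accepting `**kwargs` additionally keeps both methods forward-compatible with runners that pass extra keyword arguments.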
Checklist
Before PR:
After PR: