Add typing to lightning.tuner #7117
Conversation
if TYPE_CHECKING:
    from pytorch_lightning import Trainer
This won't work. Use import pytorch_lightning as pl instead and annotate with 'pl.Trainer'.
@justusschock can you please mention the reason why it won't work?
Sphinx cannot deal with the forward references you provide here as a result of the optional import.
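For context, the pattern being suggested looks roughly like the sketch below: import the package unconditionally under the pl alias and keep the annotations as 'pl.'-prefixed strings, so the forward references are not tied to an optional import. The helper function here is purely illustrative, not code from this PR.

```python
import pytorch_lightning as pl  # imported at module level, not under TYPE_CHECKING


def _configure_tuner(trainer: 'pl.Trainer', model: 'pl.LightningModule') -> None:
    # The string annotations resolve against the runtime ``pl`` alias, so they are
    # not forward references to an optionally imported name.
    ...
```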
pytorch_lightning/tuner/tuning.py
Outdated
if TYPE_CHECKING:
    import pytorch_lightning as pl
This has to be imported without the if TYPE_CHECKING guard.
if TYPE_CHECKING:
    import pytorch_lightning as pl
This has to be imported without the if TYPE_CHECKING guard.
Codecov Report
@@            Coverage Diff            @@
##           master    #7117     +/-   ##
=========================================
- Coverage      92%      88%        -4%
=========================================
  Files         196      196
  Lines       12828    12830         +2
=========================================
- Hits        11825    11265       -560
- Misses       1003     1565       +562
pytorch_lightning/tuner/lr_finder.py
Outdated
@@ -311,7 +312,7 @@ def func():

         return func

-    def plot(self, suggest: bool = False, show: bool = False):
+    def plot(self, suggest: bool = False, show: bool = False) -> 'plt.Figure':
'plt.Figure' is causing PEP8 to fail. ./pytorch_lightning/tuner/lr_finder.py:315: [F821] undefined name 'plt'
What should be done here? @justusschock
Is pyplot imported there?
pyplot is imported inside the function
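One common way to satisfy the linter in this situation, shown as a sketch below, is to import pyplot under TYPE_CHECKING solely for the annotation while leaving the runtime import inside the method, so matplotlib stays an optional dependency. The class is a simplified stand-in, not the actual _LRFinder implementation, and this is not necessarily what the PR ended up doing.

```python
from typing import TYPE_CHECKING

if TYPE_CHECKING:
    import matplotlib.pyplot as plt  # evaluated only by type checkers, but pyflakes counts the name as defined, which typically clears F821


class LRFinderSketch:
    """Simplified stand-in used only to illustrate the annotation pattern."""

    def plot(self, suggest: bool = False, show: bool = False) -> 'plt.Figure':
        # matplotlib is imported lazily so it remains an optional dependency at runtime
        import matplotlib.pyplot as plt

        fig, ax = plt.subplots()
        if show:
            plt.show()
        return fig
```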
@aniketmaurya any update here? Can we resolve conflicts? :]
I'll resolve the conflicts in a few hours.
Closed the PR by mistake 😅
pytorch_lightning/tuner/lr_finder.py
Outdated
@@ -60,6 +63,174 @@ def _determine_lr_attr_name(trainer: 'pl.Trainer', model: 'pl.LightningModule')
    )


def lr_find(
Could you move it to the old place? It's easier for us to review the changes :)
done @awaelchli
pytorch_lightning/tuner/lr_finder.py
Outdated
@@ -346,7 +346,10 @@ def on_batch_start(self, trainer, pl_module):

        self.lrs.append(trainer.lr_schedulers[0]['scheduler'].lr[0])

-    def on_train_batch_end(self, trainer, pl_module, outputs, batch, batch_idx, dataloader_idx):
+    def on_train_batch_end(
+        self, trainer: 'pl.Trainer', pl_module: 'pl.LightningModule', outputs, batch, batch_idx: Optional[int],
Can't figure out the type for outputs and batch.
outputs: Optional[Union[Tensor, Dict[str, Any]]]
batch: Any
outputs is what the user can return from training_step, and it can be None (signaling a skipped step), a tensor (just the loss), or a dict with multiple items and at least a key named loss.
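Put together, the suggested hints would make the callback signature look roughly like the sketch below. The class and its body are illustrative only; batch_idx follows the diff shown above, and the dataloader_idx annotation is an assumption rather than something stated in this thread.

```python
from typing import Any, Dict, Optional, Union

import torch
import pytorch_lightning as pl


class LRFinderCallbackSketch(pl.Callback):
    """Illustrative callback mirroring the reviewer's type suggestion."""

    def on_train_batch_end(
        self,
        trainer: 'pl.Trainer',
        pl_module: 'pl.LightningModule',
        outputs: Optional[Union[torch.Tensor, Dict[str, Any]]],  # None = skipped step, Tensor = loss, dict = at least a 'loss' key
        batch: Any,
        batch_idx: Optional[int],
        dataloader_idx: int,  # assumed int; not specified in the thread
    ) -> None:
        ...
```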
@aniketmaurya mind checking the last issues?
Should I also annotate method variables, @Borda?
Yes, if possible; and if not, please add an inline annotation to skip the line from checking...
Ok cool. Also, there are a few false positives with mypy, especially where the lr scheduler is used. I'll add that line to skip as well.
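For reference, the two approaches discussed here look roughly like the sketch below. The scheduler access is a generic stand-in for the lr-scheduler spot that triggers the false positive, and the ignore error code is illustrative rather than copied from the PR.

```python
from typing import List

import torch

params = [torch.nn.Parameter(torch.zeros(1))]
optimizer = torch.optim.SGD(params, lr=0.1)
scheduler = torch.optim.lr_scheduler.LambdaLR(optimizer, lambda epoch: 1.0)

# Inline variable annotation, as asked about above:
lrs: List[float] = []

# Where mypy flags a false positive on a single line, that line can be skipped:
lrs.append(scheduler.get_last_lr()[0])  # type: ignore[attr-defined]
```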
This pull request has been automatically marked as stale because it has not had recent activity. It will be closed in 7 days if no further activity occurs. If you need further help see our docs: https://pytorch-lightning.readthedocs.io/en/latest/generated/CONTRIBUTING.html#pull-request or ask the assistance of a core contributor here or on Slack. Thank you for your contributions.
This pull request is going to be closed. Please feel free to reopen it or create a new one from the actual master.
What does this PR do?
Part of #7037.
Before submitting
PR review
Anyone in the community is free to review the PR once the tests have passed.
Before you start reviewing, make sure you have read the Review guidelines. In short, see the following bullet-list:
Did you have fun?
Make sure you had fun coding 🙃