
Add typing to lightning.tuner #7117

Closed
wants to merge 24 commits into from

Conversation

@aniketmaurya (Contributor) commented Apr 20, 2021

What does this PR do?

Part of #7037.

Before submitting

  • Was this discussed/approved via a GitHub issue? (not for typos and docs)
  • Did you read the contributor guideline, Pull Request section?
  • Did you make sure your PR does only one thing, instead of bundling different changes together?
  • Did you make sure to update the documentation with your changes? (if necessary)
  • Did you write any new necessary tests? (not for typos and docs)
  • Did you verify new and existing tests pass locally with your changes?
  • Did you update the CHANGELOG? (not for typos, docs, test updates, or internal minor changes/refactorings)

PR review

Anyone in the community is free to review the PR once the tests have passed.
Before you start reviewing, make sure you have read the Review guidelines. In short, check the following bullet list:

  • Is this pull request ready for review? (if not, please submit in draft mode)
  • Check that all items from Before submitting are resolved
  • Make sure the title is self-explanatory and the description concisely explains the PR
  • Add labels and milestones (and optionally projects) to the PR so it can be classified

Did you have fun?

Make sure you had fun coding 🙃

@aniketmaurya changed the title from "Add typing" to "Add typing to lightning.tuner" on Apr 20, 2021
pytorch_lightning/tuner/auto_gpu_select.py (Outdated)
Comment on lines 18 to 19:

if TYPE_CHECKING:
    from pytorch_lightning import Trainer

Member:
this won't work. Use import pytorch_lightning as pl instead and annotate with 'pl.Trainer'

Contributor Author (@aniketmaurya):

@justusschock can you please mention the reason why it won't work?

Member:

Sphinx cannot deal with the forward references you provide here as a result of the optional import.
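
A minimal sketch of the pattern the reviewer suggests (the function name below is hypothetical; only the import style and the 'pl.Trainer' string annotation come from the thread):

import pytorch_lightning as pl


def run_tuning_step(trainer: 'pl.Trainer') -> None:
    # 'pl.Trainer' resolves for both Sphinx and mypy because pl is a regular
    # top-level import rather than an optional TYPE_CHECKING-only import.
    ...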

pytorch_lightning/tuner/tuning.py (Outdated)
@aniketmaurya marked this pull request as ready for review on April 20, 2021, 16:20
Comment on lines 19 to 20:

if TYPE_CHECKING:
    import pytorch_lightning as pl

Member:

this has to be imported without the if TYPE_CHECKING guard

Comment on lines 18 to 19:

if TYPE_CHECKING:
    import pytorch_lightning as pl

Member:

this has to be imported without the if TYPE_CHECKING guard

codecov bot commented Apr 22, 2021

Codecov Report

Merging #7117 (432dd03) into master (a584196) will decrease coverage by 4%.
The diff coverage is 97%.

@@           Coverage Diff           @@
##           master   #7117    +/-   ##
=======================================
- Coverage      92%     88%    -4%     
=======================================
  Files         196     196            
  Lines       12828   12830     +2     
=======================================
- Hits        11825   11265   -560     
- Misses       1003    1565   +562     

@@ -311,7 +312,7 @@ def func():

return func

def plot(self, suggest: bool = False, show: bool = False):
def plot(self, suggest: bool = False, show: bool = False) -> 'plt.Figure':

Contributor Author (@aniketmaurya):

'plt.Figure' is causing PEP8 to fail:
./pytorch_lightning/tuner/lr_finder.py:315: [F821] undefined name 'plt'
What should be done here? @justusschock

Member:

is pyplot imported there?

Contributor Author (@aniketmaurya):

pyplot is imported inside the function
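
One common way to satisfy the F821 check while keeping the runtime import local is a sketch like the following (the class name is a hypothetical stand-in for _LRFinder, and this is not necessarily what the PR ended up doing; whether the docs build accepts the guarded import is a separate question):

from typing import TYPE_CHECKING

if TYPE_CHECKING:
    import matplotlib.pyplot as plt  # only needed to resolve the annotation below


class _LRFinderSketch:
    # hypothetical stand-in for the _LRFinder class in lr_finder.py
    def plot(self, suggest: bool = False, show: bool = False) -> 'plt.Figure':
        import matplotlib.pyplot as plt  # runtime import stays local, as in the PR
        fig, ax = plt.subplots()
        if show:
            plt.show()
        return fig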

@Borda (Member) commented May 4, 2021

@aniketmaurya any update here? can we resolve conflicts? :]

@aniketmaurya (Contributor Author):
@aniketmaurya any update here? can we resolve conflicts? :]

I'll resolve the conflicts in a few hours.

The mergify bot removed the "has conflicts" label on May 5, 2021.
@aniketmaurya reopened this on May 7, 2021.
@aniketmaurya (Contributor Author) commented May 7, 2021

Closed the PR by mistake 😅

@@ -60,6 +63,174 @@ def _determine_lr_attr_name(trainer: 'pl.Trainer', model: 'pl.LightningModule')
)


def lr_find(

Contributor:

could you move it to the old place. easier for us to review the changes :)

Contributor Author (@aniketmaurya):

done @awaelchli

@@ -346,7 +346,10 @@ def on_batch_start(self, trainer, pl_module):

self.lrs.append(trainer.lr_schedulers[0]['scheduler'].lr[0])

def on_train_batch_end(self, trainer, pl_module, outputs, batch, batch_idx, dataloader_idx):
def on_train_batch_end(
self, trainer: 'pl.Trainer', pl_module: 'pl.LightningModule', outputs, batch, batch_idx: Optional[int],

Contributor Author (@aniketmaurya):

Can't figure out the type for outputs and batch

Contributor:

outputs: Optional[Union[Tensor, Dict[str, Any]]]
batch: Any

outputs is what the user can return from training_step: it can be None (signaling that the step is skipped), a tensor (just the loss), or a dict with multiple items and at least a key named loss.
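
A sketch of how the hook signature could look with those annotations (the callback class name and any details beyond what the thread mentions are illustrative, not the PR's final code):

from typing import Any, Dict, Optional, Union

import pytorch_lightning as pl
from torch import Tensor


class _LRCallbackSketch(pl.Callback):
    # hypothetical stand-in for the tuner's internal callback
    def on_train_batch_end(
        self,
        trainer: 'pl.Trainer',
        pl_module: 'pl.LightningModule',
        outputs: Optional[Union[Tensor, Dict[str, Any]]],
        batch: Any,
        batch_idx: int,
        dataloader_idx: int,
    ) -> None:
        # outputs may be None (step skipped), a loss tensor, or a dict with a 'loss' key
        ...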

@Borda (Member) commented May 11, 2021

@aniketmaurya mind checking the last issues:

pytorch_lightning/tuner/auto_gpu_select.py:19: error: Function is missing a type annotation  [no-untyped-def]
pytorch_lightning/tuner/auto_gpu_select.py:40: error: Function is missing a return type annotation  [no-untyped-def]
pytorch_lightning/tuner/batch_size_scaling.py:42: error: Return value expected  [return-value]
pytorch_lightning/tuner/batch_size_scaling.py:254: error: Function is missing a type annotation for one or more arguments  [no-untyped-def]
pytorch_lightning/tuner/lr_finder.py:98: error: Need type annotation for 'results' (hint: "results: Dict[<type>, <type>] = ...")  [var-annotated]
pytorch_lightning/tuner/lr_finder.py:108: error: Function is missing a return type annotation  [no-untyped-def]
pytorch_lightning/tuner/lr_finder.py:144: error: Function is missing a return type annotation  [no-untyped-def]
pytorch_lightning/tuner/lr_finder.py:175: error: Function is missing a return type annotation  [no-untyped-def]
pytorch_lightning/tuner/lr_finder.py:210: error: Return value expected  [return-value]
pytorch_lightning/tuner/lr_finder.py:247: error: Cannot assign to a method  [assignment]
pytorch_lightning/tuner/lr_finder.py:302: error: Cannot assign to a method  [assignment]
pytorch_lightning/tuner/lr_finder.py:334: error: Need type annotation for 'losses' (hint: "losses: List[<type>] = ...")  [var-annotated]
pytorch_lightning/tuner/lr_finder.py:335: error: Need type annotation for 'lrs' (hint: "lrs: List[<type>] = ...")  [var-annotated]
pytorch_lightning/tuner/lr_finder.py:349: error: Value of type "Optional[List[Any]]" is not indexable  [index]
pytorch_lightning/tuner/lr_finder.py:418: error: Function is missing a return type annotation  [no-untyped-def]
pytorch_lightning/tuner/lr_finder.py:456: error: Function is missing a return type annotation  [no-untyped-def]
pytorch_lightning/tuner/tuning.py:54: error: Incompatible types in assignment (expression has type "Optional[_LRFinder]", target has type "Optional[int]")  [assignment]
pytorch_lightning/tuner/tuning.py:58: error: Incompatible return value type (got "Dict[str, Optional[int]]", expected "Dict[str, Union[int, _LRFinder, None]]")  [return-value]
pytorch_lightning/tuner/tuning.py:58: note: "Dict" is invariant -- see http://mypy.readthedocs.io/en/latest/common_issues.html#variance
pytorch_lightning/tuner/tuning.py:58: note: Consider using "Mapping" instead, which is covariant in the value type
pytorch_lightning/tuner/tuning.py:58: note: Perhaps you need a type annotation for "result"? Suggestion: "Dict[str, Union[int, _LRFinder, None]]"
pytorch_lightning/tuner/tuning.py:133: error: Incompatible return value type (got "Union[int, _LRFinder, None]", expected "Optional[int]")  [return-value]
pytorch_lightning/tuner/tuning.py:201: error: Incompatible return value type (got "Union[int, _LRFinder, None]", expected "Optional[_LRFinder]")  [return-value]
Found 20 errors in 4 files (checked 408 source files)
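
For the tuning.py:58 errors, mypy's own notes point at Dict value-type invariance; a small illustration of the fix it suggests (the names are simplified stand-ins, not the actual tuner code):

from typing import Dict, Union


class _LRFinder:
    """Simplified stand-in for pytorch_lightning.tuner.lr_finder._LRFinder."""


def tune() -> Dict[str, Union[int, _LRFinder, None]]:
    # Dict is invariant in its value type, so a variable inferred as
    # Dict[str, Optional[int]] cannot be returned where the wider union is
    # expected; annotating the variable up front (as mypy's note suggests)
    # makes the assignments and the return type line up.
    result: Dict[str, Union[int, _LRFinder, None]] = {}
    result['batch_size'] = 64
    result['lr_finder'] = _LRFinder()
    return result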

@aniketmaurya (Contributor Author) commented May 11, 2021

pytorch_lightning/tuner/lr_finder.py:335: error: Need type annotation for 'lrs' (hint: "lrs: List[<type>] = ...") [var-annotated]

Should I annotate method variables also, @Borda?

@Borda (Member) commented May 11, 2021

Should I annotate method variables also, @Borda?

yes, if possible, and if not pls add an inline annotation to skip the line from checking...

@aniketmaurya (Contributor Author):

Should I annotate method variables also, @Borda?

yes, if possible, and if not pls add an inline annotation to skip the line from checking...

Ok cool. Also, there are a few false positives with mypy, especially where the lr scheduler is used. I'll mark those lines to be skipped as well.
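
A short sketch of both suggestions (the variable names come from the mypy hints above; the scheduler access is an invented illustration of an inline skip, not the PR's actual code):

from typing import List

losses: List[float] = []  # annotate locals where mypy asks (the var-annotated hints above)
lrs: List[float] = []


def record_lr(scheduler: object) -> None:
    # For a false positive (or a hard-to-type line), an inline comment
    # skips just this line from mypy checking.
    lrs.append(scheduler.lr[0])  # type: ignore[attr-defined]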

stale bot commented May 30, 2021

This pull request has been automatically marked as stale because it has not had recent activity. It will be closed in 7 days if no further activity occurs. If you need further help see our docs: https://pytorch-lightning.readthedocs.io/en/latest/generated/CONTRIBUTING.html#pull-request or ask the assistance of a core contributor here or on Slack. Thank you for your contributions.

The stale bot added the "won't fix (This will not be worked on)" label on May 30, 2021.
stale bot commented Jun 6, 2021

This pull request is going to be closed. Please feel free to reopen it or create a new one from the current master.

The stale bot closed this on Jun 6, 2021.
Labels: refactor, won't fix (This will not be worked on)
5 participants