CLI: add arg - instantiate_only #8251
Conversation
Codecov Report
| | master | #8251 | +/- |
|---|---|---|---|
| Coverage | 88% | 92% | +4% |
| Files | 212 | 213 | +1 |
| Lines | 13701 | 13800 | +99 |
| Hits | 12125 | 12732 | +607 |
| Misses | 1576 | 1068 | -508 |
Can you elaborate why one would want this?
At the moment you always have to run the fit, right? But if you want just to get the instances of Trainer, Model, and Data and then do your own stuff, you need to override CLI.fit(...) to be empty...
Moreover (well, it is in the docs), from the API it is not intuitive that creating a CLI instance automatically performs training.
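To make that workaround concrete, here is a minimal sketch of the "override the fit step to be empty" approach described above. It assumes the LightningCLI.fit() hook available around the time of this discussion (pytorch_lightning ~1.3/1.4), and MyModel / MyDataModule are placeholder classes, not names from this PR:

```python
from pytorch_lightning import LightningDataModule, LightningModule
from pytorch_lightning.utilities.cli import LightningCLI  # import path used at the time (~PL 1.3/1.4)


class MyModel(LightningModule):  # placeholder standing in for a real model
    ...


class MyDataModule(LightningDataModule):  # placeholder standing in for a real datamodule
    ...


class InstantiateOnlyCLI(LightningCLI):
    def fit(self) -> None:
        """Do nothing, so constructing the CLI only parses args and instantiates the classes."""


if __name__ == "__main__":
    cli = InstantiateOnlyCLI(MyModel, MyDataModule)
    # cli.trainer, cli.model and cli.datamodule now exist, but nothing has been trained
```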
This is the whole point of the CLI. With #7508 we could make it so that not passing the function to run has the same effect as what this PR wants.
But this flag doesn't sit right with me when the only purpose of the CLI right now is to fit.
cc @mauvilsa for thoughts
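For context, later Lightning releases did grow a run argument on LightningCLI that matches this idea: with run=False the CLI only parses the config and instantiates the classes, and you drive the Trainer yourself. A rough sketch, assuming a post-1.5 release and placeholder model/datamodule classes (the import path and argument come from those later versions, not from this PR):

```python
from pytorch_lightning import LightningDataModule, LightningModule
from pytorch_lightning.cli import LightningCLI  # import path in newer releases


class MyModel(LightningModule):  # placeholder
    ...


class MyDataModule(LightningDataModule):  # placeholder
    ...


cli = LightningCLI(MyModel, MyDataModule, run=False)  # parse the config and instantiate only
# nothing has been run yet; call the Trainer explicitly when ready:
cli.trainer.fit(cli.model, datamodule=cli.datamodule)
```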
For me, the main purpose of the CLI is to instantiate all the main actors - Trainer, Model and Data. So here is what I have to do to use the Tuner:

class TuneFitCLI(LightningCLI):
    def before_fit(self) -> None:
        """Implement to run some code before fit is started"""
        res = self.trainer.tune(**self.fit_kwargs, scale_batch_size_kwargs=dict(max_trials=5))
        self.instantiate_classes()
        torch.cuda.empty_cache()
        self.datamodule.batch_size = int(res['scale_batch_size'] * 0.9)


if __name__ == '__main__':
    cli = TuneFitCLI(
        model_class=MultiPlantPathology,
        datamodule_class=PlantPathologyDM,
        trainer_defaults=TRAINER_DEFAULTS,
        seed_everything_default=42,
    )
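For comparison, here is a rough sketch of how the instantiate_only argument proposed in this PR could express the same tune-then-fit flow. The argument never shipped (the PR was closed in favor of #7508), and MultiPlantPathology / PlantPathologyDM are the user-specific classes from the snippet above, so treat this as purely illustrative:

```python
from pytorch_lightning.utilities.cli import LightningCLI

# Hypothetical: `instantiate_only=True` is the argument proposed (but never merged) in this PR.
cli = LightningCLI(MultiPlantPathology, PlantPathologyDM, instantiate_only=True)

# The CLI only built the objects; drive the tuner and trainer manually
# (assumes the Trainer was configured with auto_scale_batch_size, as in the snippet above):
res = cli.trainer.tune(cli.model, datamodule=cli.datamodule,
                       scale_batch_size_kwargs=dict(max_trials=5))
cli.datamodule.batch_size = int(res['scale_batch_size'] * 0.9)
cli.trainer.fit(cli.model, datamodule=cli.datamodule)
```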
So you are saying that if I don't want to run any Trainer method by default, I cannot use the CLI, or I need to override CLI.fit()?
I am saying that
For example, to run the tune, to do some extra action before fit, or just to print the model details - you can see any past CLI in Bolts. @mauvilsa so is it just about the name? Then let's call it
The discussion is scattered across both PRs. Writing here, what I think is the best way forward:
@carmocca if your PR has the option to not run any trainer method, then it is fine and we do not need this :]
closing this one in favor of #7508
What does this PR do?
Allow an instantiate_only argument so that creating the CLI only instantiates the Trainer, Model and DataModule without running fit.
Before submitting
PR review
Anyone in the community is free to review the PR once the tests have passed.
Before you start reviewing, make sure you have read the review guidelines. In short, see the following bullet-list:
Did you have fun?
Make sure you had fun coding 🙃