Refactor anomalib to new annotation format, add refurb and pyupgrade #845
Conversation
Thanks. Much more readable now. I have a few comments. My main concern is with the deletion of unused attributes with the `del` keyword, which feels a bit hacky. Maybe we could think of an alternative approach.
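For context, a minimal sketch of the pattern being questioned here (class and attribute names are hypothetical, not taken from the PR):

```python
class BaseModel:
    def __init__(self) -> None:
        # The base class assigns attributes that not every subclass uses.
        self.threshold = 0.5
        self.callbacks = []


class ChildModel(BaseModel):
    def __init__(self) -> None:
        super().__init__()
        # The pattern under discussion: remove an inherited attribute
        # that this subclass never uses.
        del self.threshold
```

An alternative would be to declare such attributes only in the subclasses that actually use them, which avoids the assign-then-delete round trip.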
super().__init__()
logger.info("Initializing %s model.", self.__class__.__name__)

self.save_hyperparameters()
self.model: nn.Module
-self.loss: Tensor
+self.loss: nn.Module
self.callbacks: List[Callback]
I don't understand this change. As far as I know, the `loss` attribute holds the value of the loss between epochs, so it should be a `Tensor`.
In almost all of the implementations, we use `self.loss` to compute the loss. For example, here:

self.loss = DraemLoss()

When we define this as `Tensor`, pylint and mypy complain, because we use it as a `Callable`.
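To make the type-checker complaint concrete, a minimal sketch (the class and method below are hypothetical, not the actual anomalib code):

```python
from torch import Tensor, nn


class ModuleSketch:
    # With `loss: Tensor`, mypy/pylint reject the call in training_step,
    # because Tensor instances are not callable:
    #
    #     loss: Tensor
    #
    # Annotating it as nn.Module makes the call type-check, since
    # nn.Module defines __call__:
    loss: nn.Module

    def training_step(self, batch: Tensor) -> Tensor:
        # In practice a subclass assigns e.g. self.loss = DraemLoss()
        return self.loss(batch)
```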
If PL uses this under the hood, we could then rename this to `self.loss_func`.
Co-authored-by: Dick Ameln <[email protected]>
Thanks, just one minor comment, which could probably be ignored.
"channels_hidden": self.cross_conv_hidden_channels, | ||
"kernel_size": self.kernel_sizes[coupling_block], | ||
}, | ||
nodes.extend([permute_node]) |
I feel this would best be left as `append`, because it just adds a single instance. But if refurb complains about this, I'm fine with the change.
I feel so too, but I just did it this way to silence refurb :)
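For reference, the two forms behave the same for a single element; a quick sketch (the node objects below are stand-ins, not the actual flow-graph nodes):

```python
# Stand-ins for the flow-graph nodes in the snippet above.
permute_node = {"name": "permute"}
conv_node = {"name": "conv"}

nodes: list[dict] = []

# The pattern refurb flags: consecutive append calls ...
nodes.append(permute_node)
nodes.append(conv_node)

# ... which it suggests collapsing into a single extend:
nodes = []
nodes.extend([permute_node, conv_node])

# For a single element, nodes.append(x) and nodes.extend([x]) are
# behaviourally identical; append just avoids building a throwaway list.
```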
@@ -147,7 +149,7 @@ def __set_default_root_dir(self) -> None:
     # If `resume_from_checkpoint` is not specified, it means that the project has not been created before.
     # Therefore, we need to create the project directory first.
     if config.trainer.resume_from_checkpoint is None:
-        root_dir = config.trainer.default_root_dir if config.trainer.default_root_dir else "./results"
+        root_dir = config.trainer.default_root_dir or "./results"
Cool, I was not aware of this syntax.
Me neither :)
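For anyone else seeing it for the first time, a quick sketch of how the `or` default works (the values below are made up):

```python
# `a or b` evaluates to `a` when `a` is truthy, otherwise to `b`.
default_root_dir = None
root_dir = default_root_dir or "./results"
assert root_dir == "./results"

default_root_dir = "/tmp/experiments"
root_dir = default_root_dir or "./results"
assert root_dir == "/tmp/experiments"

# Caveat (shared with the longer `x if x else default` form): any falsy
# value ("", 0, []) is replaced by the default, not just None.
```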
Description

This PR refactors anomalib to the new annotation format (via pyupgrade) and adds refurb. The tool has not yet been added to the pre-commit configuration since it requires Python 3.10 to execute.

Changes
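For illustration, the kind of rewrite pyupgrade performs for the new annotation format (the functions below are made-up examples, not taken from the diff):

```python
# Before: typing-module generics and Optional/Union
from typing import List, Optional, Union


def load_old(path: Optional[str], tags: List[str]) -> Union[dict, None]:
    ...


# After (Python 3.10+): built-in generics and PEP 604 union syntax,
# which is the "new annotation format" the PR title refers to
def load_new(path: str | None, tags: list[str]) -> dict | None:
    ...
```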
Checklist