Refactor anomalib to new annotation format, add refurb and pyupgrade #845

Merged
samet-akcay merged 87 commits into main from refactor/add-pyupgrade-and-refurb on Jan 26, 2023

Conversation

samet-akcay (Contributor)

Description

  • Refactor anomalib to the new annotation format (a sketch of the kind of change is shown after this list).
  • Modify the code using refurb. The tool has not yet been added to the pre-commit configuration, since it requires Python 3.10 to run.
  • Add pyupgrade to the pre-commit settings and modify the code accordingly.
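
As an aside, a minimal hypothetical sketch of the kind of annotation rewrite involved; the function, paths, and names below are made up for illustration and are not taken from this PR:

from __future__ import annotations  # keeps the new-style annotations working on older interpreters

from typing import Dict, List, Optional  # legacy imports, kept here only to show the "before" style


# Before the refactor: typing-module generics and Optional.
def load_image_paths_old(root: str, extensions: Optional[List[str]] = None) -> Dict[str, List[str]]:
    extensions = extensions if extensions is not None else [".png"]
    return {ext: [f"{root}/sample{ext}"] for ext in extensions}


# After the refactor (pyupgrade, PEP 585 builtin generics, PEP 604 unions).
def load_image_paths_new(root: str, extensions: list[str] | None = None) -> dict[str, list[str]]:
    extensions = extensions or [".png"]
    return {ext: [f"{root}/sample{ext}"] for ext in extensions}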

Changes

  • Bug fix (non-breaking change which fixes an issue)
  • Refactor (non-breaking change which refactors the code base)
  • New feature (non-breaking change which adds functionality)
  • Breaking change (fix or feature that would cause existing functionality to not work as expected)
  • This change requires a documentation update

Checklist

  • My code follows the pre-commit style and check guidelines of this project.
  • I have performed a self-review of my code
  • I have commented my code, particularly in hard-to-understand areas
  • I have made corresponding changes to the documentation
  • My changes generate no new warnings
  • I have added tests that prove my fix is effective or that my feature works
  • New and existing tests pass locally with my changes
  • I have added a summary of my changes to the CHANGELOG (not for minor changes, docs and tests).

github-actions bot added the Tests label on Jan 17, 2023

djdameln (Contributor) left a comment:

Thanks. Much more readable now. I have a few comments. My main concern is with the deletion of unused attributes with the del keyword, which feels a bit hacky. Maybe we could think of an alternative approach.
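
As a side note on the point above, a tiny hypothetical sketch of the `del` pattern being discussed; the class and names are made up, not taken from the diff:

from __future__ import annotations

from typing import Any


class ExampleModule:
    """Hypothetical stand-in for a Lightning module, used only to illustrate the pattern."""

    def validation_step(self, batch: dict[str, Any], batch_idx: int) -> dict[str, Any]:
        # The argument is required by the Lightning API but unused here, so it is
        # deleted explicitly to silence "unused argument" warnings from the linters.
        del batch_idx
        return {"image": batch["image"]}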

anomalib/data/btech.py (review thread, outdated, resolved)
anomalib/models/cfa/lightning_model.py (review thread, resolved)
  super().__init__()
  logger.info("Initializing %s model.", self.__class__.__name__)

  self.save_hyperparameters()
  self.model: nn.Module
- self.loss: Tensor
  self.callbacks: List[Callback]
+ self.loss: nn.Module

djdameln (Contributor):
I don't understand this change. As far as I know, the loss attribute holds the value of the loss between epochs, so it should be a Tensor.

samet-akcay (Author):
In almost all of the implementations, we use self.loss to compute the loss. For example, here:

self.loss = DraemLoss()

When we define this as Tensor, pylint and mypy complain, because we use it as a Callable.
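
To illustrate the point, a minimal hypothetical sketch of why the callable annotation matters; ExampleLoss stands in for a loss module such as DraemLoss, and the training step is heavily simplified:

import torch
from torch import Tensor, nn


class ExampleLoss(nn.Module):
    """Stand-in for an nn.Module-based loss such as DraemLoss."""

    def forward(self, prediction: Tensor, target: Tensor) -> Tensor:
        return torch.mean((prediction - target) ** 2)


class ExampleAnomalyModule:
    def __init__(self) -> None:
        # Annotating the attribute as nn.Module matches how it is used below.
        self.loss: nn.Module = ExampleLoss()

    def training_step(self, prediction: Tensor, target: Tensor) -> Tensor:
        # If self.loss were annotated as Tensor, pylint/mypy would flag this call,
        # because a Tensor is not callable.
        return self.loss(prediction, target)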

samet-akcay (Author):
If PL uses this under the hood, we could then rename this to self.loss_func

anomalib/models/components/feature_extractors/timm.py (review thread, outdated, resolved)
anomalib/models/csflow/lightning_model.py (review thread, outdated, resolved)
anomalib/utils/callbacks/cdf_normalization.py (review thread, resolved)
anomalib/utils/cli/cli.py (review thread, resolved)
setup.py (review thread, resolved)

djdameln (Contributor) left a comment:

Thanks, just one minor comment, which could probably be ignored

"channels_hidden": self.cross_conv_hidden_channels,
"kernel_size": self.kernel_sizes[coupling_block],
},
nodes.extend([permute_node])

djdameln (Contributor):
I feel this would best be left as append, because it just adds a single instance. But if refurb complains about this I'm fine with this change.

samet-akcay (Author):
I feel so too, but just did it this way to silence refurb :)
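
As a side note, a tiny sketch of the two equivalent forms being debated; the node values are made-up stand-ins, not the real FrEIA nodes:

# Both calls below add exactly one element, so the two lists end up identical.
nodes_a = ["input_node"]
nodes_b = ["input_node"]
permute_node = "permute_node"  # stand-in for the permutation node created in the real code

nodes_a.append(permute_node)    # reviewer's preference when adding a single element
nodes_b.extend([permute_node])  # form suggested by refurb and kept in the merge

assert nodes_a == nodes_b == ["input_node", "permute_node"]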

@@ -147,7 +149,7 @@ def __set_default_root_dir(self) -> None:
# If `resume_from_checkpoint` is not specified, it means that the project has not been created before.
# Therefore, we need to create the project directory first.
if config.trainer.resume_from_checkpoint is None:
- root_dir = config.trainer.default_root_dir if config.trainer.default_root_dir else "./results"
+ root_dir = config.trainer.default_root_dir or "./results"

djdameln (Contributor):
cool, I was not aware of this syntax

samet-akcay (Author):
me neither :)
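
As a side note, a small illustration of the `or` fallback idiom from the hunk above, including the usual caveat that it also replaces other falsy values such as an empty string; the values are made up:

default_root_dir = ""  # e.g. an unset or empty config value

# Both forms fall back to "./results" for falsy values (None, "", 0, ...).
root_dir_ternary = default_root_dir if default_root_dir else "./results"
root_dir_or = default_root_dir or "./results"
assert root_dir_ternary == root_dir_or == "./results"

# An explicit None check behaves differently: it keeps falsy-but-set values.
root_dir_none_check = default_root_dir if default_root_dir is not None else "./results"
assert root_dir_none_check == ""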

github-actions bot added the Docs label on Jan 26, 2023
samet-akcay merged commit 01f3323 into main on Jan 26, 2023
samet-akcay deleted the refactor/add-pyupgrade-and-refurb branch on January 26, 2023 at 16:42