
Added add_metrics method to MetricCollection #221

Merged — 9 commits merged into Lightning-AI:master on May 4, 2021

Conversation

@IgorHoholko (Contributor) commented on May 2, 2021

Fixes #203

Before submitting

  • Was this discussed/approved via a GitHub issue? (no need for typos and docs improvements)
  • Did you read the contributor guideline, Pull Request section?
  • Did you make sure to update the docs?
  • Did you write any new necessary tests?

What does this PR do?

Adds an add_metrics() method to the MetricCollection class.
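The intended behavior can be sketched as follows. This is a minimal, self-contained illustration of the dispatch logic, not the real torchmetrics implementation: Metric, MetricCollection, Accuracy, and Precision below are dummy stand-ins (the real MetricCollection is a torch.nn.ModuleDict).

```python
from collections.abc import Mapping, Sequence
from typing import Dict, Union


class Metric:
    """Stand-in for torchmetrics.Metric, used only for illustration."""


class MetricCollection(dict):
    """Simplified stand-in; the real class is a torch.nn.ModuleDict."""

    def add_metrics(self, metrics: Union[Metric, Sequence[Metric], Dict[str, Metric]]) -> None:
        if isinstance(metrics, Metric):
            # A single metric is wrapped in a list so one code path handles both.
            metrics = [metrics]
        if isinstance(metrics, Mapping):
            # Dict form: the user picks the name for each metric.
            for name, metric in metrics.items():
                if not isinstance(metric, Metric):
                    raise ValueError(f"Value for key {name!r} is not a Metric")
                self[name] = metric
        elif isinstance(metrics, Sequence):
            # Sequence form: metrics are keyed by their class name.
            for metric in metrics:
                if not isinstance(metric, Metric):
                    raise ValueError("Sequence contains a non-Metric element")
                self[metric.__class__.__name__] = metric
        else:
            raise ValueError("Expected a Metric, a sequence of Metrics, or a dict of Metrics")


class Accuracy(Metric):
    pass


class Precision(Metric):
    pass


collection = MetricCollection()
collection.add_metrics(Accuracy())                   # single metric, keyed by class name
collection.add_metrics({"prec_macro": Precision()})  # dict form allows custom names
print(sorted(collection))  # ['Accuracy', 'prec_macro']
```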

PR review

Anyone in the community is free to review the PR once the tests have passed.
If we didn't discuss your PR in GitHub issues, there's a high chance it will not be merged.

Did you have fun?

Make sure you had fun coding 🙃

@IgorHoholko IgorHoholko changed the title added add_metrics method to MetricCollection (#203) Added add_metrics method to MetricCollection (#203) May 2, 2021
codecov bot commented on May 2, 2021

Codecov Report

Merging #221 (cd1db7a) into master (cb6899b) will decrease coverage by 8.80%.
The diff coverage is 84.61%.

Impacted file tree graph

@@            Coverage Diff             @@
##           master     #221      +/-   ##
==========================================
- Coverage   96.78%   87.97%   -8.81%     
==========================================
  Files         184      184              
  Lines        5976     5965      -11     
==========================================
- Hits         5784     5248     -536     
- Misses        192      717     +525     
Flag Coverage Δ
Linux 79.12% <84.61%> (+0.01%) ⬆️
Windows 79.12% <84.61%> (+0.01%) ⬆️
cpu 79.12% <84.61%> (-17.67%) ⬇️
gpu 96.78% <ø> (+<0.01%) ⬆️
macOS 79.12% <84.61%> (-17.67%) ⬇️
pytest 87.97% <84.61%> (-8.81%) ⬇️
python3.6 ?
python3.8 ?
python3.9 ?
torch1.3.1 ?
torch1.4.0 ?
torch1.8.1 ?

Flags with carried forward coverage won't be shown.

Impacted Files Coverage Δ
torchmetrics/collections.py 74.71% <84.61%> (-24.12%) ⬇️
torchmetrics/utilities/distributed.py 17.24% <0.00%> (-81.04%) ⬇️
torchmetrics/classification/auc.py 47.82% <0.00%> (-52.18%) ⬇️
torchmetrics/functional/classification/auroc.py 46.15% <0.00%> (-40.01%) ⬇️
torchmetrics/metric.py 55.98% <0.00%> (-39.52%) ⬇️
torchmetrics/functional/regression/psnr.py 60.60% <0.00%> (-36.37%) ⬇️
torchmetrics/functional/classification/accuracy.py 57.57% <0.00%> (-36.37%) ⬇️
...chmetrics/functional/classification/stat_scores.py 66.66% <0.00%> (-33.34%) ⬇️
...rics/functional/classification/confusion_matrix.py 66.66% <0.00%> (-33.34%) ⬇️
torchmetrics/utilities/checks.py 60.54% <0.00%> (-31.90%) ⬇️
... and 231 more

Legend: Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Last update cb6899b...cd1db7a.

@SkafteNicki SkafteNicki added the enhancement New feature or request label May 3, 2021
@SkafteNicki SkafteNicki added this to the v0.4 milestone May 3, 2021
@SkafteNicki (Member) reviewed:

LGTM

@SkafteNicki SkafteNicki marked this pull request as ready for review May 3, 2021 07:25
@Borda Borda changed the title Added add_metrics method to MetricCollection (#203) Added add_metrics method to MetricCollection May 3, 2021
@Borda Borda enabled auto-merge (squash) May 3, 2021 17:20
@@ -185,6 +147,51 @@ def persistent(self, mode: bool = True) -> None:
for _, m in self.items():
m.persistent(mode)

def add_metrics(self, metrics: Union[Metric, Sequence[Metric], Dict[str, Metric]],
Member:
Suggested change
def add_metrics(self, metrics: Union[Metric, Sequence[Metric], Dict[str, Metric]],
def append(self, metrics: Union[Metric, Sequence[Metric], Dict[str, Metric]],

would it be a better name?

Contributor:

I think it's fine to keep the name different. If we say append it might imply that it won't work as extend while here it does.

Contributor:
add_metrics seems more appropriate for a Collection.

if isinstance(metrics, Metric):
# set compatible with original type expectations
metrics = [metrics]
if isinstance(metrics, Sequence):
Contributor:
I understand this is just copying what we had here before, but we should probably use Iterable here instead of Sequence, as we don't really care about indexing.

Member:
But Iterable does not have len, right?

Contributor:
We are not using len though, we are just directly converting to a list in the next line.
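To illustrate the distinction being discussed: a generator is an Iterable but not a Sequence, and since the code converts to a list immediately, the looser Iterable bound would suffice. A small sketch:

```python
from collections.abc import Iterable, Sequence

# A generator is Iterable but not a Sequence (no indexing, no len()).
gen = (x for x in range(3))
assert isinstance(gen, Iterable) and not isinstance(gen, Sequence)

# Converting to a list up front, as the PR does, works for any Iterable:
metrics = list(gen)
assert metrics == [0, 1, 2]

# len() is then available on the list, even though the generator had none:
assert len(metrics) == 3

# One caveat either way: str also passes a Sequence (and Iterable) check.
assert isinstance("abc", Sequence)
```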

Contributor, commenting on the same def add_metrics line:
It should be Mapping[str, Metric], not Dict, and similarly in the check below.
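A short illustration of why Mapping is the looser (and arguably more correct) check: user-defined mappings pass an isinstance(..., Mapping) test without being dict subclasses. The FrozenMapping class below is a hypothetical example, not part of the PR:

```python
from collections import OrderedDict
from collections.abc import Mapping


class FrozenMapping(Mapping):
    """A user-defined mapping that is not a dict subclass."""

    def __init__(self, data):
        self._data = dict(data)

    def __getitem__(self, key):
        return self._data[key]

    def __iter__(self):
        return iter(self._data)

    def __len__(self):
        return len(self._data)


fm = FrozenMapping({"acc": "placeholder-metric"})
assert isinstance(fm, Mapping)   # accepted by a Mapping check
assert not isinstance(fm, dict)  # rejected by an isinstance(..., dict) check

# Built-in dict subclasses satisfy both checks, so nothing is lost:
assert isinstance(OrderedDict(), Mapping) and isinstance(OrderedDict(), dict)
```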


@Borda (Member) commented on May 3, 2021

@IgorHoholko, mind checking the comments above?

@mergify mergify bot added the has conflicts label May 3, 2021
@mergify mergify bot removed the has conflicts label May 4, 2021
@tchaton (Contributor) reviewed:
LGTM! Worth increasing the coverage slightly.


@Borda Borda merged commit 27bc28e into Lightning-AI:master May 4, 2021
Labels: enhancement (New feature or request), ready
Projects: none yet
Successfully merging this pull request may close: True update() for MetricCollection
5 participants