
Accuracy metric - negative ignore_index fails when combined with subset_accuracy=True #961

Closed
antifriz opened this issue Apr 15, 2022 · 2 comments · Fixed by #1195
@antifriz

🐛 Bug

To Reproduce

Combine ignore_index=-1 with subset_accuracy=True

Code sample

import torch
import torchmetrics

accuracy = torchmetrics.Accuracy(
    subset_accuracy=True,
    ignore_index=-1,
)
# class index 2 always has the highest logit, so argmax over the class
# dimension predicts 2 everywhere; preds has shape (3, 3, 5) = (N, C, ...)
preds = torch.arange(45, dtype=torch.float).reshape(3, 5, 3).permute(0, 2, 1)
# -1 marks positions that should be ignored
target = torch.tensor(
    [
        [2, 2, 2, 0, -1],
        [2, 2, -1, -1, -1],
        [2, 2, 2, 2, 2],
    ],
    dtype=torch.int64,
)
accuracy.update(preds=preds, target=target)
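
The title suggests the crash is specific to this combination; with the default subset_accuracy=False the same inputs take the non-subset code path. A presumed-working counterpart (untested sketch, inferred from the issue title rather than verified):

# presumed to succeed: per the issue title, the crash only occurs when a
# negative ignore_index is combined with subset_accuracy=True
accuracy_no_subset = torchmetrics.Accuracy(ignore_index=-1)
accuracy_no_subset.update(preds=preds, target=target)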

Expected behavior

Not crashing

Environment

  • OS (e.g., Linux): OSX
  • Python & PyTorch Version (e.g., 1.0): 3.9 (torch==1.11.0, torchmetrics==0.8.0)
  • How you installed PyTorch (conda, pip, build command if you used source): pip

Additional context

Stack trace

Traceback (most recent call last):
  File "main.py", line 18, in <module>
    accuracy.update(preds=preds, target=target)
  File ".../python3.9/site-packages/torchmetrics/metric.py", line 312, in wrapped_func
    update(*args, **kwargs)
  File ".../python3.9/site-packages/torchmetrics/classification/accuracy.py", line 233, in update
    correct, total = _subset_accuracy_update(
  File ".../python3.9/site-packages/torchmetrics/functional/classification/accuracy.py", line 224, in _subset_accuracy_update
    preds, target, mode = _input_format_classification(
  File ".../python3.9/site-packages/torchmetrics/utilities/checks.py", line 432, in _input_format_classification
    target = to_onehot(target, max(2, num_classes))  # type: ignore
  File ".../python3.9/site-packages/torchmetrics/utilities/data.py", line 99, in to_onehot
    return tensor_onehot.scatter_(1, index, 1.0)
RuntimeError: index -1 is out of bounds for dimension 1 with size 3
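
The trace points at the one-hot encoding step: the ignore_index positions are not masked out of target before to_onehot, so the raw -1 reaches Tensor.scatter_, which rejects negative indices. The same failure reproduces in isolation with plain PyTorch (a minimal sketch of the root cause, not torchmetrics code):

import torch

# scatter_ requires every index to lie in [0, dim_size); a -1 left in the
# target tensor therefore raises the same RuntimeError seen above
index = torch.tensor([[2], [-1]])
torch.zeros(2, 3).scatter_(1, index, 1.0)
# RuntimeError: index -1 is out of bounds for dimension 1 with size 3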
@antifriz added the bug / fix and help wanted labels on Apr 15, 2022
@github-actions

Hi! Thanks for your contribution, great first issue!

@Borda added this to the v0.8 milestone on Apr 15, 2022
@SkafteNicki removed this from the v0.8 milestone on May 5, 2022
@Borda added this to the v0.10 milestone on Jul 27, 2022
@SkafteNicki (Member)

This issue will be fixed by the classification refactor; see issue #1001 and PR #1195 for the full set of changes.

Small recap: this issue describes how a negative ignore_index fails when combined with subset_accuracy=True. After the refactor, Accuracy will no longer support the subset_accuracy argument; that functionality has been split into its own metric called exact_match. This metric (together with all other classification metrics) now supports negative ignore_index arguments.

import torch
from torchmetrics.functional import multilabel_exact_match

preds = torch.tensor([[0.11, 0.22, 0.84], [0.73, 0.33, 0.92]])
target = torch.tensor([[0, 1, -1], [1, 0, 1]])
multilabel_exact_match(preds, target, num_labels=3, ignore_index=-1)
# tensor(0.5): only the second sample matches on every non-ignored label

This works and gives the correct result.
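
For reference, the class-based counterpart (assuming the post-refactor modular API, MultilabelExactMatch from torchmetrics.classification) takes the same arguments:

import torch
from torchmetrics.classification import MultilabelExactMatch

# assumes the modular metric introduced by the refactor in #1195
metric = MultilabelExactMatch(num_labels=3, ignore_index=-1)
preds = torch.tensor([[0.11, 0.22, 0.84], [0.73, 0.33, 0.92]])
target = torch.tensor([[0, 1, -1], [1, 0, 1]])
metric(preds, target)  # same result as the functional form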
The issue will be closed once #1195 is merged.
