The Macro-F1 metric is being calculated incorrectly. #403

Closed
celsofranssa opened this issue Jul 25, 2021 · 3 comments · Fixed by #303
Labels
bug / fix (Something isn't working) · duplicate (This issue or pull request already exists) · help wanted (Extra attention is needed)
Milestone
v0.5

Comments

@celsofranssa

🐛 Bug

The F1 metric outputs a wrong value when using average="macro".

To Reproduce

Steps to reproduce the behavior:

import torch
from torchmetrics import F1, MetricCollection
from sklearn.metrics import f1_score

# torchmetrics: micro- and macro-averaged F1 over 11 classes
metrics = MetricCollection(
    metrics={
        "MicF1": F1(num_classes=11, average="micro"),
        "MacF1": F1(num_classes=11, average="macro"),
    }
)

# examples of target classes
target = torch.tensor([8, 10, 5, 2])

# examples of model predictions (LogSoftmax outputs)
preds = torch.tensor([
    [-2.6118, -2.8115, -2.8481, -2.4452, -2.5818, -2.0106, -2.3749, -1.8872,
     -2.1726, -2.5530, -2.5767],
    [-2.5555, -2.7056, -2.8433, -2.4674, -2.5098, -1.8838, -2.3232, -2.0609,
     -2.3225, -2.4336, -2.6861],
    [-2.8012, -2.7260, -2.7958, -2.4332, -2.4809, -1.8476, -2.2616, -2.0058,
     -2.3366, -2.5866, -2.6174],
    [-2.7187, -2.7297, -2.6931, -2.2206, -2.4540, -2.1421, -2.4888, -1.9133,
     -2.0793, -2.6489, -2.7666],
])

# torchmetrics output
metrics(torch.argmax(preds, dim=-1), target)
# {'MacF1': tensor(0.0606), 'MicF1': tensor(0.2500)}

# scikit-learn output
{
    "MicF1": f1_score(target, torch.argmax(preds, dim=-1), average="micro"),
    "MacF1": f1_score(target, torch.argmax(preds, dim=-1), average="macro"),
}
# {'MacF1': 0.13333333333333333, 'MicF1': 0.25}
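
For reference, the predicted labels here are torch.argmax(preds, dim=-1) == tensor([7, 5, 5, 7]). Only class 5 has a non-zero F1 (TP=1, FP=1, FN=0, so precision 0.5, recall 1.0, F1 = 2/3), and the numbers suggest the two libraries disagree only on the macro denominator: scikit-learn averages over the 5 classes observed in target or preds, while torchmetrics appears to divide by all num_classes=11. A quick hand check (my own sketch, not taken from either library's internals):

# hand check of the macro average: class 5 is the only class with non-zero F1
per_class_f1_sum = 2 / 3
print(per_class_f1_sum / 5)   # 0.1333... -> matches scikit-learn (5 observed classes)
print(per_class_f1_sum / 11)  # 0.0606... -> matches torchmetrics (all 11 classes)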

Expected behavior

F1 metric values (macro and micro averages) should match scikit-learn's.
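
Until a fix lands, one possible workaround (a sketch of my own; it assumes the per-class scores from average="none" are correct and only the macro reduction over absent classes is off) is to average only over the classes that actually occur:

# workaround sketch: macro-average only over classes present in target or preds
pred_labels = torch.argmax(preds, dim=-1)
per_class = F1(num_classes=11, average="none")(pred_labels, target)  # per-class F1 scores
observed = torch.unique(torch.cat([target, pred_labels]))            # classes seen in target or preds
macro_f1 = per_class[observed].mean()                                # ~0.1333, in line with scikit-learn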

@celsofranssa celsofranssa added bug / fix Something isn't working help wanted Extra attention is needed labels Jul 25, 2021
@github-actions

Hi! Thanks for your contribution, great first issue!

@celsofranssa (Author)

I've found this issue is related to #295.

@SkafteNicki SkafteNicki added the duplicate This issue or pull request already exists label Jul 26, 2021
@SkafteNicki (Member)

Duplicate of #295, will be fixed by #303

@SkafteNicki SkafteNicki linked a pull request Jul 26, 2021 that will close this issue
@Borda Borda added this to the v0.5 milestone Aug 18, 2021