🐛 Bug

@Borda, thanks for your advice.
I followed your suggestion in #543 to set `average` to `'macro'`; however, it does not work.
I mentioned you in #1111, but you have not replied yet, and my attempt to re-open #1111 failed.
Therefore, I have created a new issue for discussion.
To Reproduce
Here is the code that compares `torchmetrics` and `sklearn.metrics` (see the sketch below); the results differ, which confuses me.
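A minimal comparison along these lines reproduces the mismatch. This is a sketch, assuming a binary task, a pre-0.10 `torchmetrics` release (as used in this thread), and made-up example data:

```python
import torch
import torchmetrics
from sklearn.metrics import recall_score

# made-up binary predictions and targets
preds = torch.tensor([0, 1, 1, 0, 1])
target = torch.tensor([0, 1, 0, 0, 1])

# sklearn defaults to average='binary': recall of the positive class only -> 1.0
print(recall_score(target.numpy(), preds.numpy()))

# torchmetrics with average='macro' averages over both classes -> ~0.833
recall = torchmetrics.Recall(average='macro', num_classes=2)
print(recall(preds, target))
```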
Expected behavior
I have read the docs of `torchmetrics` and `sklearn.metrics`, and found that `torchmetrics` does not have a `binary` option for `average`.
Details are here: sklearn.metrics-doc and torchmetrics-doc
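To make the difference concrete: sklearn's default `average='binary'` scores only the positive class, while `'macro'` takes the unweighted mean over both classes, so the two agree only when the per-class recalls happen to be equal. A sketch with made-up data:

```python
from sklearn.metrics import recall_score

target = [0, 1, 0, 0, 1]
preds  = [0, 1, 1, 0, 1]

# average='binary' (sklearn's default): positive-class recall only -> 2/2 = 1.0
print(recall_score(target, preds, average='binary'))

# average='macro': unweighted mean over both classes -> (2/3 + 2/2) / 2 = 0.833...
print(recall_score(target, preds, average='macro'))
```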
Hi @Lucienxhh,
We are aware of this problem when evaluating metrics for binary problems and are in the process of a larger refactor of the classification package (#1001).
In the meantime, if you want to match the results of sklearn, you can do something like this:
```python
torchmetrics_recall = torchmetrics.Recall(average=None, num_classes=num_classes)  # return score for both classes
torchmetrics_recall(preds, target)[-1]  # only extract the positive class, which corresponds to sklearn's
```
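Self-contained, the workaround might look like this (a sketch assuming a pre-0.10 `torchmetrics` release and the same made-up data as above, with `num_classes=2` for a binary task):

```python
import torch
import torchmetrics
from sklearn.metrics import recall_score

preds = torch.tensor([0, 1, 1, 0, 1])
target = torch.tensor([0, 1, 0, 0, 1])

torchmetrics_recall = torchmetrics.Recall(average=None, num_classes=2)
per_class = torchmetrics_recall(preds, target)  # per-class recalls: [class 0, class 1]
positive = per_class[-1]                        # recall of the positive class

# should agree with sklearn's default average='binary'
assert abs(positive.item() - recall_score(target.numpy(), preds.numpy())) < 1e-6
print(positive)  # tensor(1.)
```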