The Macro-F1 metric is being calculated incorrectly. #403
Labels: bug / fix, duplicate, help wanted
🐛 Bug
F1 metric outputs the wrong value when using `average="macro"`.
To Reproduce
Steps to reproduce the behavior:
Code sample
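The original code sample is not preserved here; a minimal sketch along the lines below (assuming the current `torchmetrics.F1Score` API and made-up example tensors) reproduces the comparison against scikit-learn:

```python
# Minimal reproduction sketch (assumed): compare the metric against scikit-learn
# for both macro and micro averaging on a small hand-made multiclass example.
import torch
from sklearn.metrics import f1_score
from torchmetrics import F1Score

preds = torch.tensor([0, 2, 1, 2, 0, 1])
target = torch.tensor([0, 1, 1, 2, 0, 2])

for average in ("macro", "micro"):
    metric = F1Score(task="multiclass", num_classes=3, average=average)
    print(
        average,
        metric(preds, target).item(),                               # torchmetrics value
        f1_score(target.numpy(), preds.numpy(), average=average),   # scikit-learn reference
    )
```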
Expected behavior
F1 metric values (macro and micro average) should match scikit-learn's.
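For reference, the macro value the metric should return is simply the unweighted mean of the per-class F1 scores; a quick scikit-learn check (using the same hypothetical labels as above) illustrates this:

```python
# Hedged illustration: macro-F1 equals the unweighted mean of the per-class
# F1 scores, which is the value both libraries are expected to agree on.
import numpy as np
from sklearn.metrics import f1_score

target = np.array([0, 1, 1, 2, 0, 2])
preds = np.array([0, 2, 1, 2, 0, 1])

per_class = f1_score(target, preds, average=None)   # one F1 score per class
macro = f1_score(target, preds, average="macro")    # scikit-learn macro average
assert np.isclose(per_class.mean(), macro)
```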