
Fixing metrics calculation in ClassificationSolver #63

Merged 1 commit on Nov 24, 2022

Conversation

SebieF (Collaborator) commented Nov 24, 2022

Fixes a typo ("macro" where "micro" was intended) in micro_f1_score, and switches to "macro"-averaged precision, recall, and F1 score for binary classification, as suggested here: https://stats.stackexchange.com/questions/99694/what-does-it-imply-if-accuracy-and-recall-are-the-same
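The ClassificationSolver code itself isn't shown in this conversation, but the motivation for the change can be sketched in plain Python (function names like `per_class_counts` are illustrative, not from the repo): with single-label classification, every misclassified sample counts as one false positive (for the predicted class) and one false negative (for the true class), so micro-averaged precision, recall, and F1 all collapse to plain accuracy. Macro averaging, which averages per-class scores, does not.

```python
def per_class_counts(y_true, y_pred, cls):
    """True positives, false positives, and false negatives for one class."""
    tp = sum(t == p == cls for t, p in zip(y_true, y_pred))
    fp = sum(p == cls and t != cls for t, p in zip(y_true, y_pred))
    fn = sum(t == cls and p != cls for t, p in zip(y_true, y_pred))
    return tp, fp, fn

def f1_from_pr(precision, recall):
    return 2 * precision * recall / (precision + recall) if precision + recall else 0.0

def micro_f1(y_true, y_pred, classes):
    # Micro averaging: pool TP/FP/FN over all classes, then compute one score.
    tp = fp = fn = 0
    for c in classes:
        ctp, cfp, cfn = per_class_counts(y_true, y_pred, c)
        tp, fp, fn = tp + ctp, fp + cfp, fn + cfn
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    return f1_from_pr(precision, recall)

def macro_f1(y_true, y_pred, classes):
    # Macro averaging: compute F1 per class, then take the unweighted mean.
    scores = []
    for c in classes:
        tp, fp, fn = per_class_counts(y_true, y_pred, c)
        precision = tp / (tp + fp) if tp + fp else 0.0
        recall = tp / (tp + fn) if tp + fn else 0.0
        scores.append(f1_from_pr(precision, recall))
    return sum(scores) / len(scores)

y_true = [0, 0, 0, 1, 1, 0, 1, 0]
y_pred = [0, 1, 0, 1, 0, 0, 1, 1]
accuracy = sum(t == p for t, p in zip(y_true, y_pred)) / len(y_true)

print(accuracy)                          # 0.625
print(micro_f1(y_true, y_pred, [0, 1]))  # 0.625 -- identical to accuracy
print(macro_f1(y_true, y_pred, [0, 1]))  # differs, reflects per-class balance
```

This is why the linked Cross Validated answer recommends macro averaging for binary classification: micro-averaged metrics add no information beyond accuracy there.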
@SebieF SebieF added the bug Something isn't working label Nov 24, 2022
@SebieF SebieF self-assigned this Nov 24, 2022
@SebieF SebieF merged commit b3a4f01 into sacdallago:main Nov 24, 2022
@SebieF SebieF deleted the metrics-hotfix branch November 24, 2022 15:08
@SebieF SebieF mentioned this pull request Dec 5, 2022