
Report specificity, sensitivity, etc. for binary classification with -test #57

Open
vdemario opened this issue Mar 21, 2016 · 1 comment


@vdemario

I made a small patch on my own fork to report a bit more data with -test when growforest finishes. It looks like this:

Error: 0.06121835978431722
Accuracy: 48510 / 51673 = 0.9387881485495326
True Negatives 24999 / Total Negatives 26585 = Specificity (True Negative Rate) 0.940342
True Positives 23511 / Total Positives 25088 = Sensitivity (True Positive Rate) 0.937141
True Positives 23511 / Predicted Positives 25097 = Precision (Positive Predictive Value) 0.936805
True Negatives 24999 / Predicted Negatives 26576 = Negative Predictive Value 0.940661
F1 Score: 0.936973

I didn't open a PR because the patch simply assumes classification with 2 categories (which is what I always do) and doesn't check whether that's actually the case.

Would this be useful in general? If so, I can add the checks to run this only when it makes sense and submit it.
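For concreteness, here is a minimal Go sketch of what the two-category guard and the ratios above could look like. The function and label names are hypothetical, not CloudForest's actual API; it only shows the arithmetic behind the output above:

```go
package main

import "fmt"

// binaryStats prints specificity, sensitivity, precision, NPV, and F1
// for a two-class problem. Hypothetical names; a sketch, not CloudForest's API.
func binaryStats(actual, predicted []string, positive, negative string) error {
	if len(actual) != len(predicted) {
		return fmt.Errorf("length mismatch: %d actual vs %d predicted", len(actual), len(predicted))
	}
	// Guard: only report these stats when exactly two categories are present.
	cats := map[string]bool{}
	for _, c := range actual {
		cats[c] = true
	}
	if len(cats) != 2 {
		return fmt.Errorf("expected 2 categories, found %d", len(cats))
	}
	var tp, tn, fp, fn float64
	for i := range actual {
		switch {
		case actual[i] == positive && predicted[i] == positive:
			tp++
		case actual[i] == negative && predicted[i] == negative:
			tn++
		case actual[i] == negative && predicted[i] == positive:
			fp++
		default: // actual positive, predicted negative
			fn++
		}
	}
	sensitivity := tp / (tp + fn) // true positive rate
	specificity := tn / (tn + fp) // true negative rate
	precision := tp / (tp + fp)   // positive predictive value
	npv := tn / (tn + fn)         // negative predictive value
	f1 := 2 * precision * sensitivity / (precision + sensitivity)
	fmt.Printf("Specificity (True Negative Rate) %f\n", specificity)
	fmt.Printf("Sensitivity (True Positive Rate) %f\n", sensitivity)
	fmt.Printf("Precision (Positive Predictive Value) %f\n", precision)
	fmt.Printf("Negative Predictive Value %f\n", npv)
	fmt.Printf("F1 Score: %f\n", f1)
	return nil
}

func main() {
	actual := []string{"pos", "pos", "neg", "neg"}
	predicted := []string{"pos", "neg", "neg", "pos"}
	if err := binaryStats(actual, predicted, "pos", "neg"); err != nil {
		fmt.Println(err)
	}
}
```

Returning an error instead of printing when the target isn't binary means the extra report would only appear when it makes sense.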

@ryanbressler (Owner)

I'm not sure how useful it is to have these reported by the utility (I mostly export the predictions and do my validation elsewhere using ROC AUC), but they could certainly live in the code somewhere for others to use if needed. Thoughts?
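For anyone taking the export-and-validate route, here is a minimal sketch of ROC AUC computed via the rank-sum (Mann-Whitney U) formulation, assuming predictions have been exported as scores with binary labels; none of this is CloudForest code:

```go
package main

import (
	"fmt"
	"sort"
)

// rocAUC computes the area under the ROC curve from scores and binary
// labels using the rank-sum formulation, averaging ranks over tied scores.
func rocAUC(scores []float64, labels []bool) float64 {
	n := len(scores)
	idx := make([]int, n)
	for i := range idx {
		idx[i] = i
	}
	// Sort indices by ascending score.
	sort.Slice(idx, func(a, b int) bool { return scores[idx[a]] < scores[idx[b]] })

	// Assign 1-based ranks, averaging within blocks of tied scores.
	ranks := make([]float64, n)
	for i := 0; i < n; {
		j := i
		for j < n && scores[idx[j]] == scores[idx[i]] {
			j++
		}
		avg := float64(i+j+1) / 2
		for k := i; k < j; k++ {
			ranks[idx[k]] = avg
		}
		i = j
	}

	var nPos, nNeg, rankSum float64
	for i, positive := range labels {
		if positive {
			nPos++
			rankSum += ranks[i]
		} else {
			nNeg++
		}
	}
	return (rankSum - nPos*(nPos+1)/2) / (nPos * nNeg)
}

func main() {
	scores := []float64{0.9, 0.8, 0.4, 0.3}
	labels := []bool{true, true, false, true}
	fmt.Printf("AUC: %f\n", rocAUC(scores, labels)) // 0.666667
}
```

Averaging the ranks of tied scores gives the same result as the usual trapezoidal ROC AUC.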
