Functional Confusion Matrix with Multi-Label #100
Comments
Hi! Thanks for your contribution, great first issue!
So I guess the more useful expected behavior would be the Multi-Label Confusion Matrix output.
I'm in the same case as you, but I don't expect the same output as you. In my opinion the output should be a list/Tensor of binary confusion matrices of length num_classes, as in this sklearn example. Personally, I iterate over the predictions and targets.
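The per-class iteration described above can be sketched in plain PyTorch. This is my own illustrative implementation (the function name, argument names, and the 0.5 threshold default are assumptions, not part of any library API); it builds one 2x2 binary confusion matrix per class, laid out like sklearn's `multilabel_confusion_matrix`:

```python
import torch

def multilabel_confusion_matrices(preds, target, threshold=0.5):
    """Per-class 2x2 binary confusion matrices for multi-label data.

    preds:  float tensor of shape (N, C), e.g. sigmoid probabilities.
    target: 0/1 tensor of shape (N, C) with the ground truth.
    Returns a (C, 2, 2) long tensor, one [[TN, FP], [FN, TP]] per class.
    """
    pred_labels = (preds >= threshold).long()
    target = target.long()
    num_classes = preds.shape[1]
    out = torch.zeros(num_classes, 2, 2, dtype=torch.long)
    for c in range(num_classes):
        t, p = target[:, c], pred_labels[:, c]
        out[c, 0, 0] = ((t == 0) & (p == 0)).sum()  # true negatives
        out[c, 0, 1] = ((t == 0) & (p == 1)).sum()  # false positives
        out[c, 1, 0] = ((t == 1) & (p == 0)).sum()  # false negatives
        out[c, 1, 1] = ((t == 1) & (p == 1)).sum()  # true positives
    return out

# Small worked example: 3 samples, 2 labels.
preds = torch.tensor([[0.9, 0.2], [0.3, 0.8], [0.7, 0.6]])
target = torch.tensor([[1, 0], [0, 1], [0, 1]])
cm = multilabel_confusion_matrices(preds, target)
# cm[0] == [[1, 1], [0, 1]]  (one FP on class 0, from sample 2)
# cm[1] == [[1, 0], [0, 2]]
```

A vectorized version (e.g. via `torch.bincount` on `2 * t + p`) would avoid the Python loop, but the loop keeps the TN/FP/FN/TP layout explicit.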
Thanks for the response! Yeah, I looked into sklearn and understand what I should be getting now. I am doing what you suggested to get the result now.
IMO yes, if the philosophy is to create a PyTorch-optimised sklearn.
🐛 Bug
I am trying to analyze a model that has multi-label predictions. When creating a confusion matrix with the functional confusion_matrix method, I get a much different result than expected. I may be misunderstanding how this is supposed to work, so any help would be appreciated!

To Reproduce
Steps to reproduce the behavior:

1. Take a model output with torch.sigmoid applied, of shape (N, C), and matching-shape truth data.
2. Call the functional confusion_matrix method on the data.

Code sample
Expected behavior
I would expect the confusion matrix to count the classes that were predicted for each true class, but I may be wrong.
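For reference, the sklearn behavior mentioned in the comments above produces one binary 2x2 matrix per label rather than a single C x C matrix. A minimal sketch (the input arrays are made up for illustration):

```python
import numpy as np
from sklearn.metrics import multilabel_confusion_matrix

# 2 samples, 3 labels, already thresholded to 0/1.
y_true = np.array([[1, 0, 1],
                   [0, 1, 0]])
y_pred = np.array([[1, 0, 0],
                   [0, 1, 1]])

mcm = multilabel_confusion_matrix(y_true, y_pred)
# mcm has shape (3, 2, 2): one [[TN, FP], [FN, TP]] matrix per label.
# Label 2 is wrong in both samples, so mcm[2] == [[0, 1], [1, 0]].
```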
Environment
- How you installed (conda, pip, source): conda

Thanks for the great project and help!!