Describe the bug
When the confidence is very high for images processed by the PatchCore model, the anomaly score can be nan.
This is caused by the formula:
weights = 1 - (torch.max(torch.exp(confidence)) / torch.sum(torch.exp(confidence)))
in anomaly_map.py: for large inputs, torch.exp() overflows float32 to inf, and the subsequent inf / inf division produces nan (see the repro sketch below).
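A minimal standalone repro of the overflow, assuming default float32 tensors; the values are illustrative stand-ins for real high-confidence distances:

```python
import torch

confidence = torch.tensor([100.0, 95.0, 90.0])  # float32 by default

# exp overflows float32 (max ~3.4e38, reached near exp(88.7)), so every
# element becomes inf, and the subsequent inf / inf division yields nan.
exp_conf = torch.exp(confidence)
print(exp_conf)                                       # tensor([inf, inf, inf])
print(1 - torch.max(exp_conf) / torch.sum(exp_conf))  # tensor(nan)
```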
Expected behavior
Two possible solutions:
Either a threshold should be set on the tensors, like
confidence[confidence>thres] = thres
Possible values for thres are e.g., 50.0.
Or, even cleaner, replace 1- (torch.max(torch.exp(confidence)) / torch.sum(torch.exp(confidence))) by 1-torch.max(torch.nn.Softmax()(confidence))
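A sketch of both options on the same illustrative tensor; torch.nn.functional.softmax with an explicit dim is used here as the equivalent of the torch.nn.Softmax module suggested above:

```python
import torch

confidence = torch.tensor([100.0, 95.0, 90.0])

# Option 1: clamp before exponentiating. Keeps exp finite, but note that it
# collapses all values above thres onto the same weight.
thres = 50.0
clamped = torch.clamp(confidence, max=thres)
print(1 - torch.max(torch.exp(clamped)) / torch.sum(torch.exp(clamped)))
# tensor(0.6667): all three values were clamped to 50.0, so they tie.

# Option 2: softmax subtracts the maximum internally before exponentiating,
# so it stays finite for arbitrarily large inputs and preserves the ranking.
print(1 - torch.max(torch.nn.functional.softmax(confidence, dim=0)))
# tensor(0.0067), finite
```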
Hi, thanks for spotting this, and apologies for the late response. I agree that the latter solution, replacing the expression with 1 - torch.max(torch.nn.Softmax()(confidence)), would be the best way to address this. Would you be interested in submitting a PR for this so you can become a contributor?