Loss_res is NaN #1

Open

arid29 opened this issue Sep 21, 2023 · 1 comment

arid29 commented Sep 21, 2023

Hi, as I was trying to reproduce the code, I found that Loss_res was somehow NaN, which made it non-differentiable and broke the code. Can you please help me with this?

Thanks

@MickeyLLG

I revised the code and this works for me.

import torch
import torch.nn as nn
import torch.nn.functional as F


class ResLoss(nn.Module):

    def __init__(self, reduction='mean'):
        super(ResLoss, self).__init__()
        self.reduction = reduction

    def forward(self, output, target, target_weight=None):
        # target_weight is unused; kept for API compatibility.
        B, K, _, _ = output.shape
        heatmaps_pred = output.reshape((B, K, -1))
        mark_i = torch.argmax(heatmaps_pred, dim=-1)

        heatmaps_gt = target.reshape((B, K, -1))
        mark_j = torch.argmax(heatmaps_gt, dim=-1)

        # Zero out the peak location of each predicted and ground-truth
        # heatmap, per sample and per keypoint. scatter_ indexes each
        # (b, k) row independently; plain advanced indexing like
        # mask[:, :, mark_i] = 0 would zero those columns for every
        # sample in the batch.
        mask = torch.ones_like(heatmaps_pred)
        mask.scatter_(-1, mark_i.unsqueeze(-1), 0)
        mask.scatter_(-1, mark_j.unsqueeze(-1), 0)
        heatmaps_pred = heatmaps_pred * mask
        heatmaps_gt = heatmaps_gt * mask

        # log_softmax on the prediction avoids the log(0) -> NaN that
        # taking log of a softmax output can produce.
        heatmaps_pred = F.log_softmax(heatmaps_pred, dim=-1)
        heatmaps_gt = F.softmax(heatmaps_gt, dim=-1)
        loss = F.kl_div(heatmaps_pred, heatmaps_gt, reduction=self.reduction)
        return loss
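
For reference, a minimal smoke test (the batch size, keypoint count, and heatmap shape below are illustrative, not taken from the repo):

# Random heatmaps with illustrative shapes (B=2, K=17, H=64, W=48).
output = torch.randn(2, 17, 64, 48, requires_grad=True)
target = torch.rand(2, 17, 64, 48)

criterion = ResLoss(reduction='mean')
loss = criterion(output, target)
assert not torch.isnan(loss), "loss is NaN"
loss.backward()  # confirms the loss is differentiable end to end
print(loss.item())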
