Hi!
I want to ask: would the output loss behave any differently on its own? The total loss is currently computed as

return self.embedding_loss(Fx, Fe) + self.config.solver.alpha * self.output_loss(prediction, labels) + lamda * l2_norm  # self.cross_loss(features, labels, keep_prob)

Would it be possible to use only self.output_loss(prediction, labels) here instead?
You should combine embedding_loss and output_loss to reproduce the loss introduced in the original paper, which accounts for both the embedding error and the label-correlation error.
If you use only the label-correlation loss, self.output_loss(prediction, labels), nothing forces the network to minimize the embedding loss, which is definitely not what you want.
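For concreteness, here is a minimal sketch of the combined objective. This is illustrative only, not the repository's exact code: I am assuming embedding_loss is an MSE between the two latent codes, output_loss is the C2AE-style pairwise label-ranking loss, and alpha / lam stand in for self.config.solver.alpha and lamda; I wrote it in PyTorch regardless of what framework this repo actually uses.

```python
import torch

def embedding_loss(Fx, Fe):
    # Aligns the feature embedding Fx with the label embedding Fe
    # (mean squared error between the two latent codes). Assumed form.
    return torch.mean((Fx - Fe) ** 2)

def output_loss(prediction, labels):
    # Pairwise ranking loss in the C2AE style: for each sample,
    # penalize every (positive label, negative label) pair whose
    # scores are not well separated; this is what captures label
    # correlation. Assumed form.
    per_sample = []
    for pred, lab in zip(prediction, labels):
        pos = pred[lab == 1]
        neg = pred[lab == 0]
        if len(pos) == 0 or len(neg) == 0:
            per_sample.append(pred.sum() * 0.0)  # no valid pairs for this sample
            continue
        diff = pos.unsqueeze(1) - neg.unsqueeze(0)  # all (pos, neg) score gaps
        per_sample.append(torch.exp(-diff).mean())
    return torch.stack(per_sample).mean()

def total_loss(Fx, Fe, prediction, labels, params, alpha=0.5, lam=1e-4):
    # Combined objective: embedding alignment + alpha * label ranking
    # + lam * L2 weight regularization, mirroring the line in the issue.
    l2_norm = sum(p.pow(2).sum() for p in params)
    return embedding_loss(Fx, Fe) + alpha * output_loss(prediction, labels) + lam * l2_norm
```

If you keep only output_loss, the total reduces to the ranking term plus regularization, so nothing ever ties Fx to Fe and the two latent spaces are free to drift apart.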