
loss #6

Open
elilth opened this issue Jul 28, 2018 · 1 comment
Comments

elilth commented Jul 28, 2018

Hi!
I want to ask: is there any difference if only the output loss is used? The loss is currently computed as:

return self.embedding_loss(Fx, Fe) + self.config.solver.alpha * self.output_loss(prediction, labels) + lamda * l2_norm  # self.cross_loss(features, labels, keep_prob)

Would it be possible to use just self.output_loss(prediction, labels) here?

@greeness

You should combine embedding_loss and output_loss to reproduce the loss introduced in the original paper, which accounts for both the embedding error and the label correlation error.

If you only use the label-correlation loss (self.output_loss(prediction, labels)), the network is no longer forced to minimize the embedding loss, which is definitely not what you want.
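For illustration, here is a minimal sketch (plain NumPy, not the repository's TensorFlow code) of how the two terms combine into the total objective quoted above. The simple squared-error forms, and the alpha and lamda values, are assumptions for the sake of the example rather than the exact losses defined in the paper or in this repo:

```python
import numpy as np

def embedding_loss(Fx, Fe):
    # Align the feature embedding Fx with the label embedding Fe
    # (assumed squared-error form for illustration).
    return np.mean(np.sum((Fx - Fe) ** 2, axis=1))

def output_loss(prediction, labels):
    # Penalize prediction error on the labels; the paper's label-correlation
    # term is more involved, a plain squared error stands in here.
    return np.mean(np.sum((prediction - labels) ** 2, axis=1))

def total_loss(Fx, Fe, prediction, labels, weights, alpha=1.0, lamda=1e-4):
    # Same structure as the quoted line:
    #   embedding_loss + alpha * output_loss + lamda * l2_norm
    l2_norm = sum(np.sum(w ** 2) for w in weights)
    return (embedding_loss(Fx, Fe)
            + alpha * output_loss(prediction, labels)
            + lamda * l2_norm)
```

Dropping the embedding_loss term from total_loss leaves only the label-prediction objective, so nothing ties Fx and Fe together during training.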
