Some inconsistencies between the paper and the code #2
Comments
I would also like to know more about the second question, which confuses me deeply. What's more, I have repeated the experiments in the paper and the accuracy drops by several percentage points. I think this is related to the two questions above.
I agree with you. I think this operation is tricky, since 'loss_add' is obtained from the best predictions of the pseudo-label miner in previous epochs, which is unfair to the other baselines. Moreover, according to my ablation, this trick improves performance considerably.
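For context, here is a minimal sketch of how such an extra term could be formed, under my own assumptions rather than taken from the repository: cache the pseudo-label miner's best predictions from earlier epochs and add a cross-entropy term on the confident unlabeled nodes. The names `best_pred`, `unlabeled_idx`, and `threshold` are hypothetical.

```python
import torch
import torch.nn.functional as F

def pseudo_label_loss(log_probs, best_pred, unlabeled_idx, threshold=0.9):
    """Cross-entropy on unlabeled nodes whose cached prediction from a
    previous epoch (the best one kept so far) is sufficiently confident.
    log_probs: current log-softmax outputs, N x C
    best_pred: cached class probabilities from earlier epochs, N x C
    unlabeled_idx: LongTensor of unlabeled node indices
    """
    conf, pseudo_labels = best_pred[unlabeled_idx].max(dim=1)  # cached confidence and label
    mask = conf > threshold                                    # keep only confident nodes
    if mask.sum() == 0:
        return torch.tensor(0.0, device=log_probs.device)
    idx = unlabeled_idx[mask]
    return F.nll_loss(log_probs[idx], pseudo_labels[mask])
```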
I think it may be caused by the versions of the packages.
It hints that the problem happens in the ReLU() function.
Hi, Dai
There are some inconsistencies between the paper and the code, listed below.
1---The paper writes that f_p and f_e will be pretrained, but in the code it seems you just compute the feature cosine similarities to get the potential edge set at the very beginning (and this step is essential; without it the performance greatly decreases). I don't see any pretraining step. (This similarity step is sketched below, after the list.)
2---According to the paper, the total loss is composed of L_E (the reconstruction loss), L_P (the cross-entropy loss of the pseudo-label predictor on the training set), and L_G (the cross-entropy loss of the final classifier), and the objective is written as argmin L_G + α·L_E + β·L_P.
In contrast, line 133 of NRGNN.py reads `total_loss = loss_gcn + loss_pred + self.args.alpha * rec_loss + self.args.beta * loss_add`, and loss_add is not consistent with L_P. Apparently there are four components in the code, and loss_pred is the L_P of the paper. Are there any details about loss_add in the paper that I missed? (See the sketch after this list.)
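To make the two points above concrete, here is a minimal sketch under my own assumptions (not the repository's actual code): the similarity-based candidate edge set from point 1, and the four-term loss from point 2 set against the paper's three-term objective. Names such as `candidate_edges`, `n_candidates`, and the loss arguments are hypothetical placeholders.

```python
import torch
import torch.nn.functional as F

def candidate_edges(features, n_candidates=50):
    """Candidate edge set from cosine similarity of raw node features,
    computed once before training (the step point 1 refers to)."""
    normed = F.normalize(features, p=2, dim=1)       # L2-normalize rows
    sim = normed @ normed.t()                        # pairwise cosine similarity
    sim.fill_diagonal_(-1.0)                         # exclude self-loops
    _, nbrs = sim.topk(n_candidates, dim=1)          # k most similar nodes per node
    rows = torch.arange(features.size(0), device=features.device)
    rows = rows.unsqueeze(1).expand_as(nbrs)
    return torch.stack([rows.reshape(-1), nbrs.reshape(-1)])   # 2 x E edge index

# Point 2: the paper's objective is min L_G + alpha * L_E + beta * L_P,
# whereas the quoted line from NRGNN.py combines four terms.
def total_loss(loss_gcn, loss_pred, rec_loss, loss_add, alpha, beta):
    # loss_gcn ~ L_G, loss_pred ~ L_P, rec_loss ~ L_E in the paper's notation
    return loss_gcn + loss_pred + alpha * rec_loss + beta * loss_add
```

In this reading, loss_pred plays the role of L_P and rec_loss the role of L_E, while loss_add is an additional term that does not appear in the written objective.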
Thanks