
Teacher could not acquire grads from student's cross entropy loss #6

Open · TheBobbyliu opened this issue Sep 23, 2022 · 0 comments
According to the article, in the second step we compute the cross-entropy loss of S' on samples from a separate quiz set, and then take the gradients of that loss with respect to the teacher's parameters.
The problem is that the teacher's parameters do not appear to be connected to the student's loss in the computation graph, so when I run the code in meta_loops.py, at line 115:
t_grads = torch.autograd.grad(s_prime_loss, t_model.parameters())
I only get None for all of the teacher's gradients.
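For what it's worth, None gradients here usually mean the graph from the teacher to s_prime_loss was cut, e.g. if S' was produced by a regular optimizer.step() (which updates parameters under no_grad and detaches them from the teacher). A minimal sketch of the fix, using toy single-layer "models" and a hypothetical learning rate (not the actual code in meta_loops.py): do the inner student update functionally with create_graph=True, so S' stays a differentiable function of the teacher:

```python
import torch
import torch.nn.functional as F

torch.manual_seed(0)

# Toy stand-ins for the teacher and student: single linear layers.
t_weight = torch.randn(2, 4, requires_grad=True)   # teacher parameter
s_weight = torch.randn(2, 4, requires_grad=True)   # student parameter

x_train = torch.randn(8, 4)          # batch the teacher pseudo-labels
x_quiz = torch.randn(8, 4)           # separate quiz batch
y_quiz = torch.randint(0, 2, (8,))   # quiz labels

# Teacher produces soft pseudo-labels; student trains on them.
pseudo = F.softmax(x_train @ t_weight.t(), dim=1)
s_logits = x_train @ s_weight.t()
s_loss = -(pseudo * F.log_softmax(s_logits, dim=1)).sum(dim=1).mean()

# Differentiable inner update: build S' as a function of the teacher
# instead of calling optimizer.step(), which would detach the graph
# and make the teacher gradients come back as None.
lr = 0.1  # hypothetical inner learning rate
s_grad = torch.autograd.grad(s_loss, s_weight, create_graph=True)[0]
s_prime_weight = s_weight - lr * s_grad  # still connected to t_weight

# Quiz loss of S' now depends on the teacher through s_prime_weight,
# so the teacher gradient is defined and nonzero.
s_prime_loss = F.cross_entropy(x_quiz @ s_prime_weight.t(), y_quiz)
t_grads = torch.autograd.grad(s_prime_loss, [t_weight])
print(t_grads[0] is not None)
```

If the repo updates the student in place with an ordinary optimizer, something like the functional update above (or a library such as higher) would be needed to keep the teacher in the graph.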
