I found that performing forward-backward in a loop is much faster than using autograd.grad with retain_graph=True. Your current code is:
https://github.com/kuc2477/pytorch-ewc/blob/master/model.py#L75
After I change it to:
for batch, label in zip(buffer_data, buffer_label):
    # One forward-backward pass per batch.
    self.zero_grad()
    loglikelihood = F.cross_entropy(self(batch), label)
    loglikelihood.backward()
    # Accumulate squared gradients for the diagonal Fisher estimate.
    for n, p in self.named_parameters():
        n = n.replace('.', '__')
        grads[n] = grads.get(n, 0) + p.grad ** 2
it runs much faster. Can you please investigate this?
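For reference, here is a minimal self-contained sketch of the two patterns I am comparing. The toy model, data sizes, and timing loop are placeholders of my own, and fisher_autograd_grad is only my reading of what the linked model.py code does, not a copy of it:

import time
import torch
import torch.nn as nn
import torch.nn.functional as F

torch.manual_seed(0)
net = nn.Sequential(nn.Linear(32, 64), nn.ReLU(), nn.Linear(64, 10))  # placeholder model
data = [torch.randn(1, 32) for _ in range(200)]                       # placeholder samples
labels = [torch.randint(0, 10, (1,)) for _ in range(200)]

def fisher_backward_loop():
    # The pattern from this issue: one forward-backward pass per sample,
    # reading squared gradients out of p.grad.
    grads = {}
    for x, y in zip(data, labels):
        net.zero_grad()
        loss = F.cross_entropy(net(x), y)
        loss.backward()
        for n, p in net.named_parameters():
            grads[n] = grads.get(n, 0) + p.grad ** 2
    return grads

def fisher_autograd_grad():
    # My reading of the linked code: compute per-sample log-likelihoods over
    # one shared graph, then differentiate each one with retain_graph=True so
    # the shared graph survives every autograd.grad call.
    grads = {}
    names = [n for n, _ in net.named_parameters()]
    params = [p for _, p in net.named_parameters()]
    x, y = torch.cat(data), torch.cat(labels)
    loglikelihoods = F.log_softmax(net(x), dim=1)[torch.arange(len(y)), y]
    for ll in loglikelihoods:
        gs = torch.autograd.grad(ll, params, retain_graph=True)
        for n, g in zip(names, gs):
            grads[n] = grads.get(n, 0) + g ** 2
    return grads

for fn in (fisher_autograd_grad, fisher_backward_loop):
    start = time.time()
    fn()
    print(f"{fn.__name__}: {time.time() - start:.3f}s")

My guess at the cause: loss.backward() lets each sample's graph be freed as soon as it is used, while retain_graph=True keeps the shared graph alive across all of the autograd.grad calls.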
Thank you.
Thanh Tung