Hi, I noticed that when calling `torch.autograd.grad`, the implementation does not pass the `create_graph` parameter, which defaults to `False`. See L#86 and L#116 of the `Meta` class. To compute higher-order derivatives we need to set `create_graph=True`; otherwise gradients will not flow back through `fast_weights` into the meta-gradient computation, making the update first-order only. Is it intentional not to set the `create_graph` parameter?
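To illustrate the point, here is a minimal, self-contained sketch (with hypothetical variable names, not code from this repo) of a single MAML-style inner step. With `create_graph=True` the fast weight remains a differentiable function of the initial weight, so the outer gradient picks up the second-order term; with the default `create_graph=False`, the inner gradient is treated as a constant:

```python
import torch

# Meta-parameter (the initial weight we differentiate through).
w = torch.tensor([1.0], requires_grad=True)
x, y = torch.tensor([2.0]), torch.tensor([1.0])

# Inner-loop loss on the support data.
inner_loss = ((w * x - y) ** 2).mean()

# create_graph=True keeps the graph of this gradient computation,
# so fast_w below is still a differentiable function of w.
(g,) = torch.autograd.grad(inner_loss, w, create_graph=True)
fast_w = w - 0.1 * g  # one inner SGD step with lr=0.1

# Outer (query) loss evaluated at the fast weight; its gradient
# w.r.t. w now includes the second-order term through g.
outer_loss = ((fast_w * x - y) ** 2).mean()
(meta_g,) = torch.autograd.grad(outer_loss, w)
print(meta_g)  # tensor([0.1600])
```

If the inner call used the default `create_graph=False`, `g` would carry no graph, `fast_w` would be `w` minus a constant, and `meta_g` would reduce to the first-order approximation (here `0.8` instead of `0.16`).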