2nd Order or 1st Order Approximation? #32
Comments
I think this implementation is only the first-order version of MAML.
@Vampire-Vx @yinxiaojian
You need to pass create_graph=True in order to make the 2nd-order derivatives available (dragen1860#32). With that change, the regularizer coefficient makes a difference.
Is this implementation a 1st-order approximation version of MAML?
In meta.py, when you call autograd.grad you do not pass create_graph=True, which means the gradient operation is not added to the computation graph.
Thus, although the code is designed to compute the 2nd-order derivatives, the inner-loop gradients are treated as constants, so it only implements the 1st-order approximation.
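For reference, here is a minimal sketch (not code from this repository; the toy model, data, and inner learning rate are made up for illustration) of how create_graph=True in torch.autograd.grad keeps the inner-loop gradient in the graph, so the outer (meta) update can differentiate through the adaptation step:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

# Hypothetical toy setup, just to show the effect of create_graph.
torch.manual_seed(0)
model = nn.Linear(4, 1)
loss_fn = nn.MSELoss()
inner_lr = 0.1

x_support, y_support = torch.randn(8, 4), torch.randn(8, 1)
x_query, y_query = torch.randn(8, 4), torch.randn(8, 1)

params = list(model.parameters())  # [weight, bias]

# Inner-loop loss on the support set.
support_loss = loss_fn(model(x_support), y_support)

# Second-order MAML: create_graph=True records the gradient computation
# itself, so the meta-gradient can flow back through the inner update.
grads = torch.autograd.grad(support_loss, params, create_graph=True)

# Functional "fast weights": theta' = theta - alpha * grad(theta).
fast_weights = [p - inner_lr * g for p, g in zip(params, grads)]

# Outer-loop loss on the query set, evaluated with the adapted weights.
query_pred = F.linear(x_query, fast_weights[0], fast_weights[1])
query_loss = loss_fn(query_pred, y_query)

# Meta-gradient w.r.t. the original parameters; this includes the
# second-order terms because the inner gradients stayed in the graph.
meta_grads = torch.autograd.grad(query_loss, params)
print([g.shape for g in meta_grads])
```

With the default create_graph=False (as in meta.py), the inner gradients are detached constants, so differentiating the query loss only propagates through the copy of theta in the update rule and the second-order terms vanish; that is exactly the first-order MAML (FOMAML) approximation.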