More autograd helpers #2055
Conversation
Looks great! I think at some point we should include all of these in our demo notebooks. Just a few minor comments.
Force-pushed from 4f79757 to d25c2e8.
Hi @yaugenst-flex! I wonder if we can serialize the …
Hey @e-g-melo, the thing is that we don't currently have support for n-ary operators; every metric is restricted to be a scalar. I will look into it though, it's on the todo list.
I've created a GUI version of this notebook, which has a different … In the GUI, we can make it like this: [screenshot]
I wonder if we could have something simple, like …
Oh, I saw your comment on Tom's notebook. Anyway, we can wait, no worries.
Yeah, we can definitely add that, and the goal is also to have these more generic functions. But I still have to figure out a good interface for them.
Summary of changes in this PR:

- Fixed a `DeprecationWarning` of `DataArray.values`.
- Added `smooth_min`, `smooth_max`, and `least_squares` implementations to `plugins.autograd.functions`.
- Added a `@scalar_objective` decorator that extracts `DataArray.data` and performs some additional checks on the output data of objective functions. If an objective function is decorated with this, calling `DataArray.values` within the objective function is no longer necessary. Our `grad` and `value_and_grad` implementations in `plugins.autograd` wrap objective functions with this decorator automatically.
- Changes to the `pad` function.
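
For context, here is a minimal, self-contained sketch of the ideas behind these helpers, written against plain `autograd` rather than `tidy3d`. The `_sketch` names, the `tau` smoothing parameter, and the log-sum-exp formulation are illustrative assumptions, not the actual `plugins.autograd` implementations; the real `scalar_objective` additionally handles the `DataArray.data` extraction described above.

```python
# Illustrative sketch only, not the tidy3d implementation.
# Assumptions: smooth max/min via log-sum-exp with a smoothing parameter `tau`,
# and a decorator that squeezes the objective's output and checks it is a scalar.
import functools

import autograd
import autograd.numpy as anp


def smooth_max_sketch(x, tau=0.1):
    """Differentiable approximation of max(x); smaller `tau` is closer to the hard max."""
    return tau * anp.log(anp.sum(anp.exp(x / tau)))


def smooth_min_sketch(x, tau=0.1):
    """Differentiable approximation of min(x)."""
    return -smooth_max_sketch(-x, tau)


def scalar_objective_sketch(fn):
    """Squeeze the objective's output and verify that it is a single value."""

    @functools.wraps(fn)
    def wrapped(*args, **kwargs):
        result = anp.squeeze(fn(*args, **kwargs))
        if result.size != 1:
            raise ValueError("Objective function must return a scalar.")
        return result

    return wrapped


@scalar_objective_sketch
def objective(x):
    # Reward raising the smallest entry of x, in a differentiable way.
    return -smooth_min_sketch(x)


x0 = anp.array([0.1, 0.5, 0.4])
value, gradient = autograd.value_and_grad(objective)(x0)
print(value, gradient)  # value close to -min(x); gradient concentrated on the smallest entry
```

A smooth minimum like this is handy for worst-case objectives: as `tau` shrinks it approaches the hard minimum while keeping nonzero gradients for all entries, which is the usual motivation for such helpers in gradient-based optimization.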