Conversation
This is pretty much exactly what I had planned to do. Nice :). I don't think loading and unloading the second-order tensor twice will be a problem compared to having to do it for the Hessian. For the Hessian I thought something like this could work:

```julia
function hessian{F}(f::F, v::Union{SecondOrderTensor, Vec})
    s = zero(eltype(v))
    gradf = y -> begin
        s, fv1 = gradient(f, y)  # capture the function value in s
        return fv1               # return only the gradient
    end
    fv, ∇fv = gradient(gradf, v)  # fv = gradient, ∇fv = Hessian
    return s, fv, ∇fv
end
```

This might give bad performance, though, because Julia sometimes has problems when you modify a variable in a closure.
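The closure penalty mentioned here comes from Julia boxing any captured variable that is reassigned. A minimal sketch of the usual workaround (illustrative only, not this PR's code) is to mutate a `Ref` cell so the binding itself is never reassigned:

```julia
# Reassigning a captured variable forces Julia to box it, which can
# make the closure type-unstable:
function boxed_sum(xs)
    s = 0.0
    foreach(x -> (s = s + x), xs)  # `s` is reassigned inside the closure
    return s
end

# Workaround: never reassign the binding; mutate a Ref cell instead.
function ref_sum(xs)
    s = Ref(0.0)
    foreach(x -> (s[] += x), xs)   # the binding `s` is only read, so no box
    return s[]
end
```

Both return the same result; only the second lets the compiler infer the type of `s` inside the closure.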
Didn't see your solution until I pushed mine... The problem with yours is that …
But I guess your …
```julia
v_dual = _load(v)
res = f(v_dual)
fv, ∇fv = _extract(res, v)
return ∇fv
```
```julia
return fv, ∇fv
```
?
No? This is not `gradient`, it's `_gradient`, which I use only in `hessian`, since we need a version of `gradient` that returns only the gradient and not the value.
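The split being described — a public `gradient` returning (value, gradient) and an internal gradient-only `_gradient` reused by `hessian` — can be sketched in scalar form. Finite differences stand in for the package's dual-number machinery here; all names are illustrative:

```julia
# Scalar sketch: _gradient returns only the derivative, so hessian can
# simply differentiate it again; gradient wraps it to also return the value.
_gradient(f, x; h = 1e-5) = (f(x + h) - f(x - h)) / 2h
gradient(f, x) = (f(x), _gradient(f, x))
hessian(f, x) = _gradient(y -> _gradient(f, y), x)

f(x) = x^4
fv, ∇f = gradient(f, 2.0)   # 16.0 and ≈ 32.0
∇²f = hessian(f, 2.0)       # ≈ 48.0
```

The point is structural: `hessian` only works cleanly if the inner function it differentiates returns just the gradient.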
Oh, sorry.
It might not be the best solution, though.
Tests should pass now, but maybe we should think a bit more about the Hessian. Will do some other stuff now.
Maybe we should reverse the order as well? You would expect that a function called … OTOH, with the order we have now, the value is always the first output, regardless of whether it is … Thoughts?
@@ -89,7 +89,7 @@ end
```julia
@inline function _extract{D <: Dual}(v::Tensor{2, 2, D}, ::Any)
    @inbounds begin
        v1, v2, v3, v4 = value(v[1,1]), value(v[2,1]), value(v[1,2]), value(v[2,2])
        f = Tensor{2, 2}((v1, v2, v3, v3))
```
kappa
That's already fixed 👍
Oh, right... forgot to update the doctests :) Will do it later.
Seems like it is about two times slower for both …
Will make some better benchmarks later.
You can run only the AD benchmarks with …
Minimized to:

```julia
julia> using StaticArrays

julia> s = rand(SVector{3})
3-element StaticArrays.SVector{3,Float64}:
 0.710116
 0.382896
 0.0640471

julia> s * 0.0
3-element StaticArrays.SVector{3,Float64}:
 0.0
 0.0
 0.0

julia> @code_warntype s * 0.0
```
Same with just `rand` too.
Maybe JuliaLang/julia#19421
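Regressions like this `s * 0.0` inference failure can be caught in the test suite with `@inferred`, which throws if a call's return type cannot be inferred. A generic sketch (on Julia 0.5 the macro lived in `Base.Test`; today it is in the `Test` stdlib; `scale` and the plain `Vector` are stand-ins to keep the sketch dependency-free):

```julia
using Test  # Base.Test on Julia 0.5

# Stand-in for the StaticArrays `s * 0.0` call above.
scale(v, a) = v .* a

v = [0.710116, 0.382896, 0.0640471]
w = @inferred scale(v, 0.0)   # throws if the return type is not inferable
```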
Only the …
The docs (not docstrings) should perhaps say something about the …
Sure. Btw, is it too verbose with the examples in the docstrings? Maybe we can skip examples for the …
The hessian one is a bit verbose, yeah. Could maybe just assign them to variables and not print the result.
```@docs
gradient
hessian
```
Here we give a few examples of differentiating various functions and compare with the analytical solutions.
Perhaps make this into a header, now that the section is becoming quite long: `## Examples`
Anything left to do here?
No, it's not necessary to run benchmarks, since we have now left the original functions untouched.
Weren't the docstrings supposed to be inserted here: https://kristofferc.github.io/ContMechTensors.jl/latest/man/automatic_differentiation.html ?
Yes, https://github.com/KristofferC/ContMechTensors.jl/pull/94/files#diff-3457962e803ef1880a1bf358cb6c0402R24
Docs run on nightly, which is failing? Change docs to run on v0.5?
Fix #92 perhaps
This returns the value of the function by default. How can we do this for hessian? If we do it the way we do it now, that would mean we load the tensor twice and extract the result twice...
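One way around the double load/extract is that forward-mode duals carry the value along for free: a single evaluation yields both value and derivative, and nesting duals would add the second derivative. A toy scalar dual (illustrative only; not the package's actual `Dual` type) shows the idea:

```julia
# Toy forward-mode dual number: one evaluation yields value AND derivative,
# so nothing needs to be loaded or extracted twice.
struct D
    v::Float64   # value part
    d::Float64   # derivative part
end
Base.:+(a::D, b::D) = D(a.v + b.v, a.d + b.d)
Base.:*(a::D, b::D) = D(a.v * b.v, a.v * b.d + a.d * b.v)

g(x) = x * x * x
r = g(D(2.0, 1.0))   # seed the derivative part with 1
# r.v is the value g(2.0), r.d the derivative g′(2.0), in a single pass
```

Nesting duals inside duals (making the fields generic instead of `Float64`) would likewise deliver the second derivative from one evaluation, which is why where the value gets extracted matters for `hessian`.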
TODO:
- Run benchmarks