Community discussions using Optim.jl and Manifolds.jl together #1242

Open · dehann opened this issue May 6, 2021 · 8 comments
dehann (Member) commented May 6, 2021

We need to build a bridge between Manifolds and Optimization routines. This issue is just a place to list ongoing conversations that are already happening in the community.

Affie (Member) commented May 6, 2021

Also, see discussion in RoME JuliaRobotics/RoME.jl#244

dehann (Member, Author) commented May 6, 2021

kellertuer commented

Which routines are you thinking about?

dehann (Member, Author) commented May 14, 2021

TL;DR

> Which routines are you thinking about?

At minimum retract! (or the exact exponential map, if available), to allow the familiar numerical update x = x ⊕ dx during optimization. The log map too. Users will want get_coordinates / vee for ease of use. Basically, let users extend as per the Manifolds.jl docs. For now, our use-cases can be restricted to Riemannian manifolds, so having users implement inner would likely be okay too.
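For concreteness, here is a minimal sketch (not IIF code; function names follow recent Manifolds.jl and may differ between versions) of the operations listed above, on SpecialEuclidean(2):

```julia
using Manifolds

M = SpecialEuclidean(2)           # SE(2): planar rotation + translation
p = identity_element(M)           # a point on the manifold
X = hat(M, p, [0.1, 0.2, 0.05])   # coordinates -> tangent vector (hat is the inverse of vee)

q = retract(M, p, X)              # the x ⊕ dx style update (exponential retraction by default)
Y = log(M, p, q)                  # log map back to the tangent space at p
c = get_coordinates(M, p, Y, DefaultOrthogonalBasis())  # vee: flat coordinates again
```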

I think I have a similar question: what is the minimum set of functions needed to use / integrate / support Manifolds / Manopt / Optim, and to make it easy for newcomers to contribute to the general Julia ecosystem?

It probably boils down to the same user functions that Manopt.jl needs from a bespoke user manifold (assuming <: ManifoldsBase).


We currently depend heavily on both NLsolve and Optim, but are finding problems with even basic SpecialEuclidean(2) tasks, specifically with the solver getting stuck at the [-pi, +pi) wrap-around point. We are testing this with Manopt now and suspect that it will work just fine (we will have results soon). As mentioned a while back, we are interested in supporting much more elaborate manifolds.

We already provide a bunch of the standard manifolds for use in robotics and SLAM (aka "variables and factors"). We are now working to adopt Manifolds.jl as The Standard instead, rather than continuing our own half-baked internal manifolds effort. So we are adopting and extending the abstractions that already exist in Manifolds, Manopt, Optim, Flux, etc. But the learning curve on how to use Manifolds.jl has been steep, hence the tutorial docs, which we will finish soon (JuliaManifolds/Manifolds.jl#355).

What is IncrementalInference.jl (IIF)? It provides tree-based non-Gaussian probabilistic (i.e. sum-product) inference. IIF extends Optim, Manifolds, NLsolve (soon Manopt too), and many other packages. Note, however, that moving IIF exclusively to Manopt and dropping Optim would be a significant change, and is an unlikely path, but we want the features :-) I hope Optim.jl#920 can help show the gaps between Manifolds.jl and Optim.jl so that I and others can help fix them where we can (and help Manopt.jl).

@kellertuer
Copy link

> I think I have a similar question: what is the minimum set of functions needed to use / integrate / support Manifolds / Manopt / Optim, and to make it easy for newcomers to contribute to the general Julia ecosystem?
>
> It probably boils down to the same user functions that Manopt.jl needs from a bespoke user manifold (assuming <: ManifoldsBase).

I don't completely follow here: integrate what, and where? Use Manifolds/Manopt/Optim here? For Manifolds/Manopt, just work with ManifoldsBase, i.e. exp/log/dist (or retract/inverse_retract, to be more general) and, in a lot of cases, vector_transport.
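As a rough sketch of that minimal interface (my reading of the list above, not code from the thread; type and function names follow recent ManifoldsBase and may differ by version), a bespoke manifold only needs something like:

```julia
using LinearAlgebra
using ManifoldsBase
import ManifoldsBase: exp!, log!, distance

# Toy example: a flat manifold where exp/log reduce to vector addition/subtraction.
struct MyFlat <: AbstractManifold{ManifoldsBase.ℝ} end

exp!(::MyFlat, q, p, X) = (q .= p .+ X; q)   # exponential map (in-place)
log!(::MyFlat, X, p, q) = (X .= q .- p; X)   # logarithmic map / inverse retraction
distance(::MyFlat, p, q) = norm(q - p)       # geodesic distance
```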

> We currently depend heavily on both NLsolve and Optim, but are finding problems with even basic SpecialEuclidean(2) tasks,

For me, the SE(2) test case you are referring to ... sorry, I can't follow what the test is even doing. There are no comments, just a lot of commented-out lines, so I don't see which algorithm/solver you plan to use, for example. From that test case I don't even see where you use optimization (Pose2Pose?). Often, if you do Euclidean optimization (e.g. gradient descent) on a manifold, you are basically lost, for sure.

Finally, concerning Manifolds.jl: feel free to reach out if you are stuck on learning something. And if we can, together, extend our library of available manifolds and/or improve the documentation/examples/tutorials for the current ones, that would be great :)

dehann (Member, Author) commented May 17, 2021

> I don't completely follow here: integrate what, and where?

Oh sorry, I mean that this package, IncrementalInference.jl (IIF), also provides a general API for users to extend. The work now is to integrate properly with ManifoldsBase, using either Manopt or Optim, with minimal duplication. I'm avoiding going into details; perhaps the easiest is just to see where IIF currently calls out to either Optim or NLsolve (we will likely add Manopt here too):

```julia
# Root-finding case: solve objResX(x) == 0 in place via NLsolve.
function _solveLambdaNumeric( fcttype::Union{F,<:Mixture{N_,F,S,T}},
                              objResX::Function,
                              residual::AbstractVector{<:Real},
                              u0::AbstractVector{<:Real},
                              islen1::Bool=false ) where {N_,F<:AbstractRelativeRoots,S,T}
  r = NLsolve.nlsolve((res, x) -> res .= objResX(x), u0, inplace=true)
  return r.zero
end

# Minimization case: least-squares cost over the residual via Optim.
function _solveLambdaNumeric( fcttype::Union{F,<:Mixture{N_,F,S,T}},
                              objResX::Function,
                              residual::AbstractVector{<:Real},
                              u0::AbstractVector{<:Real},
                              islen1::Bool=false ) where {N_,F<:AbstractRelativeMinimize,S,T}
  # wrt #467, allow residual to be standardized for the Roots, Minimize, and Parametric cases.
  r = if islen1
    Optim.optimize(x -> (residual .= objResX(x); sum(residual.^2)), u0, Optim.BFGS())
  else
    Optim.optimize(x -> (residual .= objResX(x); sum(residual.^2)), u0)
  end
  return r.minimizer
end
```

> For me, the SE(2) test case you are referring to ... sorry, I can't follow what the test is even doing.

Yeah, there is too much detail to explain here; perhaps just take note of the current SpecialEuclidean(2) → SE(2) calculation we have (using TransformUtils.jl). This code goes from coordinates to group operations and then back to coordinates that the optimization libraries can recognize (basically data wrangling, with the retract step outside of Optim / NLsolve):
https://github.com/JuliaRobotics/RoME.jl/blob/1a34ca9b0012185d342070aab3a6c70bf19b6834/src/factors/Pose2D.jl#L31-L33

This is what I'm fixing now: having either Optim or Manopt do the retract internally, while knowing how users should overload it in the general case (see the sketch below).
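As an illustration of that fix, a hedged sketch of doing the retract inside the objective, so the optimizer only ever sees flat tangent-space coordinates (residual_on_manifold is a hypothetical placeholder, not RoME code):

```julia
using Manifolds, Optim

M  = SpecialEuclidean(2)
p0 = identity_element(M)            # linearization point

function cost(c::AbstractVector)    # c: coordinates in the tangent space at p0
    X = hat(M, p0, c)               # coordinates -> tangent vector
    q = retract(M, p0, X)           # tangent vector -> point on SE(2)
    return residual_on_manifold(q)  # hypothetical: factor residual evaluated at q
end

# The [-pi, +pi) wrap-around never reaches the solver; it is absorbed by retract/log.
# result = Optim.optimize(cost, zeros(3), Optim.NelderMead())
```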

> Often, if you do Euclidean optimization (e.g. gradient descent) on a manifold, you are basically lost, for sure.

Agreed.

I'll just add that IIF is based on graphical models where "variables" and "factors" are the fundamental abstractions. So the connection between all of this should read something like:

  • "What is the retraction for a factor (i.e. the math function(s)) on these variables (each of which is defined on some manifold individually)? Users can then work from this design pattern to extend yet more models of their own."

> Finally, concerning Manifolds.jl: feel free to reach out if you are stuck on learning something. And if we can, together, extend our library of available manifolds and/or improve the documentation/examples/tutorials for the current ones, that would be great :)

Thanks, yes, I definitely want to help. Let me get a bit further; I feel I'm getting closer. It will go much faster once the first design pattern is figured out and written down on paper for reference. The problem here is that everyone recognizes something, but the end-to-end story is still far too disparate and all over the place. I'm trying to boil it down and get the abstractions squared away in a recognizable way.

kellertuer commented

I am still not sure which algorithms you use for the second case, but we have a quasi-Newton with BFGS in Manopt if you would like to support using that; see https://manoptjl.org/stable/solvers/quasi_Newton.html. Note that for now we do not yet have an AD way of getting the gradient; you have to provide the gradient yourself. If the residuals are distances, and we hence have a sum of squared distances, the gradient (even on a manifold) is not that hard to compute (basically minus the logarithmic maps).

In the islen1=false branch I do not see which algorithm is used, but if it is a gradient-based method, the same remark about computing the gradient holds.
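To make the gradient remark concrete: for f(x) = ½ Σᵢ d(x, pᵢ)², the Riemannian gradient is grad f(x) = −Σᵢ log_x(pᵢ). Here is a minimal sketch on the sphere (the f(M, x) signatures follow current Manopt.jl conventions; older versions pass f(x) without the manifold argument):

```julia
using LinearAlgebra, Manifolds, Manopt

M  = Sphere(2)
ps = [normalize(randn(3)) for _ in 1:5]          # some data points on the sphere

f(M, x)      = sum(distance(M, x, p)^2 for p in ps) / 2
grad_f(M, x) = -sum(log(M, x, p) for p in ps)    # "basically minus the log maps"

x_opt = quasi_Newton(M, f, grad_f, [0.0, 0.0, 1.0])  # Riemannian BFGS
```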

Affie (Member) commented May 18, 2021

> I am still not sure which algorithms you use for the second case...

Nelder-Mead (Optim's default when no algorithm is specified).

With the parametric batch solution (still under development), we currently support passing any algorithm choice through to Optim. We default to BFGS with forward AD and a hacky manifold, just to handle "Circular". See:

```julia
function solveConditionalsParametric( fg::AbstractDFG,
                                      frontals::Vector{Symbol};
                                      solvekey::Symbol=:parametric,
                                      autodiff = :forward,
                                      algorithm = Optim.BFGS,
                                      algorithmkwargs = (),  # add manifold to overwrite computed one
                                      options = Optim.Options(allow_f_increases=true,
                                                              time_limit = 100,
                                                              # show_trace = true,
                                                              # show_every = 1,
                                                             ) )
```
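For reference, a minimal sketch of the kind of "hacky manifold" mentioned above, using Optim's manifold interface (retract! and project_tangent!); this Circular type is hypothetical, not the actual IIF implementation:

```julia
using Optim

# Wrap angular coordinates back into [-pi, pi] after every step.
struct Circular <: Optim.Manifold end
Optim.retract!(::Circular, x) = (x .= rem2pi.(x, RoundNearest); x)
Optim.project_tangent!(::Circular, g, x) = g   # tangent space is flat for angles

# Hypothetical usage: pass the manifold through algorithmkwargs, e.g.
# solveConditionalsParametric(fg, frontals; algorithmkwargs=(manifold=Circular(),))
```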
