Problem with some line searches and manifolds #626

This is the simplest reproducer I could find (the script is quoted in full in the comments below). Notice it does converge, but the reported gradient is wrong. BackTracking is also affected; MoreThuente and HagerZhang are OK. Any idea what's going on?
What version of Optim and LineSearches are you using?

Optim.optimize(f, g!, x0, Optim.GradientDescent(manifold=manif, alphaguess=LineSearches.InitialStatic(alpha=.01), linesearch=Optim.Static()), Optim.Options(show_trace=true, allow_f_increases=true, g_tol=1e-6))
In what way is the reported gradient wrong? In the infinity norm or elementwise? Here's what I get:

using Optim, LineSearches

srand(0)
const n = 4
const m = 1                   # unused in this reproducer
M = randn(n,n)
# M = M'M
const A = (M+M')/2            # random symmetric matrix
f(x) = -vecdot(x,A*x)         # objective; minimized on the sphere by A's top eigenvector
g(x) = -2*A*x                 # Euclidean gradient
g!(stor,x) = copy!(stor,g(x))
x0 = randn(n)
manif = Optim.Sphere()        # constrain iterates to the unit sphere
res = Optim.optimize(f, g!, x0,
                     Optim.GradientDescent(manifold=manif,
                                           alphaguess=LineSearches.InitialStatic(alpha=.01),
                                           linesearch=Optim.Static()),
                     Optim.Options(show_trace=true, allow_f_increases=true, g_tol=1e-6))
display(res)
println("vecnorm = $(vecnorm(g(Optim.minimizer(res)), Inf))")

Output:

Results of Optimization Algorithm
* Algorithm: Gradient Descent
* Starting Point: [-1.6072563241277753,-2.48079273065994, ...]
* Minimizer: [-0.7442193383565487,-0.6356629624665635, ...]
* Minimum: -1.293132e+00
* Iterations: 956
* Convergence: true
* |x - x'| ≤ 0.0e+00: false
|x - x'| = 1.32e-09
* |f(x) - f(x')| ≤ 0.0e+00 |f(x)|: true
|f(x) - f(x')| = 0.00e+00 |f(x)|
* |g(x)| ≤ 1.0e-06: false
|g(x)| = 1.92e+00
* Stopped by an increasing objective: false
* Reached Maximum Number of Iterations: false
* Objective Calls: 957
* Gradient Calls: 959
vecnorm = 1.9247471962074876
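As a side note on where that 1.92 comes from: at the constrained minimizer of f(x) = -vecdot(x, A*x) on the unit sphere, x is the unit eigenvector of A for the largest eigenvalue λmax, so the Euclidean gradient g(x) = -2λmax·x is nonzero (here 2 · 1.293132 · 0.744219 ≈ 1.92475, with λmax = -Minimum and 0.744219 = ‖x‖∞ read off the minimizer's first component), while its projection onto the sphere's tangent space vanishes. Here is a minimal check in current Julia; note that Random.seed!(0) draws a different stream than 0.6's srand(0), so the exact numbers will differ, but the identity holds for any symmetric A:

```julia
# Check: at the sphere-constrained minimizer of f(x) = -x'Ax, the Euclidean
# gradient equals -2λmax*x (nonzero), while its tangent projection is ~0.
using LinearAlgebra, Random

Random.seed!(0)          # NB: different stream than Julia 0.6's srand(0)
n = 4
M = randn(n, n)
A = (M + M') / 2

F = eigen(Symmetric(A))           # eigenvalues sorted in ascending order
xstar = F.vectors[:, end]         # unit eigenvector for λmax: the minimizer
g = -2 * A * xstar                # Euclidean gradient = -2λmax * xstar
gtan = g - dot(xstar, g) * xstar  # projection onto the tangent space

println("‖g‖∞  (Euclidean, what the trace reports) = ", norm(g, Inf))
println("‖Pg‖∞ (tangent projection)                = ", norm(gtan, Inf))
```

The first printed norm is O(1) while the second is at roundoff level, which is exactly the discrepancy the trace above shows.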
The gradient reported in the trace is the unprojected (Euclidean) one, while the convergence check should use the projected one, which does go to zero. The problem appears to be triggered by BackTracking and Static combined with GradientDescent, BFGS, or LBFGS (interestingly, ConjugateGradient works!)
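For readers less familiar with the manifold machinery: on the unit sphere, the projected (Riemannian) gradient is just the Euclidean gradient with its radial component removed, and each descent step is followed by a retraction back onto the sphere; the g_tol test should inspect that projected quantity. A minimal sketch of one step, with illustrative helper names (not Optim's internal API):

```julia
# One projected-gradient step on the unit sphere (illustrative helper names,
# not Optim's internals): project the gradient onto the tangent space, step,
# then retract back onto the sphere. Convergence should be judged on the
# projected gradient, not on the raw Euclidean one.
using LinearAlgebra

project_tangent(x, g) = g - dot(x, g) * x  # strip the radial component
retract(y) = y / norm(y)                   # map the trial point back onto the sphere

function sphere_gd_step(g!, x; α = 0.01)
    grad = similar(x)
    g!(grad, x)                     # Euclidean gradient at x
    d = -project_tangent(x, grad)   # steepest descent within the tangent space
    retract(x + α * d)              # feasible next iterate
end
```

With a fixed α this mirrors the InitialStatic(alpha=.01) + Static() combination in the reproducer; the bug discussed here appears to be only in which gradient the convergence test inspects, not in the steps themselves, which is why the iterates still converge.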
StrongWolfe also works :o
So, #628 closed this via its "Fix #626". Did the tests in that PR cover the problems discovered here, @antoine-levitt?
Yep! They now test every first order method with every linesearch. |
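For completeness, a sketch of what such a method × linesearch sweep could look like against this issue's reproducer. This is illustrative only: the real regression tests live in Optim.jl's test suite, constructor defaults may vary across versions, and some combinations (e.g. Static with ConjugateGradient) may need tuning to converge.

```julia
# Illustrative sweep over first-order methods × linesearches on the sphere
# reproducer, checking the tangent-projected gradient at the returned point.
# (Sketch only; the real regression tests live in Optim.jl's test suite.)
using Optim, LineSearches, LinearAlgebra, Random

Random.seed!(0)
n = 4
M = randn(n, n)
A = (M + M') / 2
f(x) = -dot(x, A * x)
g!(stor, x) = (stor .= -2 .* (A * x))
x0 = randn(n)

for method in (GradientDescent, BFGS, LBFGS, ConjugateGradient),
    ls in (LineSearches.Static(), LineSearches.BackTracking(),
           LineSearches.StrongWolfe(), LineSearches.MoreThuente(),
           LineSearches.HagerZhang())

    m = method(manifold = Optim.Sphere(),
               alphaguess = LineSearches.InitialStatic(alpha = 0.01),
               linesearch = ls)
    res = Optim.optimize(f, g!, x0, m,
                         Optim.Options(allow_f_increases = true, g_tol = 1e-6))
    x = Optim.minimizer(res)
    geuc = -2 * (A * x)
    gtan = geuc - dot(x, geuc) * x   # should be ~0 once the fix is in place
    println(nameof(method), " / ", nameof(typeof(ls)),
            ": ‖P∇f‖∞ = ", norm(gtan, Inf))
end
```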