The user may decide to keep pointers or new copies; tests will be made to determine whether that is possible.
Short draft on the design of merit functions
Examples of merit functions for equality-constrained problems:

- `l1` and `l2`: `phi(x, eta) = f(x) + eta * |c(x)|_p`, where `p = 1` or `p = 2`.
- `l2` squared (what is the name of this one again?): `phi(x, eta) = f(x) + eta * |c(x)|_2^2 / 2`.
- Augmented Lagrangian with least-squares multiplier estimates: `phi(x, eta) = f(x) - y(x)' * c(x) + eta * |c(x)|^2 / 2`, where `y(x) = argmin |g(x) + A(x)' * y|^2`.
- Augmented Lagrangian: `phi(x, y, eta) = f(x) - y' * c(x) + eta * |c(x)|^2 / 2`.
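As a concrete check of the formulas above, here is a minimal sketch that evaluates the `l1`/`l2` and squared-`l2` merits on a toy problem. The objective `f`, constraint `c`, and the function names `phi_p`/`phi_sq` are illustrative, not from this issue.

```julia
using LinearAlgebra

# Toy problem (illustrative only): minimize f subject to c(x) = 0.
f(x) = (x[1] - 1)^2 + (x[2] - 2)^2
c(x) = [x[1] + x[2] - 2.0]

# phi(x, eta) = f(x) + eta * |c(x)|_p, with p = 1 or p = 2
phi_p(x, eta; p = 1) = f(x) + eta * norm(c(x), p)

# phi(x, eta) = f(x) + eta * |c(x)|_2^2 / 2
phi_sq(x, eta) = f(x) + eta * norm(c(x))^2 / 2
```

At a feasible point all of these reduce to `f(x)`, regardless of `eta`.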
Important features:
Suggested implementation:
- `AbstractMerit`
- `L1Merit <: AbstractMerit`, `AugLagMerit <: AbstractMerit`
- `fx`, `cx`, `gx` and `Ad` are stored internally, where `fx` is the objective, `cx` holds the constraint values, `gx` is the gradient at `x`, and `Ad` is the Jacobian times the direction `d`.
- `obj(::AbstractMerit, x::Vector; update=true)`
- `directional(::AbstractMerit, x::Vector, d::Vector; update=true)`
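A minimal sketch of what this interface could look like, using plain Julia callables for the objective and constraints (an assumption — a real implementation would presumably wrap an `NLPModel`). Everything except the names taken from the list above (`AbstractMerit`, `L1Merit`, `fx`, `cx`, `obj`, `update`) is illustrative.

```julia
using LinearAlgebra

abstract type AbstractMerit end

# l1 merit phi(x, eta) = f(x) + eta * |c(x)|_1.
mutable struct L1Merit{F,C} <: AbstractMerit
    f::F                  # objective callable (assumption; could be an NLPModel)
    c::C                  # equality-constraint callable
    eta::Float64          # penalty parameter
    fx::Float64           # objective value stored at the last evaluated x
    cx::Vector{Float64}   # constraint values stored at the last evaluated x
end

L1Merit(f, c, eta) = L1Merit(f, c, eta, NaN, Float64[])

# With update=false the stored fx/cx are reused, so the merit can be
# re-evaluated without touching the underlying model.
function obj(merit::L1Merit, x::Vector; update::Bool = true)
    if update
        merit.fx = merit.f(x)
        merit.cx = merit.c(x)
    end
    return merit.fx + merit.eta * norm(merit.cx, 1)
end
```

The `update` keyword is what lets a line search query the merit repeatedly while controlling when the stored quantities are refreshed.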
Discussion:
- `LineModel` has a lot more liberty and deals with an `nlp` directly; it can also compute `grad!` on an `nlp`.
- Should `AbstractMerit` be an `NLPModel`? (I think no.)
- How do we avoid computing `grad!` on the `nlp` from line search and trust region?
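One possible answer to the last question, sketched under the same assumptions as above: for the smooth squared-`l2` merit, the directional derivative along `d` is `g(x)' * d + eta * c(x)' * (A(x) * d)`, so it can be computed entirely from the stored `gx`, `cx`, and `Ad` — the line search never has to call `grad!` itself. The function name below is illustrative.

```julia
using LinearAlgebra

# Directional derivative of phi(x, eta) = f(x) + eta * |c(x)|_2^2 / 2 along d,
# using only quantities the merit object already stores:
#   D phi(x; d) = gx' * d + eta * cx' * Ad
directional_sq(gx::Vector, cx::Vector, Ad::Vector, d::Vector, eta::Real) =
    dot(gx, d) + eta * dot(cx, Ad)
```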