Add AGS global solver #194
Changes from 18 commits
@@ -31,7 +31,7 @@ Better yet, run some algorithm for a really long time until the minimum *f*<sub>

Global optimization
-------------------

-All of the global-optimization algorithms currently require you to specify bound constraints on all the optimization parameters. Of these algorithms, only ISRES and ORIG_DIRECT support nonlinear inequality constraints, and only ISRES supports nonlinear equality constraints. (However, any of them can be applied to nonlinearly constrained problems by combining them with the [augmented Lagrangian method](#Augmented_Lagrangian_algorithm.md) below.)
+All of the global-optimization algorithms currently require you to specify bound constraints on all the optimization parameters. Of these algorithms, only ISRES, AGS, and ORIG_DIRECT support nonlinear inequality constraints, and only ISRES supports nonlinear equality constraints. (However, any of them can be applied to nonlinearly constrained problems by combining them with the [augmented Lagrangian method](#Augmented_Lagrangian_algorithm.md) below.)

**Something you should consider** is that, after running the global optimization, it is often worthwhile to then use the global optimum as a starting point for a local optimization to "polish" the optimum to a greater accuracy. (Many of the global optimization algorithms devote more effort to searching the global parameter space than to finding the precise position of the local optimum accurately.)
@@ -123,6 +123,24 @@ Some references on StoGO are:

Only bound-constrained problems are supported by this algorithm.

### AGS

This algorithm is adapted from [this repository](https://github.com/sovrasov/glob_search_nlp_solver).
AGS can handle arbitrary objectives and nonlinear inequality constraints; bound constraints are also required for this method. To guarantee convergence, the objective and constraints should satisfy the Lipschitz condition on the specified hyperrectangle.
AGS is derivative-free and employs the Hilbert curve to reduce the source problem to a univariate one. The algorithm divides the univariate space into intervals, generating new points by using posterior probabilities. On each trial AGS evaluates the constraints sequentially, one by one; if some constraint is violated at the trial point, the remaining ones are not evaluated. If all constraints are satisfied, i.e. the trial point is feasible, AGS evaluates the objective. Thus some of the constraints (all except the first one) and the objective can be partially undefined inside the search hyperrectangle. The current implementation of AGS does not support vector constraints.
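The Hilbert-curve ("evolvent") reduction mentioned above can be illustrated with the classic index-to-coordinates construction for a 2-D Hilbert curve. This is a self-contained sketch, not the AGS evolvent code (which supports higher dimensions and a configurable density); the function names here are hypothetical:

```cpp
#include <utility>  // std::swap

// Map an index d in [0, 4^order) to cell coordinates (x, y) on a
// 2^order x 2^order grid along a Hilbert curve. Consecutive indices
// always land on adjacent cells, which is why a 1-D search along d
// inherits spatial locality in 2-D.
void d2xy(int order, int d, int* x, int* y) {
  int rx, ry, t = d;
  *x = *y = 0;
  for (int s = 1; s < (1 << order); s *= 2) {
    rx = 1 & (t / 2);
    ry = 1 & (t ^ rx);
    if (ry == 0) {  // rotate the quadrant so sub-curves connect
      if (rx == 1) {
        *x = s - 1 - *x;
        *y = s - 1 - *y;
      }
      std::swap(*x, *y);
    }
    *x += s * rx;
    *y += s * ry;
    t /= 4;
  }
}

// A univariate parameter t in [0, 1) is then mapped into the unit
// square by discretizing it to a curve index; the discretization
// fineness plays the role of the evolvent density.
void evolvent2d(double t, int order, double* px, double* py) {
  int cells = 1 << (2 * order);  // 4^order cells on the curve
  int d = (int)(t * cells);
  if (d >= cells) d = cells - 1;
  int x, y;
  d2xy(order, d, &x, &y);
  int side = 1 << order;
  *px = (x + 0.5) / side;
  *py = (y + 0.5) / side;
}
```

A univariate optimizer can then explore `t` in [0, 1) while the objective is evaluated at `evolvent2d(t, ...)`; this is the sense in which the multivariate problem is reduced to a one-dimensional one.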
Limitations of machine arithmetic do not allow building a tight approximation of the Hilbert curve when the space dimension is greater than 5, so this implementation of AGS is restricted in that sense.

> **Reviewer:** If it is just slightly less reliable for dimensions > 5, it is probably worth allowing it to be used there so that people can experiment… or does the code not work at all in higher dimensions?
>
> **Author:** Ok, increased the limit to 10, added a warning that sometimes the method can stop early for high-dimensional problems.
AGS, like StoGO, is written in C++, but it requires C++11. If the library is built with [C++](NLopt_Installation.md) and the compiler supports C++11, AGS will be built too.

AGS is specified within NLopt by `NLOPT_GN_AGS`. Additional parameters of AGS which are not adjustable from the common NLopt interface are declared and described in `ags.h`. An example of solving a constrained problem is also given in the AGS source folder.

References:

- Yaroslav D. Sergeyev, Dmitri L. Markin: An algorithm for solving global optimization problems with nonlinear constraints, Journal of Global Optimization, 7(4), pp. 407–419, 1995
- Strongin R.G., Sergeyev Ya.D., 2000. Global optimization with non-convex constraints. Sequential and parallel algorithms. Kluwer Academic Publishers, Dordrecht.
- Gergel V. and Lebedev I.: Heterogeneous Parallel Computations for Solving Global Optimization Problems. Proc. Comput. Science 66, pp. 53–62 (2015)
- [Implementation](https://github.com/sovrasov/multicriterial-go) of AGS for constrained multi-objective problems.
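The interval-division scheme described above can be sketched in its simplest unconstrained univariate form (Strongin's global search, without the Hilbert-curve reduction or constraint handling). Everything below is an illustrative simplification under those assumptions, not the NLopt code; `r` plays the same role as the `ags_r` reliability parameter:

```cpp
#include <algorithm>
#include <cmath>
#include <vector>

// Minimize f over [a, b] assuming f is Lipschitz. On each trial the
// interval with the largest "characteristic" R is subdivided; R trades
// off interval length against the function values at its endpoints.
double strongin_minimize(double (*f)(double), double a, double b,
                         double r, int max_trials, double* x_best) {
  std::vector<double> xs = {a, b};  // trial points, kept sorted
  std::vector<double> zs = {f(a), f(b)};
  for (int trial = 2; trial < max_trials; ++trial) {
    // Adaptive estimate of the Lipschitz constant from observed slopes,
    // inflated by the reliability parameter r > 1.
    double M = 0;
    for (size_t i = 1; i < xs.size(); ++i)
      M = std::max(M, std::fabs(zs[i] - zs[i - 1]) / (xs[i] - xs[i - 1]));
    double m = (M > 0) ? r * M : 1.0;
    // Choose the interval with the largest characteristic.
    size_t best = 1;
    double bestR = -1e300;
    for (size_t i = 1; i < xs.size(); ++i) {
      double d = xs[i] - xs[i - 1], dz = zs[i] - zs[i - 1];
      double R = m * d + dz * dz / (m * d) - 2 * (zs[i] + zs[i - 1]);
      if (R > bestR) { bestR = R; best = i; }
    }
    // New trial point: the midpoint, shifted toward the lower endpoint
    // (the shift is at most d/(2r), so it stays inside the interval).
    double xn = 0.5 * (xs[best] + xs[best - 1])
              - (zs[best] - zs[best - 1]) / (2 * m);
    xs.insert(xs.begin() + best, xn);
    zs.insert(zs.begin() + best, f(xn));
  }
  size_t k = std::min_element(zs.begin(), zs.end()) - zs.begin();
  *x_best = xs[k];
  return zs[k];
}
```

When `r` times the estimated slope dominates the true Lipschitz constant, the trial points accumulate at the global minimizer; AGS applies this kind of machinery to the one-dimensional image of the multivariate problem under the Hilbert curve.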
### ISRES (Improved Stochastic Ranking Evolution Strategy)

This is my implementation of the "Improved Stochastic Ranking Evolution Strategy" (ISRES) algorithm for nonlinearly-constrained global optimization (or at least semi-global; although it has heuristics to escape local optima, I'm not aware of a convergence proof), based on the method described in:
@@ -0,0 +1,102 @@
// A C-callable front-end to the AGS global-optimization library.
// -- Vladislav Sovrasov

#include "ags.h"
#include "solver.hpp"
#include <iostream>
#include <cstring>
#include <vector>

double ags_eps = 0;
double ags_r = 3;
double eps_res = 0.001;
unsigned evolvent_density = 12;
int ags_refine_loc = 0;
int ags_verbose = 0;

int ags_minimize(unsigned n, nlopt_func func, void *data, unsigned m, nlopt_constraint *fc,
                 double *x, double *minf, const double *l, const double *u, nlopt_stopping *stop)
{
  int ret_code = NLOPT_SUCCESS;

  if (n > ags::solverMaxDim)
    return NLOPT_INVALID_ARGS;
  if (m != nlopt_count_constraints(m, fc) || m > ags::solverMaxConstraints)
    return NLOPT_INVALID_ARGS;

  std::vector<double> lb(l, l + n);
  std::vector<double> ub(u, u + n);
  std::vector<ags::NLPSolver::FuncPtr> functions;
  // Wrap each scalar NLopt constraint in a lambda; AGS expects plain
  // double(const double*) functors.
  for (unsigned i = 0; i < m; i++)
  {
    if (fc[i].m != 1) // vector-valued constraints are not supported
      return NLOPT_INVALID_ARGS;
    functions.push_back([fc, data, n, i](const double* x) {
      double val = 0;
      nlopt_eval_constraint(&val, NULL, &fc[i], n, x);
      return val;
    });
  }
  // The objective goes last, after all constraints.
  functions.push_back([func, data, n](const double* x) {return func(n, x, NULL, data);});

  // Map NLopt stopping criteria and the ags_* globals onto AGS parameters.
  ags::SolverParameters params;
  params.r = ags_r;
  params.itersLimit = stop->maxeval;
  params.eps = ags_eps;
  params.evolventDensity = evolvent_density;
  params.epsR = eps_res;
  params.stopVal = stop->minf_max;
  params.refineSolution = (bool)ags_refine_loc;

  ags::NLPSolver solver;
  solver.SetParameters(params);
  solver.SetProblem(functions, lb, ub);

  ags::Trial optPoint;
  try
  {
    auto external_stop_func = [stop, &ret_code]() {
      if (nlopt_stop_time(stop)) {
        ret_code = NLOPT_MAXTIME_REACHED;
        return true;
      }
      else return false;
    };
    optPoint = solver.Solve(external_stop_func);
  }
  catch (const std::runtime_error& exp)
  {
    std::cerr << "AGS internal error: " << std::string(exp.what()) << std::endl;
    return NLOPT_FAILURE;
  }

  if (ags_verbose)
  {
    auto calcCounters = solver.GetCalculationsStatistics();
    auto holderConstEstimations = solver.GetHolderConstantsEstimations();

    std::cout << std::string(20, '-') << "AGS statistics: " << std::string(20, '-') << std::endl;
    for (size_t i = 0; i < calcCounters.size() - 1; i++)
      std::cout << "Number of calculations of constraint # " << i << ": " << calcCounters[i] << "\n";
    std::cout << "Number of calculations of objective: " << calcCounters.back() << "\n";

    for (size_t i = 0; i < holderConstEstimations.size() - 1; i++)
      std::cout << "Estimation of Holder constant of function # " << i << ": " << holderConstEstimations[i] << "\n";
    std::cout << "Estimation of Holder constant of objective: " << holderConstEstimations.back() << "\n";
    if (optPoint.idx != m)
      std::cout << "Feasible point not found" << "\n";
    std::cout << std::string(40, '-') << std::endl;
  }

  // optPoint.idx == m means the returned point satisfies all m constraints.
  if (m == optPoint.idx)
  {
    memcpy(x, optPoint.y, n*sizeof(x[0]));
    *minf = optPoint.g[optPoint.idx];
  }
  else // feasible point not found
    return NLOPT_FAILURE;

  if (solver.GetCalculationsStatistics()[0] >= params.itersLimit)
    return NLOPT_MAXEVAL_REACHED;

  return ret_code;
}
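The front-end above bridges NLopt's C callbacks to the `std::function` objects AGS expects by capturing the callback, its data pointer, and the dimension in a lambda. A self-contained sketch of that pattern, using a hypothetical simplified callback type in place of `nlopt_func`:

```cpp
#include <functional>
#include <vector>

// Functor type in the style of ags::NLPSolver::FuncPtr.
using FuncPtr = std::function<double(const double*)>;

// Simplified C-style callback signature (hypothetical; the real
// nlopt_func also takes a gradient output pointer).
typedef double (*c_callback)(unsigned n, const double* x, void* data);

// Example callback: sum of squares of the coordinates.
double sum_squares(unsigned n, const double* x, void* /*data*/) {
  double s = 0;
  for (unsigned i = 0; i < n; ++i) s += x[i] * x[i];
  return s;
}

// Capture everything the C callback needs, as ags_minimize does for
// each constraint and for the objective.
std::vector<FuncPtr> wrap(c_callback cb, void* data, unsigned n) {
  std::vector<FuncPtr> fs;
  fs.push_back([cb, data, n](const double* x) { return cb(n, x, data); });
  return fs;
}
```

Because the lambda captures by value, the wrapped functors remain valid after `ags_minimize`'s local variables would normally go out of scope, as long as the captured pointers themselves stay alive.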
@@ -0,0 +1,31 @@
/* A C-callable front-end to the AGS global-optimization library.
   -- Vladislav Sovrasov */

#ifndef AGS_H
#define AGS_H

#include "nlopt-util.h"

#ifdef __cplusplus
extern "C" {
#endif

// The algorithm supports 3 types of stopping criteria: stop by execution time,
// stop by reaching a target value, and stop by exceeding the iteration limit.

int ags_minimize(unsigned n, nlopt_func func, void *data, unsigned m, nlopt_constraint *fc,
                 double *x, double *minf, const double *l, const double *u, nlopt_stopping *stop);

extern double ags_eps; // method tolerance in the Holder metric on the 1-d interval; a smaller value gives better search precision but a lower probability of stopping early
extern double ags_r; // reliability parameter; a higher value of r gives slower convergence but a higher chance of catching the global minimum
extern double eps_res; // parameter which prevents the method from paying too much attention to constraints; greater values speed up convergence, but the global minimum can be lost
extern unsigned evolvent_density; // density of the evolvent; the default value of 12 gives a resolution of 2^-12 on the hypercube [0,1]^N, i.e. the maximum search accuracy is 2^-12; if the search hypercube is large, the density can be increased accordingly to achieve better accuracy
extern int ags_refine_loc; // refine the final optimum using the built-in local optimizer
extern int ags_verbose; // print additional info

#ifdef __cplusplus
}
#endif

#endif
@@ -0,0 +1,75 @@
/*
Copyright (C) 2018 Sovrasov V. - All Rights Reserved
 * You may use, distribute and modify this code under the
 * terms of the MIT license.
 * You should have received a copy of the MIT license with
 * this file. If not visit https://opensource.org/licenses/MIT
*/
#pragma once

#include <stdexcept>
#include <string>

#define NLP_SOLVER_ERROR(msg) throw std::runtime_error(std::string(msg))
#define NLP_SOLVER_ASSERT(expr, msg) if(!(expr)) NLP_SOLVER_ERROR(msg)

namespace ags
{

const unsigned solverMaxDim = 5;
const unsigned solverMaxConstraints = 10;

template <class fptype>
class IGOProblem
{
public:
  virtual ~IGOProblem() {} // virtual: problems are used polymorphically via this interface

  virtual fptype Calculate(const fptype* y, int fNumber) const = 0;
  virtual int GetConstraintsNumber() const = 0;
  virtual int GetDimension() const = 0;
  virtual void GetBounds(fptype* left, fptype* right) const = 0;
  virtual int GetOptimumPoint(fptype* y) const = 0;
  virtual fptype GetOptimumValue() const = 0;
};

struct Trial
{
  double x;                           // position on the univariate (evolvent) axis
  double y[solverMaxDim];             // corresponding point in the original space
  double g[solverMaxConstraints + 1]; // constraint values, then the objective
  int idx;                            // index of the first violated function (== m if feasible)
  Trial() {}
  Trial(double _x) : x(_x) {}
};

struct Interval
{
  Trial pl;     // left endpoint
  Trial pr;     // right endpoint
  double R;     // interval characteristic
  double delta; // interval length in the univariate metric
  Interval() {}
  Interval(const Trial& _pl, const Trial& _pr) : pl(_pl), pr(_pr) {}
};

struct CompareIntervals
{
  bool operator() (const Interval* i1, const Interval* i2) const
  {
    return i1->pl.x < i2->pl.x;
  }
};

class CompareByR
{
public:
  bool operator() (const Interval* i1, const Interval* i2) const
  {
    return i1->R < i2->R;
  }
};

}
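The comparator classes above are shaped for use with standard containers: `CompareByR` is the natural ordering for a `std::priority_queue` whose top element is always the interval with the largest characteristic `R`, i.e. the next interval to subdivide. A minimal sketch, restating simplified stand-ins for the structs so it is self-contained:

```cpp
#include <queue>
#include <vector>

// Simplified stand-ins for the ags::Trial / ags::Interval structs.
struct Trial { double x; };
struct Interval {
  Trial pl, pr;
  double R;  // interval characteristic: larger means more promising
};

struct CompareByR {
  bool operator()(const Interval* a, const Interval* b) const {
    return a->R < b->R;  // max-heap ordering on R
  }
};

// top() yields the interval with the largest R.
using IntervalQueue =
    std::priority_queue<Interval*, std::vector<Interval*>, CompareByR>;
```

Storing pointers rather than values keeps reordering cheap and lets the same `Interval` objects also live in an `x`-sorted structure (via `CompareIntervals`) without duplication.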
> **Reviewer:** Add a serial comma after AGS.
>
> **Author:** Didn't know that, used rules from Russian :)