More docs and tests #145

Merged · 8 commits · Jul 15, 2022

Changes from all commits
72 changes: 67 additions & 5 deletions README.md

Large diffs are not rendered by default.

3 changes: 3 additions & 0 deletions docs/make.jl
@@ -32,6 +32,9 @@ makedocs(
"algorithms/surrogate.md",
"algorithms/mts.md",
"algorithms/sdp.md",
"algorithms/metaheuristics.md",
"algorithms/nomad.md",
"algorithms/tobs.md",
],
"Optimization result" => "result.md"
],
88 changes: 80 additions & 8 deletions docs/src/algorithms/algorithms.md

Large diffs are not rendered by default.

2 changes: 1 addition & 1 deletion docs/src/algorithms/auglag.md
@@ -1,4 +1,4 @@
# Augmented Lagrangian algorithm
# Augmented Lagrangian algorithm in pure Julia

## Description

4 changes: 2 additions & 2 deletions docs/src/algorithms/hyperopt.md
@@ -1,8 +1,8 @@
# Multi-start optimization
# Multi-start and hyper-parameter optimization in pure Julia

## Description

[Hyperopt.jl](https://github.com/baggepinnen/Hyperopt.jl) is a Julia library that implements a number of hyperparameter optimization algorithms which can be used to optimize the starting point of the optimization.
[Hyperopt.jl](https://github.com/baggepinnen/Hyperopt.jl) is a Julia library that implements a number of hyperparameter optimization algorithms which can be used to optimize the starting point of the optimization. `NonconvexHyperopt.jl` allows the use of the algorithms in `Hyperopt.jl` as meta-algorithms using the `HyperoptAlg` struct.
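For example, the following is a minimal sketch (assuming, as in the quick start below, that `model` and `x0` are a predefined model and initial solution, that `IpoptAlg` is used as the sub-solver, and that sub-solver options are passed through a `sub_options` keyword):
```julia
using Nonconvex
Nonconvex.@load Hyperopt
Nonconvex.@load Ipopt

# HyperoptAlg wraps any other Nonconvex algorithm as a sub-solver,
# optimizing over the starting point of the sub-solver
alg = HyperoptAlg(IpoptAlg())
options = HyperoptOptions(sub_options = IpoptOptions())
result = optimize(model, alg, x0, options = options)
```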

## Quick start

4 changes: 2 additions & 2 deletions docs/src/algorithms/ipopt.md
@@ -1,8 +1,8 @@
# Ipopt
# Interior point method using `Ipopt.jl`

## Description

[Ipopt](https://coin-or.github.io/Ipopt) is a well known interior point optimizer developed and maintained by COIN-OR. The Julia wrapper of Ipopt is [Ipopt.jl](https://github.com/jump-dev/Ipopt.jl). Nonconvex allows the use of Ipopt.jl using the `IpoptAlg` algorithm struct. Ipopt can be used as a second order optimizer using the Hessian of the Lagrangian. Alternatively, an [l-BFGS approximation](https://en.wikipedia.org/wiki/Limited-memory_BFGS) of the Hessian can be used instead turning Ipopt into a first order optimizer tha only requires the gradient of the Lagrangian.
[Ipopt](https://coin-or.github.io/Ipopt) is a well known interior point optimizer developed and maintained by COIN-OR. The Julia wrapper of Ipopt is [Ipopt.jl](https://github.com/jump-dev/Ipopt.jl). `Ipopt.jl` is wrapped in `NonconvexIpopt.jl`. `NonconvexIpopt` allows the use of `Ipopt.jl` using the `IpoptAlg` algorithm struct. `IpoptAlg` can be used as a second order optimizer computing the Hessian of the Lagrangian in every iteration. Alternatively, an [l-BFGS approximation](https://en.wikipedia.org/wiki/Limited-memory_BFGS) of the Hessian can be used instead, turning `IpoptAlg` into a first order optimizer that only requires the gradient of the Lagrangian.
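For example, a minimal sketch (assuming, as in the quick start below, that `model` and `x0` are a predefined model and initial solution, and that `IpoptOptions` exposes a `first_order` switch):
```julia
using Nonconvex
Nonconvex.@load Ipopt

alg = IpoptAlg()
# first_order = true is assumed here to select the l-BFGS approximation
# of the Hessian, making Ipopt a first order optimizer
options = IpoptOptions(first_order = true)
result = optimize(model, alg, x0, options = options)
```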

## Quick start

56 changes: 56 additions & 0 deletions docs/src/algorithms/metaheuristics.md
@@ -0,0 +1,56 @@
# A collection of meta-heuristic algorithms in pure Julia

## Description

[Metaheuristics.jl](https://github.com/jmejia8/Metaheuristics.jl) is an optimization library with a collection of [metaheuristic optimization algorithms](https://en.wikipedia.org/wiki/Metaheuristic) implemented. `NonconvexMetaheuristics.jl` allows the use of all the algorithms in `Metaheuristics.jl` using the `MetaheuristicsAlg` struct.

The main advantage of metaheuristic algorithms is that they don't require the objective and constraint functions to be differentiable. One advantage of `Metaheuristics.jl` compared to other black-box optimization or metaheuristic packages is that a large number of its algorithms support bounds, inequality, and equality constraints using constraint handling techniques.

## Supported algorithms

`Nonconvex.jl` only supports the single-objective optimization algorithms in `Metaheuristics.jl`. The following algorithms are supported:
- Evolutionary Centers Algorithm (`ECA`)
- Differential Evolution (`DE`)
- Particle Swarm Optimization (`PSO`)
- Artificial Bee Colony (`ABC`)
- Chaotic Gravitational Search Algorithm (`CGSA`)
- Simulated Annealing (`SA`)
- Whale Optimization Algorithm (`WOA`)
- Machine-coded Compact Genetic Algorithm (`MCCGA`)
- Genetic Algorithm (`GA`)

For a summary of the strengths and weaknesses of each algorithm above, please refer to the table in the [algorithms page](https://jmejia8.github.io/Metaheuristics.jl/dev/algorithms/) in the `Metaheuristics` documentation. To define a `Metaheuristics` algorithm, you can use the `MetaheuristicsAlg` algorithm struct which wraps one of the above algorithm types, e.g. `MetaheuristicsAlg(ECA)` or `MetaheuristicsAlg(DE)`.

## Quick start

Given a model `model` and an initial solution `x0`, the following can be used to optimize the model using `Metaheuristics`.
```julia
using Nonconvex
Nonconvex.@load Metaheuristics

alg = MetaheuristicsAlg(ECA)
options = MetaheuristicsOptions(N = 1000) # population size
result = optimize(model, alg, x0, options = options)
```
`Metaheuristics` is an optional dependency of Nonconvex so you need to load the package to be able to use it.

## Options

The options keyword argument to the `optimize` function shown above must be an instance of the `MetaheuristicsOptions` struct when the algorithm is a `MetaheuristicsAlg`. To specify options, use keyword arguments in the constructor of `MetaheuristicsOptions`, e.g.:
```julia
options = MetaheuristicsOptions(N = 1000)
```
All the other options that can be set for each algorithm can be found in the [algorithms section](https://jmejia8.github.io/Metaheuristics.jl/dev/algorithms/) of the `Metaheuristics.jl` documentation. One notable difference between using `Metaheuristics` directly and using it through `Nonconvex` is that in `Nonconvex`, all the options must be passed in through the `options` struct and only the algorithm type is part of the `alg` struct.
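For instance, the following sketch passes an algorithm-specific parameter through the options struct (`η_max` is an `ECA` parameter in `Metaheuristics.jl`; that it can be forwarded this way is an assumption based on the paragraph above):
```julia
alg = MetaheuristicsAlg(ECA)
# N is the population size, η_max the maximum stepsize of ECA
options = MetaheuristicsOptions(N = 200, η_max = 2.0)
```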

## Variable bounds

When using `Metaheuristics` algorithms, finite variable bounds are necessary because the initial population is sampled randomly within the finite interval of each variable. Using `Inf` as an upper bound or `-Inf` as a lower bound is therefore not allowed.
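For example (a minimal sketch, assuming a predefined objective function `f` of two variables):
```julia
m = Model(f)
# finite lower and upper bounds are required by Metaheuristics algorithms
addvar!(m, [-5.0, -5.0], [5.0, 5.0])
# addvar!(m, [-Inf, -Inf], [Inf, Inf]) would not work here because the
# initial population cannot be sampled from an infinite interval
```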

## Initialization

Most metaheuristic algorithms are population-based and can accept multiple initial solutions as part of the initial population. In `Nonconvex`, you can specify multiple initial solutions by making `x0` a vector of solutions. However, since `Nonconvex` models support arbitrary collections as decision variables, you must indicate that the `x0` passed in is a population of solutions rather than a single solution that happens to be a vector of vectors. To do so, set the `multiple_initial_solutions` option to `true` in the `options` struct, e.g.:
```julia
options = MetaheuristicsOptions(N = 1000, multiple_initial_solutions = true)
x0 = [[1.0, 1.0], [0.0, 0.0]]
```
When `x0` contains fewer solutions than the population size, random initial solutions between the lower and upper bounds are sampled to complete the initial population.
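Putting the above together, the following is a hypothetical end-to-end sketch (the objective function and bounds are made up for illustration):
```julia
using Nonconvex
Nonconvex.@load Metaheuristics

f(x) = sum(abs2, x) # hypothetical objective function
model = Model(f)
addvar!(model, fill(-5.0, 2), fill(5.0, 2)) # finite bounds are required

alg = MetaheuristicsAlg(ECA)
options = MetaheuristicsOptions(N = 1000, multiple_initial_solutions = true)
x0 = [[1.0, 1.0], [0.0, 0.0]] # 2 seed solutions; the rest of the population is sampled within the bounds
result = optimize(model, alg, x0, options = options)
```
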
10 changes: 4 additions & 6 deletions docs/src/algorithms/minlp.md
@@ -1,12 +1,10 @@
# Mixed integer nonlinear programming (MINLP)
# First and second order mixed integer nonlinear programming algorithms

## Description

There are 2 MINLP solvers available in Nonconvex:
1. [Juniper.jl](https://github.com/lanl-ansi/Juniper.jl) with [Ipopt.jl](https://github.com/jump-dev/Ipopt.jl) as a sub-solver.
2. [Pavito.jl](https://github.com/jump-dev/Pavito.jl) with [Ipopt.jl](https://github.com/jump-dev/Ipopt.jl) and [Cbc.jl](https://github.com/jump-dev/Cbc.jl) as sub-solvers.

These rely on local nonlinear programming solvers and a branch and bound procedure to find a locally optimal solution that satisfies the integerality constraints.
There are 2 first and second order MINLP solvers available in `Nonconvex`:
1. [Juniper.jl](https://github.com/lanl-ansi/Juniper.jl) with [Ipopt.jl](https://github.com/jump-dev/Ipopt.jl) as a sub-solver. `NonconvexJuniper.jl` allows the use of the branch and bound algorithm in `Juniper.jl` using the `JuniperIpoptAlg` struct.
2. [Pavito.jl](https://github.com/jump-dev/Pavito.jl) with [Ipopt.jl](https://github.com/jump-dev/Ipopt.jl) and [Cbc.jl](https://github.com/jump-dev/Cbc.jl) as sub-solvers. `NonconvexPavito.jl` allows the use of the sequential polyhedral outer-approximations algorithm in `Pavito.jl` using the `PavitoIpoptCbcAlg` struct.
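As a minimal sketch of selecting the first solver (assuming `model` is a predefined model with integer variables and `x0` an initial solution, and that the algorithm and options structs are named `JuniperIpoptAlg`/`JuniperIpoptOptions` and `PavitoIpoptCbcAlg`/`PavitoIpoptCbcOptions` as in the quick starts below):
```julia
using Nonconvex
Nonconvex.@load Juniper

alg = JuniperIpoptAlg()
options = JuniperIpoptOptions()
# for Pavito, load it with Nonconvex.@load Pavito and use:
# alg = PavitoIpoptCbcAlg(); options = PavitoIpoptCbcOptions()
result = optimize(model, alg, x0, options = options)
```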

## Juniper + Ipopt

4 changes: 2 additions & 2 deletions docs/src/algorithms/mma.md
@@ -1,8 +1,8 @@
# Method of moving asymptotes (MMA)
# Method of moving asymptotes in pure Julia

## Description

There are 2 versions of MMA that are available in Nonconvex.jl:
There are 2 versions of the method of moving asymptotes (MMA) that are available in `NonconvexMMA.jl`:
1. The original MMA algorithm from the [1987 paper](https://onlinelibrary.wiley.com/doi/abs/10.1002/nme.1620240207).
2. The globally convergent MMA (GCMMA) algorithm from the [2002 paper](https://epubs.siam.org/doi/abs/10.1137/S1052623499362822).

24 changes: 4 additions & 20 deletions docs/src/algorithms/mts.md
@@ -1,9 +1,9 @@
# Multiple Trajectory Search (MTS)
# Multi-trajectory search algorithm in pure Julia

## Description

MTS: Multiple Trajectory Search for Large-Scale Global Optimization, is a derivative-free heuristic optimization method presented in paper [Lin-Yu Tseng and Chun Chen, 2008](https://sci2s.ugr.es/sites/default/files/files/TematicWebSites/EAMHCO/contributionsCEC08/tseng08mts.pdf).
The main algorihtm `MTS` contains three subroutines `localsearch1`, `localsearch2` and `localsearch3`. This module implements all the optimization methods in the paper. People often use the entire `MTS` or only `localsearch1` to optimize functions, while `localsearch2` or `localsearch3` would rarely be used independently. Therefore, the module only exports `MTS` and `localsearch1`.
Multiple trajectory search (MTS) is a derivative-free heuristic optimization method presented by [Lin-Yu Tseng and Chun Chen, 2008](https://sci2s.ugr.es/sites/default/files/files/TematicWebSites/EAMHCO/contributionsCEC08/tseng08mts.pdf).
The `MTS` algorithm is implemented in the `NonconvexSearch.jl` package, which implements all the optimization methods in the paper.

## Quick start

@@ -13,27 +13,11 @@ Using default `MTSOptions()`. `MTS` is used for optimization.
using Nonconvex
Nonconvex.@load MTS

alg = MTSAlg() # Or LS1Alg()
alg = MTSAlg()
options = MTSOptions()
m = Model(f) # assumes an objective function f is already defined
lb = [0, 0]
ub = [5, 5]
# A finite box constraint is required; (in)equality constraints are not supported by MTS methods.
addvar!(m, lb, ub)
x0 = [2.5, 2.5] # initial solution within the bounds
result = optimize(m, alg, x0, options = options)
```

## Options

You can choose which algorithm to use by specifying `option.method`. Avaliable list is `[MTS (default), localsearch1, Nonconvex.localsearch2 (not recommended), Nonconvex.localsearch3 (not recommended)]`.

```julia
alg = MTSAlg() # Or LS1Alg()
LS1_options = MTSOptions(method=localsearch1)
m = Model(f))
lb = [0, 0]
ub = [5, 5]
# Must have a box constraint. And (in)equality constraints are not supported in MTS methods.
addvar!(m, lb, ub)
result = optimize(model, alg, x0, options = options
```
6 changes: 3 additions & 3 deletions docs/src/algorithms/nlopt.md
@@ -1,8 +1,8 @@
# NLopt
# Various optimization algorithms from `NLopt.jl`

## Description

[NLopt](https://github.com/stevengj/nlopt) is an optimization library with a collection of optimization algorithms implemented. Different algorithms have different limitations. To see the limitations of each algorithm, check the [algorithms section](https://nlopt.readthedocs.io/en/latest/NLopt_Algorithms/) of the documentation of NLopt. [NLopt.jl](https://github.com/JuliaOpt/NLopt.jl) is the Julia wrapper of NLopt. Nonconvex allows the use of NLopt.jl using the `NLoptAlg` algorithm struct.
[NLopt](https://github.com/stevengj/nlopt) is an optimization library with a collection of optimization algorithms implemented. [NLopt.jl](https://github.com/JuliaOpt/NLopt.jl) is the Julia wrapper of `NLopt`. `NonconvexNLopt` allows the use of `NLopt.jl` using the `NLoptAlg` algorithm struct.

## Quick start

@@ -64,7 +64,7 @@ For a description of the above algorithms, please refer to the [algorithms secti
---
**Disclaimer:**

Not all the algorithms have been tested with Nonconvex. So if you try one and it doesn't work, please open an issue.
Not all the algorithms have been tested with `Nonconvex`. So if you try one and it doesn't work, please open an issue.

---

40 changes: 40 additions & 0 deletions docs/src/algorithms/nomad.md
@@ -0,0 +1,40 @@
# Nonlinear optimization with the MADS (NOMAD) algorithm for continuous and discrete, constrained optimization

## Description

[NOMAD.jl](https://github.com/bbopt/NOMAD.jl) is an optimization package wrapping the C++ implementation of the [NOMAD algorithm](https://dl.acm.org/doi/10.1145/1916461.1916468). `NonconvexNOMAD` allows the use of `NOMAD.jl` using the `NOMADAlg` struct. `NOMAD.jl` supports continuous and integer decision variables as well as bounds and inequality constraints. Linear equality constraints are also supported when no integer decision variables are in the model.

## Quick start

Given a model `model` and an initial solution `x0`, the following can be used to optimize the model using `NOMAD`.
```julia
using Nonconvex
Nonconvex.@load NOMAD

alg = NOMADAlg()
options = NOMADOptions()
result = optimize(model, alg, x0, options = options)
```
`NOMAD` is an optional dependency of Nonconvex so you need to load the package to be able to use it.

## Algorithm types

There are 3 different variants of the `NOMADAlg` struct:
- `NOMADAlg(:explicit)`
- `NOMADAlg(:progressive)`
- `NOMADAlg(:custom)`

The explicit algorithm ensures all the constraints are satisfied at all times removing any infeasible point from the population. The progressive algorithm allows infeasible points to be part of the population but enforces feasibility in a progressive manner. The custom variant allows the use of flags on each constraint to declare it as `:explicit` or `:progressive`. For instance, assume `model` is the `Nonconvex` model and `g1` and `g2` are 2 constraint functions.
```julia
add_ineq_constraint!(model, g1, flags = [:explicit])
add_ineq_constraint!(model, g2, flags = [:progressive])
```
The above code declares the first constraint as explicit and the second as progressive. In other words, every point violating the first constraint will be removed from the population, while the second constraint will be enforced progressively.

## Options

The options keyword argument to the `optimize` function shown above must be an instance of the `NOMADOptions` struct when the algorithm is a `NOMADAlg`. To specify options, use keyword arguments in the constructor of `NOMADOptions`, e.g.:
```julia
options = NOMADOptions()
```
All the options that can be set can be found in the [`NOMAD.jl` documentation](https://bbopt.github.io/NOMAD.jl/stable/nomadProblem/).
2 changes: 1 addition & 1 deletion docs/src/algorithms/sdp.md
@@ -1,4 +1,4 @@
# Semidifinite programming
# Interior point meta-algorithm for handling nonlinear semidefinite constraints

## Description

2 changes: 1 addition & 1 deletion docs/src/algorithms/surrogate.md
@@ -1,4 +1,4 @@
# Surrogate-assisted Bayesian optimization
# Surrogate-assisted continuous and discrete, constrained optimization

## Description

84 changes: 84 additions & 0 deletions docs/src/algorithms/tobs.md
@@ -0,0 +1,84 @@
# Topology optimization of binary structures (TOBS), a nonlinear binary optimization heuristic

## Description

The method of topology optimization of binary structures ([TOBS](https://www.sciencedirect.com/science/article/abs/pii/S0168874X17305619?via%3Dihub)) was originally developed in the context of optimal distribution of material in mechanical components. The TOBS algorithm only supports binary decision variables. It is a heuristic that relies on sequential linearization of the objective and constraint functions, progressively enforcing the constraints in the process. The resulting binary linear program can be solved using any mixed integer linear programming (MILP) solver, such as `Cbc`. This process is repeated iteratively until convergence. This package implements the heuristic for general binary nonlinear programming problems.

## Construct an instance

To construct an instance of the `TOBS` algorithm, use:
```julia
alg = TOBSAlg()
```
When optimizing a model using `TOBSAlg`, all the variables are assumed to be binary if their lower and upper bounds are 0 and 1 respectively, even if the `isinteger` flag was not used. If any variable has different bounds, the optimization will throw an error.

## Example

In this example, we solve the classic topology optimization problem of minimizing the compliance of a structure subject to a volume constraint. Begin by installing and loading the required packages.

```julia
import Nonconvex
Nonconvex.@load TOBS
using Pkg
Pkg.add("TopOpt")
using TopOpt
```

Define the problem and its parameters using [TopOpt.jl](https://github.com/JuliaTopOpt/TopOpt.jl).

```julia
E = 1.0 # Young’s modulus
v = 0.3 # Poisson’s ratio
f = 1.0 # downward force
rmin = 6.0 # filter radius
xmin = 0.001 # minimum density
V = 0.5 # maximum volume fraction
p = 3.0 # SIMP penalty

# Define FEA problem
problem_size = (160, 100) # size of rectangular mesh
x0 = fill(1.0, prod(problem_size)) # initial design
problem = PointLoadCantilever(Val{:Linear}, problem_size, (1.0, 1.0), E, v, f)
solver = FEASolver(Direct, problem; xmin=xmin)
TopOpt.setpenalty!(solver, p)
cheqfilter = DensityFilter(solver; rmin=rmin) # filter function
comp = TopOpt.Compliance(problem, solver) # compliance function
```

Define the objective and constraint functions.

```julia
obj(x) = comp(cheqfilter(x)) # compliance objective
constr(x) = sum(cheqfilter(x)) / length(x) - V # volume fraction constraint
```

Finally, define the optimization problem using `Nonconvex.jl` and optimize it.

```julia
m = Model(obj)
addvar!(m, zeros(length(x0)), ones(length(x0)))
Nonconvex.add_ineq_constraint!(m, constr)
options = TOBSOptions()

r = optimize(m, TOBSAlg(), x0; options=options)
r.minimizer
r.minimum
```

The following is a visualization of the optimization history using this example.

![histories](https://user-images.githubusercontent.com/84910559/164938659-797a6a6d-3518-4f7b-a4ff-24b43b822080.png)

![gif](https://user-images.githubusercontent.com/19524993/167059067-f08502a8-c62d-4d62-a2df-e132efc5e25c.gif)

## Options

The following are the options that can be set by passing them to `TOBSOptions`, e.g. `TOBSOptions(movelimit = 0.1)`.
- `movelimit`: the maximum move limit in each iteration as a ratio of the total number of variables. Default value is 0.1, i.e. a maximum of 10% of the variables are allowed to flip value in each iteration.
- `convParam`: the tolerance value. The algorithm is considered to have converged if the moving average of the relative change in the objective value over the last `pastN` iterations is less than `convParam`. Default value is 0.001.
- `pastN`: the number of past iterations used to compute the moving average of the relative change in the objective value. Default value is 20.
- `constrRelax`: the amount of constraint relaxation applied to the linear approximation in each iteration. This is the relative constraint relaxation if the violation is higher than `constrRelax` and the absolute constraint relaxation otherwise. Default value is 0.1.
- `timeLimit`: the time limit (in seconds) of each MILP solve for the linearized sub-problem. Default value is 1.0.
- `optimizer`: the `JuMP` optimizer type used to solve the MILP sub-problem. Default value is `Cbc.Optimizer`.
- `maxiter`: the maximum number of iterations for the algorithm. Default value is 200.
- `timeStable`: a boolean value that when set to `true` switches on the time stability filter of the objective's gradient, discussed in the paper. Default value is `true`.
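For instance, a sketch combining several of the options above:
```julia
options = TOBSOptions(
    movelimit = 0.05,   # allow at most 5% of the variables to flip per iteration
    convParam = 0.0001, # tighter tolerance on the moving average of objective change
    pastN = 25,         # average over the last 25 iterations
    maxiter = 300,      # allow up to 300 iterations
)
```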