From c1f7ce4fc9e94b9d885cf449e596e8d952c2bf74 Mon Sep 17 00:00:00 2001
From: Mohamed Tarek
Date: Sun, 13 Jun 2021 21:58:42 +1000
Subject: [PATCH] fix docs

---
 README.md                | 48 ++++++++++++++++++++--------------------
 src/wrappers/hyperopt.jl |  8 ++++---
 2 files changed, 29 insertions(+), 27 deletions(-)

diff --git a/README.md b/README.md
index 199b21e4..c8d16868 100644
--- a/README.md
+++ b/README.md
@@ -155,35 +155,35 @@ r.minimizer # [0.4934, 1.0]
 
 ## Starting point optimization
 
-You can automatically search a good hyperparameter, using methods in Hyperopt.jl. For example, to optimize the starting point of the optimization, use:
+### `RandomSampler`, `LHSampler`, `CLHSampler` or `GPSampler`
+
+You can optimize the initial point `x0` using [`Hyperopt.jl`](https://github.com/baggepinnen/Hyperopt.jl):
+
 ```julia
-r1 = @search_x0 Nonconvex.optimize(
-    m, alg, [1.234, 2.345],
-    options = options, convcriteria = convcriteria,
-)
+import Hyperopt
 
-# If you prefer customized options
-hyperopt_options = X0OptOptions(
-    x0_lb = [0.5, 0.5], x0_rb = [2.8, 2.8],
-    iters=20, searchspace_size=iters,
-    sampler=Hyperopt.RandomSampler(),
-    verbose=true, keepall=true,
-)
-# Searching hyperparameters using customized options.
-r2 = @hypersearch hyperopt_options, Nonconvex.optimize(
-    m, alg, [1.234, 2.345],
-    options = options, convcriteria = convcriteria,
-)
-# Equivalent as above.
-r3 = @search_x0 hyperopt_options, Nonconvex.optimize(
-    m, alg, [1.234, 2.345],
-    options = options, convcriteria = convcriteria,
+sampler = RandomSampler()
+options = HyperoptOptions(
+    sub_options = IpoptOptions(first_order = true),
+    sampler = sampler,
 )
+r = Nonconvex.optimize(m, alg, [1.234, 2.345], options = options)
+```
+
+When optimizing the starting point, the upper and lower bounds on the initial solution must be finite. All the options that can be set in `HyperoptOptions` are documented in `?HyperoptOptions`. The sampler can be replaced by `LHSampler()`, `CLHSampler()` or `GPSampler()`. See the documentation of [`Hyperopt.jl`](https://github.com/baggepinnen/Hyperopt.jl) for more details on the options available for each sampler. For `GPSampler`, `Hyperopt.Min` is always used by default in `Nonconvex.jl`, so you should not pass this argument.
+
+### `Hyperband`
 
-println(r1.minimum)
-println(r2.minimum)
-println(r3.minimum)
+Alternatively, the Hyperband algorithm from `Hyperopt.jl` can be used, where the inner sampler can be of type `RandomSampler`, `LHSampler` or `CLHSampler`:
+```julia
+sampler = Hyperband(R=100, η=3, inner=RandomSampler())
+options = HyperoptOptions(
+    sub_options = max_iter -> IpoptOptions(first_order = true, max_iter = max_iter),
+    sampler = sampler,
+)
+r = Nonconvex.optimize(m, alg, [1.234, 2.345], options = options)
 ```
+The `sub_options` keyword argument must be a function here, mapping the resources allocated by Hyperband to the options of the sub-solver.
 
 ## Custom gradient / adjoint
 
diff --git a/src/wrappers/hyperopt.jl b/src/wrappers/hyperopt.jl
index f0a1e927..a1262e45 100644
--- a/src/wrappers/hyperopt.jl
+++ b/src/wrappers/hyperopt.jl
@@ -3,15 +3,17 @@ end
 
 """
-    HyperoptOptions: Options performing starting point optimization
+    HyperoptOptions: options for performing starting point optimization using Hyperopt.jl
 
 - `sub_options`: options for the sub-optimizer.
-- `x0_lb`: Lower bound of starting point, if don't specify it, the default value will be `nothing`,
+- `lb`: Lower bound of the starting point; if not specified, the default value will be `nothing`,
 then will end up be replaced by the lower bound of optimization problem.
-- `x0_rb`: Hier bound of starting point, same as above.
+- `rb`: Upper bound of the starting point, same as above.
 - `searchspace_size::Integer`: How many potential starting points we generate.
 - `iters::Integer`: Among all generated potential starting points, how many of them will be evaluated.
 - `sampler::Hyperopt.Sampler`: An instance of ['Hyperopt.Sampler'](@ref), which decides search algorithm.
+- `ctol`: the infeasibility tolerance for accepting a solution as feasible.
+- `keep_all`: if `true`, all the solutions of the sub-problems will be saved.
 """
 @params struct HyperoptOptions
     sub_options