
METOD (Multistart With Early Termination of Descents) Algorithm

Multistart is a global optimization technique that applies local descent from a number of random starting points. Multistart can be inefficient, since local descent is run to convergence from every starting point and the same local minimizers are discovered repeatedly. For objective functions with locally quadratic behaviour in the neighbourhoods of local minimizers, the Multistart with Early Termination of Descents (METOD) Algorithm can terminate many local descents early, which can significantly improve efficiency.
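For contrast, below is a minimal sketch of plain Multistart (illustrative only; the multistart function, its parameters, and the use of scipy.optimize.minimize are assumptions for this sketch, not part of metod_alg). Every descent runs to convergence, so descents ending at an already-known minimizer are wasted work:

import numpy as np
from scipy.optimize import minimize

def multistart(f, d, num_points, seed=0):
    """Plain Multistart: run a full local descent from every starting point."""
    rng = np.random.default_rng(seed)
    minimizers = []
    for _ in range(num_points):
        x0 = rng.uniform(0, 1, size=d)
        res = minimize(f, x0)  # every descent runs to convergence
        # Keep the minimizer only if it has not been found already.
        if not any(np.allclose(res.x, m, atol=1e-4) for m in minimizers):
            minimizers.append(res.x)
    return minimizers

minimizers = multistart(lambda x: np.sum((x - 0.5) ** 2), d=2, num_points=5)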

The early termination of descents in METOD is achieved by means of a particular inequality, which holds when trajectories belong to the region of attraction of the same local minimizer and is often violated when the trajectories belong to different regions of attraction.
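A rough illustration of this kind of test is sketched below (a simplification under stated assumptions; see the documentation for the exact inequality and how metod_alg applies it along descent trajectories). It assumes partner points of the form x̃ = x − β∇f(x), where β is a small step size:

import numpy as np

def partner_inequality_holds(x, y, grad, beta=0.01):
    """Sketch of a METOD-style early-termination test (illustrative only)."""
    x_tilde = x - beta * grad(x)  # partner point of x
    y_tilde = y - beta * grad(y)  # partner point of y
    # Points in the same region of attraction tend to satisfy this
    # inequality; points in different regions often violate it.
    return np.linalg.norm(x_tilde - y_tilde) <= np.linalg.norm(x - y)

# A single quadratic has one region of attraction covering the whole
# space, so the inequality holds for any pair of points.
grad = lambda x: 2 * x
print(partner_inequality_holds(np.array([1.0, 0.0]), np.array([0.0, 1.0]), grad))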

Documentation

Documentation for the METOD-Algorithm can be found at https://metod-algorithm.readthedocs.io/.

Installation

To install and test the METOD Algorithm, type the following into the command line:

$ git clone https://github.com/Megscammell/METOD-Algorithm.git
$ cd METOD-Algorithm
$ python setup.py develop
$ pytest

Quickstart

Apply the METOD Algorithm to an objective function and its gradient. The example below uses a rotated quadratic function with a single minimizer at x0.

>>> import numpy as np
>>> import math
>>> import metod_alg as mt
>>>
>>> np.random.seed(90)
>>> d = 2
>>> A = np.array([[1, 0], [0, 10]])
>>> theta = np.random.uniform(0, 2 * math.pi)
>>> rotation = np.array([[math.cos(theta), -math.sin(theta)],
...                      [math.sin(theta), math.cos(theta)]])
>>> x0 = np.array([0.5, 0.2])
>>>
>>> def f(x, x0, A, rotation):
...     """Rotated quadratic objective function, minimized at x0."""
...     return 0.5 * (x - x0).T @ rotation.T @ A @ rotation @ (x - x0)
...
>>> def g(x, x0, A, rotation):
...     """Gradient of f."""
...     return rotation.T @ A @ rotation @ (x - x0)
...
>>> args = (x0, A, rotation)
>>> (discovered_minimizers,
...  number_minimizers,
...  func_vals_of_minimizers,
...  excessive_no_descents,
...  starting_points,
...  no_grad_evals) = mt.metod(f, g, args, d, num_points=10)
>>> assert np.all(np.round(discovered_minimizers[0], 3) == np.array([0.500, 0.200]))
>>> assert number_minimizers == 1
>>> assert np.round(func_vals_of_minimizers, 3) == 0
>>> assert excessive_no_descents == 0
>>> assert np.array(starting_points).shape == (10, d)
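The same call pattern extends to objective functions with more than one minimizer, where early termination pays off. A minimal sketch, assuming the mt.metod signature shown above (the two-minimum objective f2 and gradient g2 below are constructed for illustration and are not part of metod_alg):

import numpy as np
import metod_alg as mt

c1 = np.array([0.2, 0.2])
c2 = np.array([0.8, 0.8])

def f2(x, c1, c2):
    """Objective with two minimizers, at c1 and c2 (quadratic near each)."""
    return min(np.sum((x - c1) ** 2), np.sum((x - c2) ** 2))

def g2(x, c1, c2):
    """Piecewise gradient of f2 (undefined only on the boundary between regions)."""
    c = c1 if np.sum((x - c1) ** 2) <= np.sum((x - c2) ** 2) else c2
    return 2 * (x - c)

out = mt.metod(f2, g2, (c1, c2), 2, num_points=20)
# out[1] is the number of distinct minimizers discovered; both c1 and c2
# would be expected here, with many of the 20 descents terminated early.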

Examples

Examples of the METOD Algorithm applied to two different objective functions are available as Jupyter notebooks and Python scripts. All examples can be found at https://github.com/Megscammell/METOD-Algorithm/tree/master/Examples, and each follows the same layout and structure, making them easy to follow.
