
Optimizers


Optimizer's effect

Different optimizers produce different kinds of results, so they cannot all be handled the same way or stored in the same database table. This is why a system of inheritance is in place. See the Architecture page for an overview of the structure.

Run

The class Run is an abstract class that defines a datetime field recording when the run happened. It also specifies that runs are ordered by that field.
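For illustration, such a base class could look roughly like the following. This is a minimal sketch assuming Django models (the wiki describes abstract classes with database fields and ordering); the field name created is an assumption for illustration, not taken from the project's code.

```python
# Minimal sketch, assuming Django models; the field name "created" is an
# illustrative assumption, the real definition lives in the project's source.
from django.db import models


class Run(models.Model):
    # When the run happened; runs are ordered by this field.
    created = models.DateTimeField(auto_now_add=True)

    class Meta:
        abstract = True
        ordering = ["-created"]
```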

RunScipy

This class is also abstract. It defines the result fields shared by all runs that use a scipy optimizer. Which fields the scipy optimizers have in common is documented on the Architecture page, and the details are in run_scipy.py, where RunScipy is defined.
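As a rough sketch, RunScipy could extend Run like this. The field names below (energy, variables, iterations, history) are assumptions made for illustration; the actual fields are defined in run_scipy.py.

```python
# Hedged sketch of RunScipy; field names are illustrative assumptions,
# see run_scipy.py for the actual definition.
from django.db import models  # Run is the abstract base sketched above


class RunScipy(Run):
    energy = models.FloatField()           # final objective value
    variables = models.JSONField()         # optimized variable values
    iterations = models.IntegerField()     # number of optimizer iterations
    history = models.JSONField(null=True)  # optional per-iteration history

    class Meta(Run.Meta):
        abstract = True
```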

The following classes inherit RunScipy (which inherits Run).

  • RunScipyNelderMead

    • A run that uses the scipy-based Nelder-Mead optimizer (see the sketch after this list).
  • RunScipyBFGS

    • A run that uses the scipy-based BFGS optimizer.
  • RunScipyLBFGSB

    • A run that uses the scipy-based L-BFGS-B optimizer. Note that the model has the same fields as RunScipyBFGS but remains a separate class for now; there may still be differences, so it has not been refactored yet.
  • RunScipyCOBYLA

    • A run that uses the scipy-based COBYLA optimizer.
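The concrete subclasses can stay almost empty because the shared fields come from RunScipy. The sketch below is hypothetical: it shows one such subclass together with a toy scipy call, mapping scipy's OptimizeResult onto the fields assumed above; none of it is copied from the project.

```python
# Hypothetical sketch: a concrete model plus a toy scipy run stored into it.
# The objective function and the field names are assumptions.
from scipy.optimize import minimize


class RunScipyNelderMead(RunScipy):
    """Concrete table for scipy Nelder-Mead runs; shared fields come from RunScipy."""


def objective(x):
    # Toy stand-in for the project's actual cost function.
    return (x[0] - 1.0) ** 2 + (x[1] + 2.0) ** 2


res = minimize(objective, x0=[0.0, 0.0], method="Nelder-Mead")
RunScipyNelderMead.objects.create(
    energy=res.fun,                       # final objective value
    variables=[float(v) for v in res.x],  # optimized parameters
    iterations=res.nit,                   # iteration count reported by scipy
)
```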

RunGradient

RunGradient is also abstract. It defines the result fields shared by all runs that use a so-called gradient-based optimizer. Which fields the gradient-based optimizers have in common is documented on the Architecture page, and the details are in run_gradient.py, where RunGradient is defined.

The following classes inherit RunGradient (which inherits Run).

  • RunGradientNesterov
    • A run that uses the gradient-based Nesterov optimizer (a sketch follows below).
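As with the scipy branch, a rough sketch of what this could look like follows; it again assumes Django models and illustrative field names, while the actual fields live in run_gradient.py.

```python
# Hedged sketch of the gradient-based branch; field names are assumptions,
# see run_gradient.py for the actual definition.
from django.db import models  # Run is the abstract base sketched earlier


class RunGradient(Run):
    energy = models.FloatField()         # final objective value
    variables = models.JSONField()       # optimized variable values
    learning_rate = models.FloatField()  # step size used by the optimizer
    maxiter = models.IntegerField()      # iteration budget

    class Meta(Run.Meta):
        abstract = True


class RunGradientNesterov(RunGradient):
    """Concrete table for gradient-based Nesterov runs; shared fields come from RunGradient."""
```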

More information can be found in Tequila's repository; the implementations here are built on top of tequila's own optimizer calls. See tequila's optimizers if interested.
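For context, tequila exposes these optimizers through its minimize interface. The snippet below is a toy example of calling it: the Hamiltonian and circuit are placeholders, and the attributes exposed on the result object may vary between tequila versions.

```python
# Toy example of running one of these optimizers through tequila itself;
# the Hamiltonian and circuit are placeholders, not the project's objective.
import tequila as tq

H = tq.paulis.X(0)                    # toy Hamiltonian
U = tq.gates.Ry(angle="a", target=0)  # one-parameter circuit
E = tq.ExpectationValue(H=H, U=U)

# method names correspond to the model classes above, e.g. "nelder-mead" or "bfgs"
result = tq.minimize(objective=E, method="nelder-mead", initial_values={"a": 0.5})
print(result.energy, result.variables)
```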
