
Gradient descent variants #984

Open
ozanoktem opened this issue Apr 22, 2017 · 0 comments
ozanoktem commented Apr 22, 2017

We should implement some of the popular stochastic gradient optimisation techniques, such as SGD, SGD+momentum, Adagrad, Adadelta and Adam. These methods find a local optimum (a global one for convex problems) of a differentiable objective; see the nice surveys in this arXiv preprint and this blog post. A minimal sketch of two of the update rules follows below.
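
For orientation, here is a minimal NumPy sketch of the SGD+momentum and Adam update rules. The function names, the `grad` callable and the default parameters are illustrative placeholders, not part of any existing ODL API:

```python
import numpy as np

def sgd_momentum_step(x, v, grad, lr=0.01, momentum=0.9):
    """One SGD+momentum step; grad(x) returns a (stochastic) gradient."""
    v = momentum * v - lr * grad(x)   # accumulate velocity
    return x + v, v

def adam_step(x, m, v, t, grad, lr=0.001, beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adam step with bias-corrected first/second moment estimates (t >= 1)."""
    g = grad(x)
    m = beta1 * m + (1 - beta1) * g        # first moment estimate
    v = beta2 * v + (1 - beta2) * g ** 2   # second moment estimate
    m_hat = m / (1 - beta1 ** t)           # bias correction
    v_hat = v / (1 - beta2 ** t)
    return x - lr * m_hat / (np.sqrt(v_hat) + eps), m, v
```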

Furthermore, this arXiv preprint suggests a gradient descent variant in which the classical squared two-norm proximity term in the gradient step is replaced by a generalised Bregman distance, induced by a more general proper, convex and lower semi-continuous functional.
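
To illustrate the idea (not the preprint's exact scheme): the classical step solves x_{k+1} = argmin_x [ <grad f(x_k), x> + ||x - x_k||^2 / (2*step) ], and replacing the squared two-norm by the Bregman distance D_J(x, x_k) of a convex functional J changes the update. A hedged sketch for the particular choice J(x) = sum(x * log x) on positive vectors, where the step has a closed multiplicative (exponentiated-gradient) form:

```python
import numpy as np

def bregman_gradient_step(x, grad, step=0.1):
    """One gradient step with the squared two-norm replaced by the Bregman
    distance of the negative entropy J(x) = sum(x * log(x)), x > 0.
    Solving grad J(y) = grad J(x) - step * grad(x) gives the closed-form
    exponentiated-gradient update below. The entropy choice and names are
    illustrative assumptions, not taken from the referenced preprint."""
    return x * np.exp(-step * grad(x))
```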

adler-j added a commit to adler-j/odl that referenced this issue May 22, 2017
adler-j added a commit to adler-j/odl that referenced this issue Aug 1, 2017
kohr-h pushed a commit to kohr-h/odl that referenced this issue Aug 31, 2017
mehrhardt pushed a commit to mehrhardt/odl that referenced this issue Sep 19, 2017