
Update "Composing Optimisers" docs #1628

Merged
merged 1 commit into from
Jun 24, 2021
Merged

Update "Composing Optimisers" docs #1628

merged 1 commit into from
Jun 24, 2021

Conversation

@StevenWhitaker (Contributor) commented:
Addresses #1627 (perhaps only partially).

Use `1` instead of `0.001` for the first argument of `ExpDecay` in the example, so that the sentence following the example,

> Here we apply exponential decay to the Descent optimiser.

makes more sense.
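For reference, a minimal sketch of the revised docs example (assuming the Flux 0.12-era API, where `ExpDecay(eta, decay, decay_step, clip)` scales the gradient by its own `eta` before the next optimiser in the chain applies its rate):

```julia
using Flux
using Flux.Optimise: Optimiser, ExpDecay, Descent

# With eta = 1, ExpDecay contributes only its decay schedule, so the
# composed optimiser starts at Descent's own default rate of 0.1.
opt = Optimiser(ExpDecay(1, 0.1, 1000, 1e-4), Descent())
```

With the old `ExpDecay(0.001, ...)`, the composition instead multiplied Descent's 0.1 by an extra factor of 0.001, which is not what "applying exponential decay to the Descent optimiser" suggests.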

It was also suggested in the linked issue that it might be worth changing the default learning rate of `ExpDecay` to `1`. Since this PR doesn't address that, merging it shouldn't necessarily close the issue.
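A hypothetical sketch of that suggested default change (the current default `eta = 0.001` is taken from the Flux docs of that period; treat the exact signature as an assumption):

```julia
using Flux.Optimise: ExpDecay

# 0.12-era default (assumption based on the docs of the time):
ExpDecay()    # equivalent to ExpDecay(0.001, 0.1, 1000, 1e-4)

# The proposal in #1627: default eta = 1, i.e. a pure decay schedule
ExpDecay(1)   # equivalent to ExpDecay(1, 0.1, 1000, 1e-4)
```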

Commit: Use `1` instead of `0.001` for first argument of `ExpDecay` in example.
@darsnack (Member) left a comment:

Thanks! Yeah I agree we should keep the issue open.

@darsnack (Member) commented:
bors r+

@bors bot (Contributor) commented on Jun 24, 2021:

Build succeeded.

@bors bot merged commit e7686b2 into FluxML:master on Jun 24, 2021.
@StevenWhitaker deleted the StevenWhitaker-update-docs branch on June 25, 2021.