Commit
1628: Update "Composing Optimisers" docs r=darsnack a=StevenWhitaker

Addresses #1627 (perhaps only partially).

Use `1` instead of `0.001` for the first argument of `ExpDecay` in the example, so that the sentence following the example,

> Here we apply exponential decay to the `Descent` optimiser.

makes more sense.

It was also suggested in the linked issue (#1627 (comment)) that it might be worth changing the default learning rate of `ExpDecay` to `1`. Since this PR doesn't address that, I'm not sure merging this PR should necessarily close the issue.

Co-authored-by: StevenWhitaker <[email protected]>
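For context, a minimal sketch of the kind of docs example this PR changes, assuming Flux's `Optimiser` composition API and the `ExpDecay(η, decay, decay_step, clip)` signature with defaults; the exact example in the "Composing Optimisers" page may differ:

```julia
using Flux

# Before: ExpDecay's own small default learning rate (0.001) also
# scales the update, so the composition does not read cleanly as
# "exponential decay applied to Descent".
# opt = Optimiser(ExpDecay(0.001), Descent())

# After: with a first argument of 1, ExpDecay contributes only the
# decay schedule (starting at a factor of 1), and Descent supplies
# the actual learning rate.
opt = Optimiser(ExpDecay(1), Descent())
```

Each optimiser in `Optimiser` transforms the gradient in sequence, which is why starting `ExpDecay` at `1` makes the pairing read as pure decay applied on top of `Descent`.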