
Warm-up technique no longer works as documented #4554

Closed
murnanedaniel opened this issue Nov 6, 2020 · 2 comments

Comments

@murnanedaniel

Since the update to >v1.0, this technique (and the instructions in the docs) breaks. Specifically, I get `TypeError: optimizer_step() got an unexpected keyword argument 'epoch'`, since the `optimizer_step()` call has changed to `self.optimizer_step(optimizer, opt_idx, batch_idx, train_step_and_backward_closure)`. I can see how to re-organise `optimizer`, `opt_idx` and `batch_idx`, but what should one do with `train_step_and_backward_closure`? Default it to `None`?

Originally posted by @murnanedaniel in #2934 (comment)
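
For reference, the warm-up override as the pre-1.0 docs showed it looked roughly like this (a sketch from memory; `self.learning_rate` and the 500-step warm-up length are illustrative values). This is the pattern that now raises the `TypeError` above, because Lightning calls the hook with `epoch=...` rather than `current_epoch=...` and expects the closure to be passed through:

```python
import pytorch_lightning as pl

class LitModel(pl.LightningModule):
    # ... training_step / configure_optimizers as usual ...

    # pre-1.0 hook: no closure argument, optimizer stepped directly
    def optimizer_step(self, current_epoch, batch_idx, optimizer, optimizer_idx,
                       second_order_closure=None, on_tpu=False,
                       using_native_amp=False, using_lbfgs=False):
        # linearly scale the learning rate up over the first 500 steps
        if self.trainer.global_step < 500:
            lr_scale = min(1.0, float(self.trainer.global_step + 1) / 500.0)
            for pg in optimizer.param_groups:
                pg["lr"] = lr_scale * self.learning_rate

        optimizer.step()
        optimizer.zero_grad()
```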

@github-actions bot commented Nov 6, 2020

Hi! Thanks for your contribution, great first issue!

@murnanedaniel (Author)

I see from #4455 that this issue will be fixed with new documentation. Specifically, changing `current_epoch` to `epoch` in the `optimizer_step` override, as well as changing `optimizer.step()` to `optimizer.step(optimizer_closure)`.
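
Putting that together, a minimal sketch of the corrected override under the post-1.0 hook signature (the 500-step warm-up length and the `learning_rate` attribute are illustrative, and the exact keyword defaults may differ by Lightning version):

```python
import torch
import pytorch_lightning as pl


class LitModel(pl.LightningModule):
    def __init__(self, learning_rate=1e-3):
        super().__init__()
        self.learning_rate = learning_rate
        self.layer = torch.nn.Linear(32, 2)

    def training_step(self, batch, batch_idx):
        x, y = batch
        return torch.nn.functional.cross_entropy(self.layer(x), y)

    def configure_optimizers(self):
        return torch.optim.SGD(self.parameters(), lr=self.learning_rate)

    # post-1.0 hook: first argument is `epoch`, and the closure must be
    # passed through to optimizer.step() so Lightning can run
    # training_step + backward inside it
    def optimizer_step(self, epoch, batch_idx, optimizer, optimizer_idx,
                       optimizer_closure, on_tpu=False,
                       using_native_amp=False, using_lbfgs=False):
        # linear warm-up over the first 500 optimizer steps
        if self.trainer.global_step < 500:
            lr_scale = min(1.0, float(self.trainer.global_step + 1) / 500.0)
            for pg in optimizer.param_groups:
                pg["lr"] = lr_scale * self.learning_rate

        # do NOT call optimizer.step() bare; pass the closure through
        optimizer.step(closure=optimizer_closure)
```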
