Since the update to > v1.0, this technique (and the instructions in the docs) breaks. Specifically, I get `TypeError: optimizer_step() got an unexpected keyword argument 'epoch'`, since the `optimizer_step()` call has changed to `self.optimizer_step(optimizer, opt_idx, batch_idx, train_step_and_backward_closure)`. I can see how to re-organise `optimizer`, `opt_idx`, and `batch_idx`, but what should one do with `train_step_and_backward_closure`? Default it to `None`?
#4455
I see that this issue will be fixed with new documentation. Specifically: change `current_epoch` to `epoch` in the `optimizer_step` override, and change `optimizer.step()` to `optimizer.step(optimizer_closure)`.
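For reference, an override following those two changes might look roughly like the sketch below. This is not taken from the Lightning docs: the class, the linear warm-up schedule, and the `warmup_steps`/`base_lr` parameters are illustrative assumptions; only the hook signature (with `epoch` and `optimizer_closure`) and the `optimizer.step(optimizer_closure)` call reflect the fix described above.

```python
class WarmupModule:
    # Sketch of a LightningModule subclass (in practice: pl.LightningModule).
    # Demonstrates the post-1.0 hook: accept `epoch` (not `current_epoch`)
    # and pass the closure through to optimizer.step().

    def __init__(self, base_lr=1e-3, warmup_steps=500):
        self.base_lr = base_lr          # illustrative target learning rate
        self.warmup_steps = warmup_steps  # illustrative warm-up length
        self.global_step = 0            # stand-in for self.trainer.global_step

    def optimizer_step(self, epoch, batch_idx, optimizer, optimizer_idx,
                       optimizer_closure, *args, **kwargs):
        # Hypothetical linear learning-rate warm-up over the first steps.
        if self.global_step < self.warmup_steps:
            scale = (self.global_step + 1) / self.warmup_steps
            for pg in optimizer.param_groups:
                pg["lr"] = scale * self.base_lr

        # Key change vs. pre-1.0: the closure runs training_step and
        # backward, so it must be handed to step() rather than discarded.
        optimizer.step(optimizer_closure)
        self.global_step += 1
```

So rather than defaulting the closure to `None`, the closure is forwarded into `optimizer.step()`, matching how `torch.optim` optimizers accept a `closure` argument.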
Originally posted by @murnanedaniel in #2934 (comment)