Hi,

I ran the training code and encountered the following error:

RuntimeError: element 0 of tensors does not require grad and does not have a grad_fn

How can we solve this?

Thanks in advance!
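For reference, here is a minimal sketch (not from the original report) of how this error typically arises: calling backward() on a loss that was computed while gradient tracking was disabled, so the resulting tensor has no grad_fn.

```python
import torch

model = torch.nn.Linear(4, 2)
x = torch.randn(8, 4)
target = torch.randn(8, 2)

# Gradient tracking is off inside this context, so the loss
# is detached from the autograd graph (no grad_fn).
with torch.no_grad():
    loss = torch.nn.functional.mse_loss(model(x), target)

# Raises: RuntimeError: element 0 of tensors does not require grad
# and does not have a grad_fn
loss.backward()
```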
After going through the code and browsing the internet, the main cause of this error is that gradient computation gets disabled somewhere during the training step.

If you look at the training step, the code uses AdamW, a third-party implementation whose step() method carries a torch.no_grad() decorator, in the optimization.py module of transformers.
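To narrow down where gradient tracking gets disabled, a sketch like the following can help; the model, batch keys, and helper name here are placeholders, not taken from the original training code:

```python
import torch

def debug_training_step(model, batch, loss_fn):
    """Check the usual suspects before loss.backward() raises."""
    # 1. Gradient tracking must be globally enabled, i.e. the forward
    #    pass must not run under torch.no_grad() / torch.inference_mode().
    assert torch.is_grad_enabled(), "forward pass is running under no_grad"

    # 2. At least some parameters must require gradients.
    assert any(p.requires_grad for p in model.parameters()), \
        "all parameters are frozen (requires_grad=False)"

    outputs = model(batch["inputs"])
    loss = loss_fn(outputs, batch["targets"])

    # 3. The loss itself must be attached to the autograd graph;
    #    if this fails, something detached it between forward and here.
    assert loss.requires_grad, "loss is detached from the graph"

    loss.backward()
    return loss
```

Running a step through checks like these usually pinpoints whether the problem is a no_grad context around the forward pass, frozen parameters, or a loss that was detached before backward() was called.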