
[ENH] Truncated Backpropagation Through Time #1581

Open
svnv-svsv-jm opened this issue Jun 17, 2024 · 3 comments
Labels
feature request New feature or request

Comments

@svnv-svsv-jm

Is Truncated Backpropagation Through Time supported?

I've found no documentation or examples of it.

@benHeid benHeid changed the title Truncated Backpropagation Through Time [ENH] Truncated Backpropagation Through Time Sep 22, 2024
@benHeid benHeid added the feature request New feature or request label Sep 22, 2024
@benHeid
Collaborator

benHeid commented Sep 22, 2024

It seems that it is not supported. At least I haven't found it in training_step or in the implemented step method. I think this could be a nice add-on for autoregressive or recurrent networks.

To implement it, I suppose we would need to add a new step method for the recurrent models, or add a hook in the step method of the BaseModel for the loss calculation.
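For reference, the core of truncated BPTT is independent of any particular hook: the sequence is split into fixed-length chunks, and the recurrent hidden state is detached between chunks so gradients only flow within a chunk. A minimal sketch in plain PyTorch (all names here are illustrative, not part of the library's API; `tbptt_steps` is a hypothetical truncation length):

```python
import torch
import torch.nn as nn

# Toy recurrent model: LSTM encoder plus a linear output head.
rnn = nn.LSTM(input_size=8, hidden_size=16, batch_first=True)
head = nn.Linear(16, 1)
opt = torch.optim.Adam(list(rnn.parameters()) + list(head.parameters()))

x = torch.randn(4, 100, 8)  # (batch, time, features)
y = torch.randn(4, 100, 1)
tbptt_steps = 25  # hypothetical truncation length

hidden = None
for t0 in range(0, x.size(1), tbptt_steps):
    chunk_x = x[:, t0:t0 + tbptt_steps]
    chunk_y = y[:, t0:t0 + tbptt_steps]
    out, hidden = rnn(chunk_x, hidden)
    loss = nn.functional.mse_loss(head(out), chunk_y)
    opt.zero_grad()
    loss.backward()
    opt.step()
    # Detach so gradients do not flow across chunk boundaries.
    hidden = tuple(h.detach() for h in hidden)
```

A hook in the step method could essentially wrap this loop, with the chunking length exposed as a model parameter.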

@svnv-svsv-jm
Author

PyTorch Lightning used to have this feature, but they removed it. We could take inspiration from their old code?

@benHeid
Collaborator

benHeid commented Oct 3, 2024

I suppose the advice from PyTorch Lightning is to use manual optimization for that.

https://lightning.ai/docs/pytorch/stable/model/manual_optimization.html

Pull Requests are appreciated :)
