
Refactor: Split conv backward to allow conditional gradient computation #2278

Conversation

AsherJingkongChen
Contributor

@AsherJingkongChen AsherJingkongChen commented Sep 13, 2024

Checklist

  • Confirmed that the `run-checks all` script has been executed.
  • Made sure the book is up to date with the changes in this PR.

Related Issues/PRs

Not found.

Changes

In the crate burn-tensor, I split each of conv[1-3]d_backward and their transposed variants into three functions, suffixed with *_x_backward, *_weight_backward, and *_bias_backward.

In the crate burn-autodiff, I changed the backward implementations of the conv ops so that each gradient is computed only when it is required.

The public APIs remain unchanged, since the Autodiff backend (decorator) hides these changes.
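The idea behind the split can be sketched in plain Rust. This is a toy illustration, not the burn codebase: tensors are stubbed as `Vec<f32>`, the function names only mimic the `*_x_backward` / `*_weight_backward` / `*_bias_backward` naming, and the element-wise math stands in for real convolution gradients. The point is the shape of the control flow: each gradient has its own function, and the fused backward calls only the ones that are tracked.

```rust
// Hypothetical sketch of splitting a fused conv backward into three
// independent functions so untracked gradients can be skipped entirely.

type Tensor = Vec<f32>;

/// Gradient w.r.t. the input `x` (toy stand-in for `conv2d_x_backward`).
fn conv_x_backward(grad_out: &Tensor, weight: &Tensor) -> Tensor {
    grad_out.iter().zip(weight).map(|(g, w)| g * w).collect()
}

/// Gradient w.r.t. the weight (toy stand-in for `conv2d_weight_backward`).
fn conv_weight_backward(grad_out: &Tensor, x: &Tensor) -> Tensor {
    grad_out.iter().zip(x).map(|(g, xi)| g * xi).collect()
}

/// Gradient w.r.t. the bias (toy stand-in for `conv2d_bias_backward`).
fn conv_bias_backward(grad_out: &Tensor) -> Tensor {
    vec![grad_out.iter().sum()]
}

/// Backward pass that only computes the gradients that are tracked,
/// mirroring the conditional logic this PR adds in burn-autodiff.
fn conv_backward(
    grad_out: &Tensor,
    x: &Tensor,
    weight: &Tensor,
    x_tracked: bool,
    weight_tracked: bool,
    bias_tracked: bool,
) -> (Option<Tensor>, Option<Tensor>, Option<Tensor>) {
    (
        x_tracked.then(|| conv_x_backward(grad_out, weight)),
        weight_tracked.then(|| conv_weight_backward(grad_out, x)),
        bias_tracked.then(|| conv_bias_backward(grad_out)),
    )
}

fn main() {
    let grad_out = vec![1.0, 2.0];
    let x = vec![3.0, 4.0];
    let weight = vec![0.5, 0.5];
    // Weight is frozen: its gradient is never computed at all.
    let (gx, gw, gb) = conv_backward(&grad_out, &x, &weight, true, false, true);
    assert_eq!(gx, Some(vec![0.5, 1.0]));
    assert_eq!(gw, None);
    assert_eq!(gb, Some(vec![3.0]));
    println!("grad_x = {:?}, grad_weight = {:?}, grad_bias = {:?}", gx, gw, gb);
}
```

With the previous fused backward, all three gradients were produced unconditionally; after the split, the `weight_tracked == false` branch above does no work, which is where the speedup in the use case below comes from.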

Testing

All tests pass on an Apple M2 Pro. I did not add any new tests because the existing tests should be adequate.

@AsherJingkongChen
Contributor Author

AsherJingkongChen commented Sep 14, 2024

Use case

The gist below implements an M-SSIM loss function. I use it to compare performance between main and this PR. The changed version runs faster because no gradient computation is required for Conv2d.weight here.

https://gist.github.com/AsherJingkongChen/9baa00fc1d0503e55a358c55e393fc87


codecov bot commented Sep 14, 2024

Codecov Report

Attention: Patch coverage is 86.24754% with 70 lines in your changes missing coverage. Please review.

Project coverage is 85.92%. Comparing base (6f0e61a) to head (8d0852f).
Report is 5 commits behind head on main.

Files with missing lines Patch % Lines
crates/burn-autodiff/src/ops/module.rs 61.45% 69 Missing ⚠️
crates/burn-tensor/src/tensor/ops/modules/conv.rs 99.55% 1 Missing ⚠️
Additional details and impacted files
@@            Coverage Diff             @@
##             main    #2278      +/-   ##
==========================================
- Coverage   85.92%   85.92%   -0.01%     
==========================================
  Files         750      750              
  Lines       94357    94573     +216     
==========================================
+ Hits        81076    81259     +183     
- Misses      13281    13314      +33     


Member

@nathanielsimard nathanielsimard left a comment


Do we need the WithBias and NoBias variant of each convolution?

@nathanielsimard
Member

> Do we need the WithBias and NoBias variant of each convolution?

Because of the prepare method in the backward pass, it's not yet possible to have only one backward struct.

@nathanielsimard nathanielsimard merged commit 7ac5dee into tracel-ai:main Sep 16, 2024
11 checks passed
@AsherJingkongChen
Contributor Author

> Do we need the WithBias and NoBias variant of each convolution?
>
> Because of the prepare method in the backward pass, it's not yet possible to have only one backward struct.

@nathanielsimard I think so.
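The constraint discussed here can be sketched with hypothetical types (this is an illustration, not the burn codebase): because the state captured during `prepare` differs depending on whether a bias exists, the two cases end up with different backward signatures, so a single backward struct cannot yet cover both.

```rust
// Hypothetical illustration of why WithBias and NoBias backward variants
// both exist: the state saved at prepare time, and the set of gradients
// produced, differ between the two cases.

/// State captured by `prepare` when the convolution has a bias.
struct ConvWithBias {
    x: Vec<f32>,
    weight: Vec<f32>,
}

/// State captured when there is no bias; the bias gradient slot never exists.
struct ConvNoBias {
    x: Vec<f32>,
    weight: Vec<f32>,
}

impl ConvWithBias {
    /// Produces (grad_x, grad_weight, grad_bias); toy element-wise math.
    fn backward(&self, grad_out: &[f32]) -> (Vec<f32>, Vec<f32>, Vec<f32>) {
        let grad_x = grad_out.iter().zip(&self.weight).map(|(g, w)| g * w).collect();
        let grad_w = grad_out.iter().zip(&self.x).map(|(g, x)| g * x).collect();
        let grad_b = vec![grad_out.iter().sum()];
        (grad_x, grad_w, grad_b)
    }
}

impl ConvNoBias {
    /// Produces (grad_x, grad_weight) only: a different signature,
    /// which is why one struct cannot serve both cases.
    fn backward(&self, grad_out: &[f32]) -> (Vec<f32>, Vec<f32>) {
        let grad_x = grad_out.iter().zip(&self.weight).map(|(g, w)| g * w).collect();
        let grad_w = grad_out.iter().zip(&self.x).map(|(g, x)| g * x).collect();
        (grad_x, grad_w)
    }
}

fn main() {
    let conv = ConvWithBias { x: vec![1.0, 2.0], weight: vec![0.5, 0.5] };
    let (gx, gw, gb) = conv.backward(&[2.0, 4.0]);
    assert_eq!(gx, vec![1.0, 2.0]);
    assert_eq!(gw, vec![2.0, 8.0]);
    assert_eq!(gb, vec![6.0]);
    println!("with-bias backward: {:?} {:?} {:?}", gx, gw, gb);
}
```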
