
[autodiff] Make loss seed only set once in the tape #7910

Merged 6 commits into taichi-dev:master on Apr 27, 2023

Conversation

@erizmr (Contributor) commented Apr 27, 2023

Issue: #

Brief Summary

🤖 Generated by Copilot at fdbd83a

This pull request enhances the reverse-mode automatic differentiation (AD) module in `ad/_ad.py`. It simplifies the loss gradient initialization and validation, and removes unnecessary code.

Walkthrough

🤖 Generated by Copilot at fdbd83a

  • Initialize the loss gradient with 1.0 for reverse mode AD in the tape context ([link](https://github.com/taichi-dev/taichi/pull/7910/files?diff=unified&w=0#diff-b986921c47e4b8c903d6bfc906398260dfeb17e16f05e5cd5b52e401eddc0bd0R215), [link](https://github.com/taichi-dev/taichi/pull/7910/files?diff=unified&w=0#diff-b986921c47e4b8c903d6bfc906398260dfeb17e16f05e5cd5b52e401eddc0bd0R228-R233))
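The change described above can be illustrated with a minimal sketch: a tape context manager that seeds d(loss)/d(loss) = 1.0 exactly once on entry, rather than re-seeding it inside every backward pass. The `Scalar` and `Tape` names below are hypothetical stand-ins, not Taichi's actual implementation in `ad/_ad.py`.

```python
# Minimal sketch of a reverse-mode AD tape that seeds the loss
# gradient exactly once, on entering the tape context.
# NOTE: Scalar and Tape here are illustrative toys, not Taichi's API.

class Scalar:
    """A toy differentiable scalar with a value and a gradient slot."""
    def __init__(self, val=0.0):
        self.val = val
        self.grad = 0.0

class Tape:
    """Context manager that records forward calls and seeds the loss grad."""
    def __init__(self, loss):
        self.loss = loss
        self._calls = []          # forward kernels recorded for replay

    def __enter__(self):
        # Seed the loss gradient with 1.0 exactly once, here in the
        # tape context, instead of in every backward invocation.
        self.loss.grad = 1.0
        return self

    def record(self, fn, *args):
        # Record a forward call so it can be replayed in reverse later.
        self._calls.append((fn, args))
        fn(*args)

    def __exit__(self, exc_type, exc, tb):
        # Backward pass: replay recorded calls in reverse order
        # (the actual gradient kernels are elided in this sketch).
        for fn, args in reversed(self._calls):
            pass
        return False

loss = Scalar()
with Tape(loss) as tape:
    tape.record(lambda: None)
print(loss.grad)  # 1.0 — seeded once by the tape, not by the caller
```

Seeding once in the context manager means user code never has to set `loss.grad` manually, and repeated backward runs inside one tape cannot accidentally re-seed or accumulate the seed value.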

netlify bot commented Apr 27, 2023

Deploy Preview for docsite-preview ready!

🔨 Latest commit: 9bbfddf
🔍 Latest deploy log: https://app.netlify.com/sites/docsite-preview/deploys/644a4e557d6044000852f9a9
😎 Deploy Preview: https://deploy-preview-7910--docsite-preview.netlify.app

@erizmr erizmr changed the title [autodiff] Enforce loss seed only set once in the tape [autodiff] Make loss seed only set once in the tape Apr 27, 2023
@erizmr erizmr requested a review from ailzhang April 27, 2023 09:10
@ailzhang (Contributor) left a comment:

Thanks!

@erizmr erizmr merged commit 8820ade into taichi-dev:master Apr 27, 2023
@erizmr erizmr deleted the clear_grad branch April 27, 2023 16:09
quadpixels pushed a commit to quadpixels/taichi that referenced this pull request May 13, 2023
Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>