Add warning message to Gradients and training documentation about ComplexWarnings
#6543
Conversation
Minor suggestions. Don't forget to add your name to the changelog :)
Just left a small comment/question about line 46
Codecov Report

All modified and coverable lines are covered by tests ✅

Coverage diff (master vs. #6543):

|          | master | #6543 | +/- |
|----------|--------|-------|-----|
| Coverage | 99.34% | 99.34% |     |
| Files    | 455    | 455   |     |
| Lines    | 43193  | 43232 | +39 |
| Hits     | 42909  | 42948 | +39 |
| Misses   | 284    | 284   |     |

☔ View full report in Codecov by Sentry.
Co-authored-by: Mudit Pandey <[email protected]>
Add warning message to Gradients and training documentation about `ComplexWarnings` (#6543)

**Context:**
- This PR addresses a recurring `ComplexWarning` that appears in differentiable workflows involving both complex and float types, particularly during backpropagation. Users have reported this issue (see an example [here](https://discuss.pennylane.ai/t/casting-of-complex-to-float-when-using-phaseshift-in-circuit-for-qml/7439)), and [internal discussions](https://xanaduhq.slack.com/archives/C0743CNS9E3/p1730329623028289) have concluded that while the warning is generally harmless, it can still be confusing for users.

**Description of the Change:**
- A warning note has been added to the Gradients and training page in the documentation. The note explains the origin of the `ComplexWarning`, assures users that it is non-critical in this context, and provides a code snippet to suppress the warning.

**Benefits:**
- Clarifies to users that the `ComplexWarning` does not indicate a calculation error, which can help avoid unnecessary troubleshooting.
- Offers an easy way to suppress the warning.

**Possible Drawbacks:**
- Suppressing warnings could potentially hide other, less common issues involving complex numbers in more advanced workflows.
- Users may overlook the warning's informational value if they suppress it without fully understanding the implications.

**Related GitHub Issues:**

---------

Co-authored-by: Mudit Pandey <[email protected]>
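For reference, below is a minimal sketch of the kind of suppression snippet the documentation note describes; it is not necessarily the exact code added in this PR. It assumes NumPy's `ComplexWarning` class (found in `numpy.exceptions` on NumPy ≥ 1.25, and on the top-level `numpy` namespace in older releases) and uses an illustrative `PhaseShift` circuit similar to the one in the linked forum thread.

```python
import warnings

import numpy as np
import pennylane as qml

# NumPy >= 1.25 exposes ComplexWarning under numpy.exceptions; older releases
# expose it on the top-level numpy namespace.
ComplexWarning = getattr(np, "exceptions", np).ComplexWarning

# Silence the (non-critical) warning raised when complex intermediate values
# are cast back to float during backpropagation.
warnings.filterwarnings("ignore", category=ComplexWarning)

dev = qml.device("default.qubit", wires=1)

@qml.qnode(dev, diff_method="backprop")
def circuit(x):
    qml.Hadamard(wires=0)
    qml.PhaseShift(x, wires=0)  # mixing of complex and float dtypes can occur here
    return qml.expval(qml.PauliX(0))

x = qml.numpy.array(0.5, requires_grad=True)
print(qml.grad(circuit)(x))  # the gradient is computed correctly either way
```

Note that filtering by category silences every `ComplexWarning`, including ones that might matter in more advanced complex-valued workflows (the drawback noted above); wrapping the filter in a `with warnings.catch_warnings():` block around just the gradient call is a narrower alternative.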