Make qchem jax compatible #6096
Conversation
Hello. You may have forgotten to update the changelog!
Codecov Report: All modified and coverable lines are covered by tests ✅

Coverage diff (master → #6096):
- Coverage: 99.64% → 99.64% (-0.01%)
- Files: 469 → 468 (-1)
- Lines: 44331 → 44065 (-266)
- Hits: 44173 → 43907 (-266)
- Misses: 158 → 158

☔ View full report in Codecov by Sentry.
Thanks @austingmhuang for fixing this. Left some comments, mostly minor, but my main point is about duplicating many tests for jax, which seems rather unnecessary.
Thanks @austingmhuang, please update the changelog before merging.
Doesn't build because [this PR](PennyLaneAI/pennylane#6096) is not yet merged. The JSON files need to be updated as well.

**Summary:** Fixes the remaining demos that checked for `requires_grad`. Won't build for now due to a missing fix in [this PR](PennyLaneAI/pennylane#6324).

[[sc-69776](https://app.shortcut.com/xanaduai/story/69776)] [[sc-69778](https://app.shortcut.com/xanaduai/story/69778)]

Co-authored-by: GitHub Nightly Merge Action, Mudit Pandey, Mikhail Andrenkov, David Wierichs, Korbinian Kottmann, Ivana Kurečić, Paul Finlay, Jack Brown, bellekaplan, obliviateandsurrender, soranjh, Soran Jahangiri.
Context:
With Autograd being deprecated, we are moving to JAX for automatic differentiation. Since the `requires_grad` keyword is not supported by JAX arrays, we need a different solution.
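To illustrate the incompatibility (a minimal snippet, not taken from this PR): `pnp` arrays carry a `requires_grad` flag, while JAX arrays have no such attribute.

```python
import jax.numpy as jnp
from pennylane import numpy as pnp

x_pnp = pnp.array([0.1, 0.2], requires_grad=True)
x_jax = jnp.array([0.1, 0.2])

print(x_pnp.requires_grad)                    # True
print(getattr(x_jax, "requires_grad", None))  # None: JAX arrays have no such flag
```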
Description of the Change:
We keep backwards compatibility with `pnp` by checking which interface the user is using, which we infer from the interface of the input tensors. If they are using autograd, we stick to the old workflow and check `requires_grad` using `getattr()`. If the user passes a JAX array for any of `[coordinates, coefficients, alpha]`, we assume that the user wants to use JAX and define all undefined coefficients/alphas as JAX arrays. If a user decides to mix `pnp` with JAX, we cannot make that decision for them and do not hard-cast the remaining inputs into either interface; instead, a warning about mixing the two is raised.
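As a rough illustration only (a sketch, not the code added in this PR; the helper name `_detect_interface` is made up), the dispatch could look like this:

```python
import warnings

import jax.numpy as jnp
import pennylane as qml


def _detect_interface(*tensors):
    """Hypothetical helper: pick the JAX path if any input is a JAX array."""
    interfaces = {qml.math.get_interface(t) for t in tensors if t is not None}
    if "jax" in interfaces:
        if "autograd" in interfaces:
            # Mixing pnp and JAX inputs: no hard cast, just a warning.
            warnings.warn("Mixing autograd (pnp) and JAX inputs.")
        return "jax"
    return "autograd"


coordinates = jnp.array([[0.0, 0.0, -0.694], [0.0, 0.0, 0.694]])
print(_detect_interface(coordinates, None, None))  # 'jax'
```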
WHEN USING JAX:
If users wish to differentiate any of these parameters, they should mark the parameters they want to be differentiable using the JAX UI, e.g. `jax.grad(..., argnums=<index or indices of the differentiable parameter(s)>)(*args)`. In our case, due to technical limitations, `*args` must always be exactly `[coordinates, coefficients, alpha]`; no other order is allowed and none of them can be omitted. Unfortunately, this also applies when you are NOT using `jax.grad` or another JAX function: when you call `diff_hamiltonian(...)(*args)` with JAX inputs, the args also need to be exactly `[coordinates, coefficients, alpha]`. When you do decide to differentiate, `jax.grad(..., argnums=1)(coordinates, coefficients, alpha)` means that you want `coefficients` to be differentiable. Note that this is a departure from the UI of `qml.grad`, where you could instead do `qml.grad(..., argnum=0)(coefficients)`.
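A hedged end-to-end sketch of the workflow described above (the geometry values and the scalar `cost` function are placeholders for illustration; it assumes, per this PR, that `mol.coeff` and `mol.alpha` are created as JAX arrays when the coordinates are a JAX array):

```python
import jax
import jax.numpy as jnp
import pennylane as qml

jax.config.update("jax_enable_x64", True)

symbols = ["H", "H"]
coordinates = jnp.array([[0.0, 0.0, -0.672], [0.0, 0.0, 0.672]])

# With JAX coordinates, the undefined basis-set parameters (coeff, alpha)
# are defined as JAX arrays as well.
mol = qml.qchem.Molecule(symbols, coordinates)
coefficients = mol.coeff
alpha = mol.alpha

# diff_hamiltonian returns a function; with JAX inputs its *args must be
# exactly (coordinates, coefficients, alpha), in this order.
h_fn = qml.qchem.diff_hamiltonian(mol)
h = h_fn(coordinates, coefficients, alpha)


def cost(coordinates, coefficients, alpha):
    # Placeholder scalar for illustration: the first coefficient of the Hamiltonian.
    h = h_fn(coordinates, coefficients, alpha)
    return jnp.real(h.terms()[0][0])


# argnums=1 marks `coefficients` as the differentiable argument; all three
# arguments must still be passed, in this exact order.
grad_wrt_coefficients = jax.grad(cost, argnums=1)(coordinates, coefficients, alpha)
```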
Additional notes:
The UI for `qml.grad` and everything else is unchanged for autograd and `pnp` users. However, if you are using JAX and want to use the `args` keyword in `molecular_hamiltonian` and related Hamiltonian constructors, you will need to provide all of `[coordinates, coefficients, alpha]` as well, since it is passed downstream to `diff_hamiltonian(...)(*args)`.
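A short sketch of the `args` keyword under the same assumptions (using the `Molecule`-based call to `molecular_hamiltonian`):

```python
import jax.numpy as jnp
import pennylane as qml

symbols = ["H", "H"]
coordinates = jnp.array([[0.0, 0.0, -0.672], [0.0, 0.0, 0.672]])
mol = qml.qchem.Molecule(symbols, coordinates)

# With JAX inputs, `args` must contain all of [coordinates, coefficients, alpha]
# (in this order), since it is passed downstream to diff_hamiltonian(...)(*args).
h, qubits = qml.qchem.molecular_hamiltonian(
    mol, args=[coordinates, mol.coeff, mol.alpha]
)
```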
Benefits:
The qchem workflow is now JAX compatible.
Possible Drawbacks:
More changes may be needed to support JIT compilation, and there may be performance issues. The UI differs between `qml.grad` and `jax.grad`. The expectations for the `args` keyword differ between JAX arrays and `pnp` arrays.
Related GitHub Issues:
[sc-69776] [sc-69778]