Update scalar.jl to use muladd #102
Conversation
The errors seem to be from `use LogDensityProblemsAD`.
ptiede#1 should fix CI.
Sorry, it was closed accidentally.
Can you please add a test for what was fixed? E.g. a simple logdensity that has a transformation involving ScaledShiftedExp, AD with Enzyme, and check that logdensity_and_gradient is finite.
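A minimal sketch of what such a test could look like (hedged: the interval constructor, `TransformedLogDensity`, and the `:Enzyme` backend symbol follow the TransformVariables.jl / LogDensityProblemsAD.jl APIs, not this repo's actual test suite, and the toy log density is an assumption):

```julia
using TransformVariables, TransformedLogDensities, LogDensityProblemsAD
using LogDensityProblems, Enzyme, Test

# Toy log density on a parameter constrained to (1, 3); the interval
# transform goes through the scaled/shifted logistic internally.
t = as(Real, 1.0, 3.0)
ℓ = TransformedLogDensity(t, x -> -abs2(x))
∇ℓ = ADgradient(:Enzyme, ℓ)

lp, g = LogDensityProblems.logdensity_and_gradient(∇ℓ, [0.5])
@test isfinite(lp)
@test all(isfinite, g)
```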
LGTM, thanks!
Sorry for the delay! Holidays happened, and I got stuck with a recent Enzyme bug that just got fixed. Did you want me to add an additional test in another PR?
It is fine as it is, but of course tests are always welcome if you have time. Thanks again for the PR.
This updates scalar.jl to use `muladd` generically for `ScaledShiftedLogistic`, since JuliaDiff/DiffRules.jl#28 has been merged. I also used `muladd` instead of `fma`, since it only uses `fma` when that gives a speed boost. Additionally, I was having issues using Enzyme.jl with `fma` inserted, while everything seems to work fine with `muladd`.