Use LogDensityProblemsAD with ADTypes #198
Merged
LogDensityProblemsAD has an ADTypes integration for defining gradients and Hessians of the log density. DynamicPPL.jl then customizes this integration for `DynamicPPL.LogDensityFunction` (see e.g. https://github.com/TuringLang/DynamicPPL.jl/blob/master/ext/DynamicPPLForwardDiffExt.jl). Switching to LogDensityProblemsAD's gradient and Hessian functions instead of Optimization's ensures that when such customizations are made to improve performance or work around an issue unique to that LogDensityProblem, we do the right thing.
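For context, a minimal sketch of that integration (the `ToyProblem` type below is hypothetical; `ADgradient`, `AutoForwardDiff`, and `logdensity_and_gradient` come from the packages named above):

```julia
using ADTypes, ForwardDiff, LogDensityProblems, LogDensityProblemsAD

# Hypothetical toy problem, used only to illustrate the integration.
struct ToyProblem end
LogDensityProblems.logdensity(::ToyProblem, x) = -sum(abs2, x) / 2
LogDensityProblems.dimension(::ToyProblem) = 3
LogDensityProblems.capabilities(::Type{ToyProblem}) =
    LogDensityProblems.LogDensityOrder{0}()

# ADgradient wraps the problem with an AD backend selected via an ADTypes
# object; packages like DynamicPPL.jl can specialize this for their own
# problem types (e.g. DynamicPPL.LogDensityFunction).
∇prob = ADgradient(ADTypes.AutoForwardDiff(), ToyProblem())
LogDensityProblems.logdensity_and_gradient(∇prob, randn(3))
```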
A potential downside is that Optimization.jl may support more AD backends than LogDensityProblemsAD.jl, and we would not support those extra backends. However, a simple workaround is for a user to pass `Base.Fix1(LogDensityProblems.logdensity, prob)` to `pathfinder` or `multipathfinder`; that wrapper would not be identifiable as a LogDensityProblem, so Optimization's ADTypes integration would be used instead (a sketch of both code paths follows below).

Edit: relates #93
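A minimal sketch of the two code paths, assuming `pathfinder`'s documented `dim` and `ndraws` keywords (the `ToyProblem` type is again hypothetical):

```julia
using LogDensityProblems, Pathfinder

# Hypothetical toy problem, used only to illustrate the two code paths.
struct ToyProblem end
LogDensityProblems.logdensity(::ToyProblem, x) = -sum(abs2, x) / 2
LogDensityProblems.dimension(::ToyProblem) = 3
LogDensityProblems.capabilities(::Type{ToyProblem}) =
    LogDensityProblems.LogDensityOrder{0}()

prob = ToyProblem()

# Passed directly, prob is identified as a LogDensityProblem, so the
# LogDensityProblemsAD/ADTypes integration provides the gradient.
result_ldp = pathfinder(prob; ndraws=100)

# Workaround: wrap the log density in a plain callable. It is no longer
# identifiable as a LogDensityProblem, so Optimization's ADTypes
# integration is used for the gradient instead.
logp = Base.Fix1(LogDensityProblems.logdensity, prob)
result_fun = pathfinder(logp; dim=LogDensityProblems.dimension(prob), ndraws=100)
```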