
ERROR: MethodError: no method matching Int64(::Irrational{:log2π}) #333

Closed
jariji opened this issue Sep 17, 2022 · 7 comments

Comments

jariji commented Sep 17, 2022

I'm trying to learn to use this package. I copied the DynamicHMC example from the documentation at https://juliagaussianprocesses.github.io/AbstractGPs.jl/dev/examples/0-intro-1d/

using AbstractGPs
using Distributions
using StatsFuns

using Plots
default(; legend=:outertopright, size=(700, 400))

using Random
Random.seed!(42)  # setting the seed for reproducibility of this notebook


x = [
    0.8658165855998895,
    0.6661700880180962,
    0.8049218148148531,
    0.7714303440386239,
    0.14790478354654835,
    0.8666105548197428,
    0.007044577166530286,
    0.026331737288148638,
    0.17188596617099916,
    0.8897812990554013,
    0.24323574561119998,
    0.028590102134105955,
]
y = [
    1.5255314337144372,
    3.6434202968230003,
    3.010885733911661,
    3.774442382979625,
    3.3687639483798324,
    1.5506452040608503,
    3.790447985799683,
    3.8689707574953,
    3.4933565751758713,
    1.4284538820635841,
    3.8715350915692364,
    3.7045949061144983,
]



x_train = x[1:8]
y_train = y[1:8]
x_test = x[9:end]
y_test = y[9:end]

f = GP(Matern52Kernel())

fx = f(x_train, 0.1)
logpdf(fx, y_train)


p_fx = posterior(fx, y_train)
logpdf(p_fx(x_test), y_test)


function gp_loglikelihood(x, y)
    function loglikelihood(params)
        kernel =
            softplus(params[1]) * (Matern52Kernel() ∘ ScaleTransform(softplus(params[2])))
        f = GP(kernel)
        fx = f(x, 0.1)
        return logpdf(fx, y)
    end
    return loglikelihood
end

const loglik_train = gp_loglikelihood(x_train, y_train)

## DHMC

using DynamicHMC
using LogDensityProblems

n_samples = 2_000
n_adapts = 1_000


# Log joint density
function LogDensityProblems.logdensity(ℓ::typeof(loglik_train), params)
    return ℓ(params) + logprior(params)
end
logprior(params) = logpdf(MvNormal(2, 1), params)

# The parameter space is two-dimensional
LogDensityProblems.dimension(::typeof(loglik_train)) = 2

# `loglik_train` does not support evaluating derivatives of
# the log-likelihood function
function LogDensityProblems.capabilities(::Type{<:typeof(loglik_train)})
    return LogDensityProblems.LogDensityOrder{0}()
end


samples =
    mcmc_with_warmup(
        Random.GLOBAL_RNG,
        ADgradient(:ForwardDiff, loglik_train),
        n_samples;
        reporter=NoProgressReport(),
    ).chain

it says

ERROR: MethodError: no method matching Int64(::Irrational{:log2π})

Stacktrace:
  [1] convert(#unused#::Type{Int64}, x::Irrational{:log2π})
    @ Base ./number.jl:7
  [2] mvnormal_c0(g::MvNormal{Int64, PDMats.ScalMat{Int64}, FillArrays.Zeros{Int64, 1, Tuple{Base.OneTo{Int64}}}})
    @ Distributions ~/.julia/packages/Distributions/t65ji/src/multivariate/mvnormal.jl:99
  [3] _logpdf(d::MvNormal{Int64, PDMats.ScalMat{Int64}, FillArrays.Zeros{Int64, 1, Tuple{Base.OneTo{Int64}}}}, x::Vector{ForwardDiff.Dual{ForwardDiff.Tag{LogDensityProblems.var"#46#47"{var"#loglikelihood#5"{Vector{Float64}, Vector{Float64}}}, Float64}, Float64, 2}})
    @ Distributions ~/.julia/packages/Distributions/t65ji/src/multivariate/mvnormal.jl:127
  [4] logpdf(d::MvNormal{Int64, PDMats.ScalMat{Int64}, FillArrays.Zeros{Int64, 1, Tuple{Base.OneTo{Int64}}}}, X::Vector{ForwardDiff.Dual{ForwardDiff.Tag{LogDensityProblems.var"#46#47"{var"#loglikelihood#5"{Vector{Float64}, Vector{Float64}}}, Float64}, Float64, 2}})
    @ Distributions ~/.julia/packages/Distributions/t65ji/src/multivariates.jl:201
  [5] logprior(params::Vector{ForwardDiff.Dual{ForwardDiff.Tag{LogDensityProblems.var"#46#47"{var"#loglikelihood#5"{Vector{Float64}, Vector{Float64}}}, Float64}, Float64, 2}})
    @ Main ~/.../dhmc-test.jl:84
  [6] logdensity(ℓ::var"#loglikelihood#5"{Vector{Float64}, Vector{Float64}}, params::Vector{ForwardDiff.Dual{ForwardDiff.Tag{LogDensityProblems.var"#46#47"{var"#loglikelihood#5"{Vector{Float64}, Vector{Float64}}}, Float64}, Float64, 2}})
    @ Main ~/.../dhmc-test.jl:82
  [7] (::LogDensityProblems.var"#46#47"{var"#loglikelihood#5"{Vector{Float64}, Vector{Float64}}})(x::Vector{ForwardDiff.Dual{ForwardDiff.Tag{LogDensityProblems.var"#46#47"{var"#loglikelihood#5"{Vector{Float64}, Vector{Float64}}}, Float64}, Float64, 2}})
    @ LogDensityProblems ~/.julia/packages/LogDensityProblems/tWBzE/src/AD_ForwardDiff.jl:20
  [8] vector_mode_dual_eval!(f::LogDensityProblems.var"#46#47"{var"#loglikelihood#5"{Vector{Float64}, Vector{Float64}}}, cfg::ForwardDiff.GradientConfig{ForwardDiff.Tag{LogDensityProblems.var"#46#47"{var"#loglikelihood#5"{Vector{Float64}, Vector{Float64}}}, Float64}, Float64, 2, Vector{ForwardDiff.Dual{ForwardDiff.Tag{LogDensityProblems.var"#46#47"{var"#loglikelihood#5"{Vector{Float64}, Vector{Float64}}}, Float64}, Float64, 2}}}, x::Vector{Float64})
    @ ForwardDiff ~/.julia/packages/ForwardDiff/pDtsf/src/apiutils.jl:37
  [9] vector_mode_gradient!(result::DiffResults.MutableDiffResult{1, Float64, Tuple{Vector{Float64}}}, f::LogDensityProblems.var"#46#47"{var"#loglikelihood#5"{Vector{Float64}, Vector{Float64}}}, x::Vector{Float64}, cfg::ForwardDiff.GradientConfig{ForwardDiff.Tag{LogDensityProblems.var"#46#47"{var"#loglikelihood#5"{Vector{Float64}, Vector{Float64}}}, Float64}, Float64, 2, Vector{ForwardDiff.Dual{ForwardDiff.Tag{LogDensityProblems.var"#46#47"{var"#loglikelihood#5"{Vector{Float64}, Vector{Float64}}}, Float64}, Float64, 2}}})
    @ ForwardDiff ~/.julia/packages/ForwardDiff/pDtsf/src/gradient.jl:113
 [10] gradient!(result::DiffResults.MutableDiffResult{1, Float64, Tuple{Vector{Float64}}}, f::LogDensityProblems.var"#46#47"{var"#loglikelihood#5"{Vector{Float64}, Vector{Float64}}}, x::Vector{Float64}, cfg::ForwardDiff.GradientConfig{ForwardDiff.Tag{LogDensityProblems.var"#46#47"{var"#loglikelihood#5"{Vector{Float64}, Vector{Float64}}}, Float64}, Float64, 2, Vector{ForwardDiff.Dual{ForwardDiff.Tag{LogDensityProblems.var"#46#47"{var"#loglikelihood#5"{Vector{Float64}, Vector{Float64}}}, Float64}, Float64, 2}}}, ::Val{true})
    @ ForwardDiff ~/.julia/packages/ForwardDiff/pDtsf/src/gradient.jl:37
 [11] gradient!(result::DiffResults.MutableDiffResult{1, Float64, Tuple{Vector{Float64}}}, f::LogDensityProblems.var"#46#47"{var"#loglikelihood#5"{Vector{Float64}, Vector{Float64}}}, x::Vector{Float64}, cfg::ForwardDiff.GradientConfig{ForwardDiff.Tag{LogDensityProblems.var"#46#47"{var"#loglikelihood#5"{Vector{Float64}, Vector{Float64}}}, Float64}, Float64, 2, Vector{ForwardDiff.Dual{ForwardDiff.Tag{LogDensityProblems.var"#46#47"{var"#loglikelihood#5"{Vector{Float64}, Vector{Float64}}}, Float64}, Float64, 2}}})
    @ ForwardDiff ~/.julia/packages/ForwardDiff/pDtsf/src/gradient.jl:35
 [12] logdensity_and_gradient(fℓ::LogDensityProblems.ForwardDiffLogDensity{var"#loglikelihood#5"{Vector{Float64}, Vector{Float64}}, ForwardDiff.GradientConfig{ForwardDiff.Tag{LogDensityProblems.var"#46#47"{var"#loglikelihood#5"{Vector{Float64}, Vector{Float64}}}, Float64}, Float64, 2, Vector{ForwardDiff.Dual{ForwardDiff.Tag{LogDensityProblems.var"#46#47"{var"#loglikelihood#5"{Vector{Float64}, Vector{Float64}}}, Float64}, Float64, 2}}}}, x::Vector{Float64})
    @ LogDensityProblems ~/.julia/packages/LogDensityProblems/tWBzE/src/AD_ForwardDiff.jl:50
 [13] evaluate_ℓ(ℓ::LogDensityProblems.ForwardDiffLogDensity{var"#loglikelihood#5"{Vector{Float64}, Vector{Float64}}, ForwardDiff.GradientConfig{ForwardDiff.Tag{LogDensityProblems.var"#46#47"{var"#loglikelihood#5"{Vector{Float64}, Vector{Float64}}}, Float64}, Float64, 2, Vector{ForwardDiff.Dual{ForwardDiff.Tag{LogDensityProblems.var"#46#47"{var"#loglikelihood#5"{Vector{Float64}, Vector{Float64}}}, Float64}, Float64, 2}}}}, q::Vector{Float64})
    @ DynamicHMC ~/.julia/packages/DynamicHMC/x07K9/src/hamiltonian.jl:195
 [14] initialize_warmup_state(rng::Random._GLOBAL_RNG, ℓ::LogDensityProblems.ForwardDiffLogDensity{var"#loglikelihood#5"{Vector{Float64}, Vector{Float64}}, ForwardDiff.GradientConfig{ForwardDiff.Tag{LogDensityProblems.var"#46#47"{var"#loglikelihood#5"{Vector{Float64}, Vector{Float64}}}, Float64}, Float64, 2, Vector{ForwardDiff.Dual{ForwardDiff.Tag{LogDensityProblems.var"#46#47"{var"#loglikelihood#5"{Vector{Float64}, Vector{Float64}}}, Float64}, Float64, 2}}}}; q::Vector{Float64}, κ::GaussianKineticEnergy{LinearAlgebra.Diagonal{Float64, Vector{Float64}}, LinearAlgebra.Diagonal{Float64, Vector{Float64}}}, ϵ::Nothing)
    @ DynamicHMC ~/.julia/packages/DynamicHMC/x07K9/src/mcmc.jl:125
 [15] initialize_warmup_state(rng::Random._GLOBAL_RNG, ℓ::LogDensityProblems.ForwardDiffLogDensity{var"#loglikelihood#5"{Vector{Float64}, Vector{Float64}}, ForwardDiff.GradientConfig{ForwardDiff.Tag{LogDensityProblems.var"#46#47"{var"#loglikelihood#5"{Vector{Float64}, Vector{Float64}}}, Float64}, Float64, 2, Vector{ForwardDiff.Dual{ForwardDiff.Tag{LogDensityProblems.var"#46#47"{var"#loglikelihood#5"{Vector{Float64}, Vector{Float64}}}, Float64}, Float64, 2}}}})
    @ DynamicHMC ~/.julia/packages/DynamicHMC/x07K9/src/mcmc.jl:125
 [16] mcmc_keep_warmup(rng::Random._GLOBAL_RNG, ℓ::LogDensityProblems.ForwardDiffLogDensity{var"#loglikelihood#5"{Vector{Float64}, Vector{Float64}}, ForwardDiff.GradientConfig{ForwardDiff.Tag{LogDensityProblems.var"#46#47"{var"#loglikelihood#5"{Vector{Float64}, Vector{Float64}}}, Float64}, Float64, 2, Vector{ForwardDiff.Dual{ForwardDiff.Tag{LogDensityProblems.var"#46#47"{var"#loglikelihood#5"{Vector{Float64}, Vector{Float64}}}, Float64}, Float64, 2}}}}, N::Int64; initialization::Tuple{}, warmup_stages::Tuple{InitialStepsizeSearch, TuningNUTS{Nothing, DualAveraging{Float64}}, TuningNUTS{LinearAlgebra.Diagonal, DualAveraging{Float64}}, TuningNUTS{LinearAlgebra.Diagonal, DualAveraging{Float64}}, TuningNUTS{LinearAlgebra.Diagonal, DualAveraging{Float64}}, TuningNUTS{LinearAlgebra.Diagonal, DualAveraging{Float64}}, TuningNUTS{LinearAlgebra.Diagonal, DualAveraging{Float64}}, TuningNUTS{Nothing, DualAveraging{Float64}}}, algorithm::DynamicHMC.NUTS{Val{:generalized}}, reporter::NoProgressReport)
    @ DynamicHMC ~/.julia/packages/DynamicHMC/x07K9/src/mcmc.jl:496
 [17] macro expansion
    @ ~/.julia/packages/UnPack/EkESO/src/UnPack.jl:100 [inlined]
 [18] mcmc_with_warmup(rng::Random._GLOBAL_RNG, ℓ::LogDensityProblems.ForwardDiffLogDensity{var"#loglikelihood#5"{Vector{Float64}, Vector{Float64}}, ForwardDiff.GradientConfig{ForwardDiff.Tag{LogDensityProblems.var"#46#47"{var"#loglikelihood#5"{Vector{Float64}, Vector{Float64}}}, Float64}, Float64, 2, Vector{ForwardDiff.Dual{ForwardDiff.Tag{LogDensityProblems.var"#46#47"{var"#loglikelihood#5"{Vector{Float64}, Vector{Float64}}}, Float64}, Float64, 2}}}}, N::Int64; initialization::Tuple{}, warmup_stages::Tuple{InitialStepsizeSearch, TuningNUTS{Nothing, DualAveraging{Float64}}, TuningNUTS{LinearAlgebra.Diagonal, DualAveraging{Float64}}, TuningNUTS{LinearAlgebra.Diagonal, DualAveraging{Float64}}, TuningNUTS{LinearAlgebra.Diagonal, DualAveraging{Float64}}, TuningNUTS{LinearAlgebra.Diagonal, DualAveraging{Float64}}, TuningNUTS{LinearAlgebra.Diagonal, DualAveraging{Float64}}, TuningNUTS{Nothing, DualAveraging{Float64}}}, algorithm::DynamicHMC.NUTS{Val{:generalized}}, reporter::NoProgressReport)
    @ DynamicHMC ~/.julia/packages/DynamicHMC/x07K9/src/mcmc.jl:547
 [19] top-level scope

Julia 1.7.1
[99985d1d] AbstractGPs v0.3.9
[bbc10e6e] DynamicHMC v3.3.0
[31c24e10] Distributions v0.25.14
[4c63d2b9] StatsFuns v0.9.9

theogf (Member) commented Sep 17, 2022

That comes from the wrong promotion for MvNormal (a long-standing problem).

Replace this line

logprior(params) = logpdf(MvNormal(2, 1), params)

by this one

logprior(params) = logpdf(MvNormal(2.0, 1.0), params)

jariji (Author) commented Sep 17, 2022

That gives

julia> MvNormal(2.0, 1.0)
ERROR: MethodError: no method matching MvNormal(::Float64, ::Float64)

so I tried

Normal(2.0, 1.0)

but that's missing something too

ERROR: MethodError: no method matching +(::ForwardDiff.Dual{ForwardDiff.Tag{LogDensityProblems.var"#46#47"{var"#loglikelihood#1"{Vector{Float64}, Vector{Float64}}}, Float64}, Float64, 2}, ::Vector{ForwardDiff.Dual{ForwardDiff.Tag{LogDensityProblems.var"#46#47"{var"#loglikelihood#1"{Vector{Float64}, Vector{Float64}}}, Float64}, Float64, 2}})
For element-wise addition, use broadcasting with dot syntax: scalar .+ array
Closest candidates are:
  +(::Any, ::Any, ::Any, ::Any...) at /nix/store/15kh8pp59zbnhxcxh2l66xr1hzly00y9-julia-bin-1.7.1/share/julia/base/operators.jl:655
  +(::ChainRulesCore.Tangent{P}, ::P) where P at ~/.julia/packages/ChainRulesCore/BYuIz/src/differential_arithmetic.jl:162
  +(::ForwardDiff.Dual{Tx}, ::RoundingMode) where Tx at ~/.julia/packages/ForwardDiff/pDtsf/src/dual.jl:144
  ...
Stacktrace:
  [1] logdensity(ℓ::var"#loglikelihood#1"{Vector{Float64}, Vector{Float64}}, params::Vector{ForwardDiff.Dual{ForwardDiff.Tag{LogDensityProblems.var"#46#47"{var"#loglikelihood#1"{Vector{Float64}, Vector{Float64}}}, Float64}, Float64, 2}})
    @ Main ~/.../dhmc-test.jl:83
  [2] (::LogDensityProblems.var"#46#47"{var"#loglikelihood#1"{Vector{Float64}, Vector{Float64}}})(x::Vector{ForwardDiff.Dual{ForwardDiff.Tag{LogDensityProblems.var"#46#47"{var"#loglikelihood#1"{Vector{Float64}, Vector{Float64}}}, Float64}, Float64, 2}})
    @ LogDensityProblems ~/.julia/packages/LogDensityProblems/tWBzE/src/AD_ForwardDiff.jl:20
  [3] vector_mode_dual_eval!(f::LogDensityProblems.var"#46#47"{var"#loglikelihood#1"{Vector{Float64}, Vector{Float64}}}, cfg::ForwardDiff.GradientConfig{ForwardDiff.Tag{LogDensityProblems.var"#46#47"{var"#loglikelihood#1"{Vector{Float64}, Vector{Float64}}}, Float64}, Float64, 2, Vector{ForwardDiff.Dual{ForwardDiff.Tag{LogDensityProblems.var"#46#47"{var"#loglikelihood#1"{Vector{Float64}, Vector{Float64}}}, Float64}, Float64, 2}}}, x::Vector{Float64})
    @ ForwardDiff ~/.julia/packages/ForwardDiff/pDtsf/src/apiutils.jl:37
  [4] vector_mode_gradient!(result::DiffResults.MutableDiffResult{1, Float64, Tuple{Vector{Float64}}}, f::LogDensityProblems.var"#46#47"{var"#loglikelihood#1"{Vector{Float64}, Vector{Float64}}}, x::Vector{Float64}, cfg::ForwardDiff.GradientConfig{ForwardDiff.Tag{LogDensityProblems.var"#46#47"{var"#loglikelihood#1"{Vector{Float64}, Vector{Float64}}}, Float64}, Float64, 2, Vector{ForwardDiff.Dual{ForwardDiff.Tag{LogDensityProblems.var"#46#47"{var"#loglikelihood#1"{Vector{Float64}, Vector{Float64}}}, Float64}, Float64, 2}}})
    @ ForwardDiff ~/.julia/packages/ForwardDiff/pDtsf/src/gradient.jl:113
  [5] gradient!(result::DiffResults.MutableDiffResult{1, Float64, Tuple{Vector{Float64}}}, f::LogDensityProblems.var"#46#47"{var"#loglikelihood#1"{Vector{Float64}, Vector{Float64}}}, x::Vector{Float64}, cfg::ForwardDiff.GradientConfig{ForwardDiff.Tag{LogDensityProblems.var"#46#47"{var"#loglikelihood#1"{Vector{Float64}, Vector{Float64}}}, Float64}, Float64, 2, Vector{ForwardDiff.Dual{ForwardDiff.Tag{LogDensityProblems.var"#46#47"{var"#loglikelihood#1"{Vector{Float64}, Vector{Float64}}}, Float64}, Float64, 2}}}, ::Val{true})
    @ ForwardDiff ~/.julia/packages/ForwardDiff/pDtsf/src/gradient.jl:37
  [6] gradient!(result::DiffResults.MutableDiffResult{1, Float64, Tuple{Vector{Float64}}}, f::LogDensityProblems.var"#46#47"{var"#loglikelihood#1"{Vector{Float64}, Vector{Float64}}}, x::Vector{Float64}, cfg::ForwardDiff.GradientConfig{ForwardDiff.Tag{LogDensityProblems.var"#46#47"{var"#loglikelihood#1"{Vector{Float64}, Vector{Float64}}}, Float64}, Float64, 2, Vector{ForwardDiff.Dual{ForwardDiff.Tag{LogDensityProblems.var"#46#47"{var"#loglikelihood#1"{Vector{Float64}, Vector{Float64}}}, Float64}, Float64, 2}}})
    @ ForwardDiff ~/.julia/packages/ForwardDiff/pDtsf/src/gradient.jl:35
  [7] logdensity_and_gradient(fℓ::LogDensityProblems.ForwardDiffLogDensity{var"#loglikelihood#1"{Vector{Float64}, Vector{Float64}}, ForwardDiff.GradientConfig{ForwardDiff.Tag{LogDensityProblems.var"#46#47"{var"#loglikelihood#1"{Vector{Float64}, Vector{Float64}}}, Float64}, Float64, 2, Vector{ForwardDiff.Dual{ForwardDiff.Tag{LogDensityProblems.var"#46#47"{var"#loglikelihood#1"{Vector{Float64}, Vector{Float64}}}, Float64}, Float64, 2}}}}, x::Vector{Float64})
    @ LogDensityProblems ~/.julia/packages/LogDensityProblems/tWBzE/src/AD_ForwardDiff.jl:50
  [8] evaluate_ℓ(ℓ::LogDensityProblems.ForwardDiffLogDensity{var"#loglikelihood#1"{Vector{Float64}, Vector{Float64}}, ForwardDiff.GradientConfig{ForwardDiff.Tag{LogDensityProblems.var"#46#47"{var"#loglikelihood#1"{Vector{Float64}, Vector{Float64}}}, Float64}, Float64, 2, Vector{ForwardDiff.Dual{ForwardDiff.Tag{LogDensityProblems.var"#46#47"{var"#loglikelihood#1"{Vector{Float64}, Vector{Float64}}}, Float64}, Float64, 2}}}}, q::Vector{Float64})
    @ DynamicHMC ~/.julia/packages/DynamicHMC/x07K9/src/hamiltonian.jl:195
  [9] initialize_warmup_state(rng::Random._GLOBAL_RNG, ℓ::LogDensityProblems.ForwardDiffLogDensity{var"#loglikelihood#1"{Vector{Float64}, Vector{Float64}}, ForwardDiff.GradientConfig{ForwardDiff.Tag{LogDensityProblems.var"#46#47"{var"#loglikelihood#1"{Vector{Float64}, Vector{Float64}}}, Float64}, Float64, 2, Vector{ForwardDiff.Dual{ForwardDiff.Tag{LogDensityProblems.var"#46#47"{var"#loglikelihood#1"{Vector{Float64}, Vector{Float64}}}, Float64}, Float64, 2}}}}; q::Vector{Float64}, κ::GaussianKineticEnergy{Diagonal{Float64, Vector{Float64}}, Diagonal{Float64, Vector{Float64}}}, ϵ::Nothing)
    @ DynamicHMC ~/.julia/packages/DynamicHMC/x07K9/src/mcmc.jl:125
 [10] initialize_warmup_state(rng::Random._GLOBAL_RNG, ℓ::LogDensityProblems.ForwardDiffLogDensity{var"#loglikelihood#1"{Vector{Float64}, Vector{Float64}}, ForwardDiff.GradientConfig{ForwardDiff.Tag{LogDensityProblems.var"#46#47"{var"#loglikelihood#1"{Vector{Float64}, Vector{Float64}}}, Float64}, Float64, 2, Vector{ForwardDiff.Dual{ForwardDiff.Tag{LogDensityProblems.var"#46#47"{var"#loglikelihood#1"{Vector{Float64}, Vector{Float64}}}, Float64}, Float64, 2}}}})
    @ DynamicHMC ~/.julia/packages/DynamicHMC/x07K9/src/mcmc.jl:125
 [11] mcmc_keep_warmup(rng::Random._GLOBAL_RNG, ℓ::LogDensityProblems.ForwardDiffLogDensity{var"#loglikelihood#1"{Vector{Float64}, Vector{Float64}}, ForwardDiff.GradientConfig{ForwardDiff.Tag{LogDensityProblems.var"#46#47"{var"#loglikelihood#1"{Vector{Float64}, Vector{Float64}}}, Float64}, Float64, 2, Vector{ForwardDiff.Dual{ForwardDiff.Tag{LogDensityProblems.var"#46#47"{var"#loglikelihood#1"{Vector{Float64}, Vector{Float64}}}, Float64}, Float64, 2}}}}, N::Int64; initialization::Tuple{}, warmup_stages::Tuple{InitialStepsizeSearch, TuningNUTS{Nothing, DualAveraging{Float64}}, TuningNUTS{Diagonal, DualAveraging{Float64}}, TuningNUTS{Diagonal, DualAveraging{Float64}}, TuningNUTS{Diagonal, DualAveraging{Float64}}, TuningNUTS{Diagonal, DualAveraging{Float64}}, TuningNUTS{Diagonal, DualAveraging{Float64}}, TuningNUTS{Nothing, DualAveraging{Float64}}}, algorithm::DynamicHMC.NUTS{Val{:generalized}}, reporter::NoProgressReport)
    @ DynamicHMC ~/.julia/packages/DynamicHMC/x07K9/src/mcmc.jl:496
 [12] macro expansion
    @ ~/.julia/packages/UnPack/EkESO/src/UnPack.jl:100 [inlined]
 [13] mcmc_with_warmup(rng::Random._GLOBAL_RNG, ℓ::LogDensityProblems.ForwardDiffLogDensity{var"#loglikelihood#1"{Vector{Float64}, Vector{Float64}}, ForwardDiff.GradientConfig{ForwardDiff.Tag{LogDensityProblems.var"#46#47"{var"#loglikelihood#1"{Vector{Float64}, Vector{Float64}}}, Float64}, Float64, 2, Vector{ForwardDiff.Dual{ForwardDiff.Tag{LogDensityProblems.var"#46#47"{var"#loglikelihood#1"{Vector{Float64}, Vector{Float64}}}, Float64}, Float64, 2}}}}, N::Int64; initialization::Tuple{}, warmup_stages::Tuple{InitialStepsizeSearch, TuningNUTS{Nothing, DualAveraging{Float64}}, TuningNUTS{Diagonal, DualAveraging{Float64}}, TuningNUTS{Diagonal, DualAveraging{Float64}}, TuningNUTS{Diagonal, DualAveraging{Float64}}, TuningNUTS{Diagonal, DualAveraging{Float64}}, TuningNUTS{Diagonal, DualAveraging{Float64}}, TuningNUTS{Nothing, DualAveraging{Float64}}}, algorithm::DynamicHMC.NUTS{Val{:generalized}}, reporter::NoProgressReport)
    @ DynamicHMC ~/.julia/packages/DynamicHMC/x07K9/src/mcmc.jl:547

theogf (Member) commented Sep 17, 2022

Ah right sorry.
So either you do

logprior(params) = logpdf(MvNormal(2.0 * ones(length(params)), 1.0 * ones(length(params))), params)

or much better

logprior(params) = loglikelihood(Normal(2.0, 1.0), params)
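[Editorial aside: to my understanding, `Distributions.loglikelihood` applied to a univariate distribution and a vector of observations sums the element-wise log-densities, which is why the second form works as an i.i.d. prior and stays ForwardDiff-friendly. A minimal sketch, with hypothetical values for `params`:]

```julia
using Distributions

params = [0.3, -1.2]  # hypothetical parameter vector
a = loglikelihood(Normal(2.0, 1.0), params)
b = sum(logpdf.(Normal(2.0, 1.0), params))
a ≈ b  # the two should agree
```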

jariji (Author) commented Sep 17, 2022

That's working now. Thanks.

using AbstractGPs
using Distributions
using StatsFuns

using Plots
default(; legend=:outertopright, size=(700, 400))

using Random
Random.seed!(42)  # setting the seed for reproducibility of this notebook


x = [
    0.8658165855998895,
    0.6661700880180962,
    0.8049218148148531,
    0.7714303440386239,
    0.14790478354654835,
    0.8666105548197428,
    0.007044577166530286,
    0.026331737288148638,
    0.17188596617099916,
    0.8897812990554013,
    0.24323574561119998,
    0.028590102134105955,
]
y = [
    1.5255314337144372,
    3.6434202968230003,
    3.010885733911661,
    3.774442382979625,
    3.3687639483798324,
    1.5506452040608503,
    3.790447985799683,
    3.8689707574953,
    3.4933565751758713,
    1.4284538820635841,
    3.8715350915692364,
    3.7045949061144983,
]



x_train = x[1:8]
y_train = y[1:8]
x_test = x[9:end]
y_test = y[9:end]

f = GP(Matern52Kernel())

fx = f(x_train, 0.1)
logpdf(fx, y_train)


p_fx = posterior(fx, y_train)
logpdf(p_fx(x_test), y_test)


function gp_loglikelihood(x, y)
    function loglikelihood(params)
        kernel =
            softplus(params[1]) * (Matern52Kernel() ∘ ScaleTransform(softplus(params[2])))
        f = GP(kernel)
        fx = f(x, 0.1)
        return logpdf(fx, y)
    end
    return loglikelihood
end

const loglik_train = gp_loglikelihood(x_train, y_train)


using DynamicHMC
using LogDensityProblems
using LinearAlgebra

n_samples = 2_000
n_adapts = 1_000


# Log joint density
function LogDensityProblems.logdensity(ℓ::typeof(loglik_train), params)
    return ℓ(params) + logprior(params)
end

logprior(params) = loglikelihood(Normal(2.0, 1.0), params)

# The parameter space is two-dimensional
LogDensityProblems.dimension(::typeof(loglik_train)) = 2

# `loglik_train` does not support evaluating derivatives of
# the log-likelihood function
function LogDensityProblems.capabilities(::Type{<:typeof(loglik_train)})
    return LogDensityProblems.LogDensityOrder{0}()
end



mcmc_with_warmup(
    Random.GLOBAL_RNG,
    ADgradient(:ForwardDiff, loglik_train),
    n_samples;
    reporter=NoProgressReport(),
)

jariji closed this as completed Sep 17, 2022
devmotion (Member) commented:

Good that you managed to solve your issue!

Actually, the main problem with your initial version was that you used a quite old version of Distributions (0.25.14, the latest version is 0.25.71). The error you saw was fixed last November by JuliaStats/Distributions.jl#1429 and is available in Distributions >= 0.25.32.
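[Editorial aside: a minimal sketch of a construction that, as far as I can tell, avoids the `Int64` promotion path even on older Distributions versions, by supplying explicit `Float64` parameters rather than integer literals:]

```julia
using Distributions, LinearAlgebra

# Standard-normal prior with Float64 element type; `Diagonal(ones(2))`
# keeps the covariance in Float64 and sidesteps the Int64 path that
# triggered the `Irrational{:log2π}` conversion error.
prior = MvNormal(zeros(2), Diagonal(ones(2)))
logpdf(prior, [0.0, 0.0])  # evaluates to a finite Float64
```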

jariji (Author) commented Sep 17, 2022

That's a good point. It looks like I am constrained by TemporalGPs, which has a note in its readme saying

In the interest of managing expectations, please note that TemporalGPs does not currently operate with the most current version of AbstractGPs / Zygote / ChainRules. I (Will) am aware of this problem, and will sort it out as soon as I have the time!

So attempting to use a newer version of Distributions fails with

pkg> add [email protected]
   Resolving package versions...
ERROR: Unsatisfiable requirements detected for package TemporalGPs [e155a3c4]:
 TemporalGPs [e155a3c4] log:
 ├─possible versions are: 0.1.0-0.5.13 or uninstalled
 ├─restricted to versions * by an explicit requirement, leaving only versions 0.1.0-0.5.13
 ├─restricted by compatibility requirements with Distributions [31c24e10] to versions: [0.1.0-0.3.2, 0.5.8-0.5.13] or uninstalled, leaving only versions: [0.1.0-0.3.2, 0.5.8-0.5.13]
 │ └─Distributions [31c24e10] log:
 │   ├─possible versions are: 0.16.0-0.25.71 or uninstalled
 │   ├─restricted to versions [0.21, 0.23-0.25] by RegressionTables [d519eb52], leaving only versions [0.21.0-0.21.12, 0.23.0-0.25.71]
 │   │ └─RegressionTables [d519eb52] log:
 │   │   ├─possible versions are: 0.5.7 or uninstalled
 │   │   └─RegressionTables [d519eb52] is fixed to version 0.5.7
 │   └─restricted to versions 0.25.71 by an explicit requirement, leaving only versions 0.25.71
 ├─restricted by compatibility requirements with FillArrays [1a297f60] to versions: 0.3.7-0.5.13 or uninstalled, leaving only versions: 0.5.8-0.5.13
 │ └─FillArrays [1a297f60] log:
 │   ├─possible versions are: 0.2.0-0.13.4 or uninstalled
 │   ├─restricted by compatibility requirements with AbstractGPs [99985d1d] to versions: 0.7.0-0.13.4
 │   │ └─AbstractGPs [99985d1d] log:
 │   │   ├─possible versions are: 0.1.0-0.5.13 or uninstalled
 │   │   ├─restricted to versions * by an explicit requirement, leaving only versions 0.1.0-0.5.13
 │   │   ├─restricted by compatibility requirements with Distributions [31c24e10] to versions: 0.3.3-0.5.13 or uninstalled, leaving only versions: 0.3.3-0.5.13
 │   │   │ └─Distributions [31c24e10] log: see above
 │   │   └─restricted by compatibility requirements with ChainRulesCore [d360d2e6] to versions: [0.1.0-0.2.2, 0.3.10-0.5.13] or uninstalled, leaving only versions: 0.3.10-0.5.13
 │   │     └─ChainRulesCore [d360d2e6] log:
 │   │       ├─possible versions are: 0.1.0-1.15.5 or uninstalled
 │   │       └─restricted by compatibility requirements with Distributions [31c24e10] to versions: 1.0.0-1.15.5
 │   │         └─Distributions [31c24e10] log: see above
 │   ├─restricted by compatibility requirements with Distributions [31c24e10] to versions: 0.9.0-0.13.4
 │   │ └─Distributions [31c24e10] log: see above
 │   └─restricted by compatibility requirements with TemporalGPs [e155a3c4] to versions: [0.7.0-0.8.14, 0.10.0-0.12.8], leaving only versions: 0.10.0-0.12.8
 │     └─TemporalGPs [e155a3c4] log: see above
 └─restricted by compatibility requirements with ChainRulesCore [d360d2e6] to versions: 0.1.0-0.3.10 or uninstalled — no versions left
   └─ChainRulesCore [d360d2e6] log: see above

devmotion (Member) commented:

TemporalGPs also only supports an old version of AbstractGPs. If it is not mandatory for your example (AFAICT it is not needed for the example in the initial issue), then I recommend using a separate environment with only the mandatory packages and in particular without TemporalGPs for running this example.
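[Editorial aside: one way to set up such a separate environment is a temporary project (supported since Julia 1.5 via `Pkg.activate(; temp=true)`); a hedged sketch, with the package list taken from the example above:]

```julia
using Pkg
Pkg.activate(; temp=true)  # throwaway project, discarded when Julia exits
Pkg.add(["AbstractGPs", "Distributions", "DynamicHMC", "LogDensityProblems"])
```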
