ERROR: MethodError: no method matching Int64(::Irrational{:log2π}) #333
That comes from the wrong promotion. Replace this line by this one.
|
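The failure in the issue title can be reproduced in isolation. The sketch below is my own illustration (not the exact code path inside Distributions): converting the irrational constant `log2π` to an integer type throws, and the usual fix is to convert to the element type of the computation instead.

```julia
using StatsFuns  # exports the irrational constant log2π

# An Irrational has no Int64 constructor, so an integer-typed
# intermediate throws exactly the MethodError in the title:
# Int64(log2π)  # ERROR: MethodError: no method matching Int64(::Irrational{:log2π})

# Converting to the element type of the computation works for any float type:
x = 0.5
c = oftype(x, log2π)  # a Float64 ≈ 1.8379
```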
That gives

julia> MvNormal(2.0, 1.0)
ERROR: MethodError: no method matching MvNormal(::Float64, ::Float64)

so I tried Normal(2.0, 1.0), but that's missing something too:

ERROR: MethodError: no method matching +(::ForwardDiff.Dual{ForwardDiff.Tag{LogDensityProblems.var"#46#47"{var"#loglikelihood#1"{Vector{Float64}, Vector{Float64}}}, Float64}, Float64, 2}, ::Vector{ForwardDiff.Dual{ForwardDiff.Tag{LogDensityProblems.var"#46#47"{var"#loglikelihood#1"{Vector{Float64}, Vector{Float64}}}, Float64}, Float64, 2}})
For element-wise addition, use broadcasting with dot syntax: scalar .+ array
Closest candidates are:
+(::Any, ::Any, ::Any, ::Any...) at /nix/store/15kh8pp59zbnhxcxh2l66xr1hzly00y9-julia-bin-1.7.1/share/julia/base/operators.jl:655
+(::ChainRulesCore.Tangent{P}, ::P) where P at ~/.julia/packages/ChainRulesCore/BYuIz/src/differential_arithmetic.jl:162
+(::ForwardDiff.Dual{Tx}, ::RoundingMode) where Tx at ~/.julia/packages/ForwardDiff/pDtsf/src/dual.jl:144
...
Stacktrace:
[1] logdensity(ℓ::var"#loglikelihood#1"{Vector{Float64}, Vector{Float64}}, params::Vector{ForwardDiff.Dual{ForwardDiff.Tag{LogDensityProblems.var"#46#47"{var"#loglikelihood#1"{Vector{Float64}, Vector{Float64}}}, Float64}, Float64, 2}})
@ Main ~/.../dhmc-test.jl:83
[2] (::LogDensityProblems.var"#46#47"{var"#loglikelihood#1"{Vector{Float64}, Vector{Float64}}})(x::Vector{ForwardDiff.Dual{ForwardDiff.Tag{LogDensityProblems.var"#46#47"{var"#loglikelihood#1"{Vector{Float64}, Vector{Float64}}}, Float64}, Float64, 2}})
@ LogDensityProblems ~/.julia/packages/LogDensityProblems/tWBzE/src/AD_ForwardDiff.jl:20
[3] vector_mode_dual_eval!(f::LogDensityProblems.var"#46#47"{var"#loglikelihood#1"{Vector{Float64}, Vector{Float64}}}, cfg::ForwardDiff.GradientConfig{ForwardDiff.Tag{LogDensityProblems.var"#46#47"{var"#loglikelihood#1"{Vector{Float64}, Vector{Float64}}}, Float64}, Float64, 2, Vector{ForwardDiff.Dual{ForwardDiff.Tag{LogDensityProblems.var"#46#47"{var"#loglikelihood#1"{Vector{Float64}, Vector{Float64}}}, Float64}, Float64, 2}}}, x::Vector{Float64})
@ ForwardDiff ~/.julia/packages/ForwardDiff/pDtsf/src/apiutils.jl:37
[4] vector_mode_gradient!(result::DiffResults.MutableDiffResult{1, Float64, Tuple{Vector{Float64}}}, f::LogDensityProblems.var"#46#47"{var"#loglikelihood#1"{Vector{Float64}, Vector{Float64}}}, x::Vector{Float64}, cfg::ForwardDiff.GradientConfig{ForwardDiff.Tag{LogDensityProblems.var"#46#47"{var"#loglikelihood#1"{Vector{Float64}, Vector{Float64}}}, Float64}, Float64, 2, Vector{ForwardDiff.Dual{ForwardDiff.Tag{LogDensityProblems.var"#46#47"{var"#loglikelihood#1"{Vector{Float64}, Vector{Float64}}}, Float64}, Float64, 2}}})
@ ForwardDiff ~/.julia/packages/ForwardDiff/pDtsf/src/gradient.jl:113
[5] gradient!(result::DiffResults.MutableDiffResult{1, Float64, Tuple{Vector{Float64}}}, f::LogDensityProblems.var"#46#47"{var"#loglikelihood#1"{Vector{Float64}, Vector{Float64}}}, x::Vector{Float64}, cfg::ForwardDiff.GradientConfig{ForwardDiff.Tag{LogDensityProblems.var"#46#47"{var"#loglikelihood#1"{Vector{Float64}, Vector{Float64}}}, Float64}, Float64, 2, Vector{ForwardDiff.Dual{ForwardDiff.Tag{LogDensityProblems.var"#46#47"{var"#loglikelihood#1"{Vector{Float64}, Vector{Float64}}}, Float64}, Float64, 2}}}, ::Val{true})
@ ForwardDiff ~/.julia/packages/ForwardDiff/pDtsf/src/gradient.jl:37
[6] gradient!(result::DiffResults.MutableDiffResult{1, Float64, Tuple{Vector{Float64}}}, f::LogDensityProblems.var"#46#47"{var"#loglikelihood#1"{Vector{Float64}, Vector{Float64}}}, x::Vector{Float64}, cfg::ForwardDiff.GradientConfig{ForwardDiff.Tag{LogDensityProblems.var"#46#47"{var"#loglikelihood#1"{Vector{Float64}, Vector{Float64}}}, Float64}, Float64, 2, Vector{ForwardDiff.Dual{ForwardDiff.Tag{LogDensityProblems.var"#46#47"{var"#loglikelihood#1"{Vector{Float64}, Vector{Float64}}}, Float64}, Float64, 2}}})
@ ForwardDiff ~/.julia/packages/ForwardDiff/pDtsf/src/gradient.jl:35
[7] logdensity_and_gradient(fℓ::LogDensityProblems.ForwardDiffLogDensity{var"#loglikelihood#1"{Vector{Float64}, Vector{Float64}}, ForwardDiff.GradientConfig{ForwardDiff.Tag{LogDensityProblems.var"#46#47"{var"#loglikelihood#1"{Vector{Float64}, Vector{Float64}}}, Float64}, Float64, 2, Vector{ForwardDiff.Dual{ForwardDiff.Tag{LogDensityProblems.var"#46#47"{var"#loglikelihood#1"{Vector{Float64}, Vector{Float64}}}, Float64}, Float64, 2}}}}, x::Vector{Float64})
@ LogDensityProblems ~/.julia/packages/LogDensityProblems/tWBzE/src/AD_ForwardDiff.jl:50
[8] evaluate_ℓ(ℓ::LogDensityProblems.ForwardDiffLogDensity{var"#loglikelihood#1"{Vector{Float64}, Vector{Float64}}, ForwardDiff.GradientConfig{ForwardDiff.Tag{LogDensityProblems.var"#46#47"{var"#loglikelihood#1"{Vector{Float64}, Vector{Float64}}}, Float64}, Float64, 2, Vector{ForwardDiff.Dual{ForwardDiff.Tag{LogDensityProblems.var"#46#47"{var"#loglikelihood#1"{Vector{Float64}, Vector{Float64}}}, Float64}, Float64, 2}}}}, q::Vector{Float64})
@ DynamicHMC ~/.julia/packages/DynamicHMC/x07K9/src/hamiltonian.jl:195
[9] initialize_warmup_state(rng::Random._GLOBAL_RNG, ℓ::LogDensityProblems.ForwardDiffLogDensity{var"#loglikelihood#1"{Vector{Float64}, Vector{Float64}}, ForwardDiff.GradientConfig{ForwardDiff.Tag{LogDensityProblems.var"#46#47"{var"#loglikelihood#1"{Vector{Float64}, Vector{Float64}}}, Float64}, Float64, 2, Vector{ForwardDiff.Dual{ForwardDiff.Tag{LogDensityProblems.var"#46#47"{var"#loglikelihood#1"{Vector{Float64}, Vector{Float64}}}, Float64}, Float64, 2}}}}; q::Vector{Float64}, κ::GaussianKineticEnergy{Diagonal{Float64, Vector{Float64}}, Diagonal{Float64, Vector{Float64}}}, ϵ::Nothing)
@ DynamicHMC ~/.julia/packages/DynamicHMC/x07K9/src/mcmc.jl:125
[10] initialize_warmup_state(rng::Random._GLOBAL_RNG, ℓ::LogDensityProblems.ForwardDiffLogDensity{var"#loglikelihood#1"{Vector{Float64}, Vector{Float64}}, ForwardDiff.GradientConfig{ForwardDiff.Tag{LogDensityProblems.var"#46#47"{var"#loglikelihood#1"{Vector{Float64}, Vector{Float64}}}, Float64}, Float64, 2, Vector{ForwardDiff.Dual{ForwardDiff.Tag{LogDensityProblems.var"#46#47"{var"#loglikelihood#1"{Vector{Float64}, Vector{Float64}}}, Float64}, Float64, 2}}}})
@ DynamicHMC ~/.julia/packages/DynamicHMC/x07K9/src/mcmc.jl:125
[11] mcmc_keep_warmup(rng::Random._GLOBAL_RNG, ℓ::LogDensityProblems.ForwardDiffLogDensity{var"#loglikelihood#1"{Vector{Float64}, Vector{Float64}}, ForwardDiff.GradientConfig{ForwardDiff.Tag{LogDensityProblems.var"#46#47"{var"#loglikelihood#1"{Vector{Float64}, Vector{Float64}}}, Float64}, Float64, 2, Vector{ForwardDiff.Dual{ForwardDiff.Tag{LogDensityProblems.var"#46#47"{var"#loglikelihood#1"{Vector{Float64}, Vector{Float64}}}, Float64}, Float64, 2}}}}, N::Int64; initialization::Tuple{}, warmup_stages::Tuple{InitialStepsizeSearch, TuningNUTS{Nothing, DualAveraging{Float64}}, TuningNUTS{Diagonal, DualAveraging{Float64}}, TuningNUTS{Diagonal, DualAveraging{Float64}}, TuningNUTS{Diagonal, DualAveraging{Float64}}, TuningNUTS{Diagonal, DualAveraging{Float64}}, TuningNUTS{Diagonal, DualAveraging{Float64}}, TuningNUTS{Nothing, DualAveraging{Float64}}}, algorithm::DynamicHMC.NUTS{Val{:generalized}}, reporter::NoProgressReport)
@ DynamicHMC ~/.julia/packages/DynamicHMC/x07K9/src/mcmc.jl:496
[12] macro expansion
@ ~/.julia/packages/UnPack/EkESO/src/UnPack.jl:100 [inlined]
[13] mcmc_with_warmup(rng::Random._GLOBAL_RNG, ℓ::LogDensityProblems.ForwardDiffLogDensity{var"#loglikelihood#1"{Vector{Float64}, Vector{Float64}}, ForwardDiff.GradientConfig{ForwardDiff.Tag{LogDensityProblems.var"#46#47"{var"#loglikelihood#1"{Vector{Float64}, Vector{Float64}}}, Float64}, Float64, 2, Vector{ForwardDiff.Dual{ForwardDiff.Tag{LogDensityProblems.var"#46#47"{var"#loglikelihood#1"{Vector{Float64}, Vector{Float64}}}, Float64}, Float64, 2}}}}, N::Int64; initialization::Tuple{}, warmup_stages::Tuple{InitialStepsizeSearch, TuningNUTS{Nothing, DualAveraging{Float64}}, TuningNUTS{Diagonal, DualAveraging{Float64}}, TuningNUTS{Diagonal, DualAveraging{Float64}}, TuningNUTS{Diagonal, DualAveraging{Float64}}, TuningNUTS{Diagonal, DualAveraging{Float64}}, TuningNUTS{Diagonal, DualAveraging{Float64}}, TuningNUTS{Nothing, DualAveraging{Float64}}}, algorithm::DynamicHMC.NUTS{Val{:generalized}}, reporter::NoProgressReport)
@ DynamicHMC ~/.julia/packages/DynamicHMC/x07K9/src/mcmc.jl:547 |
Ah right, sorry.

logprior(params) = logpdf(MvNormal(2.0 * ones(length(params)), 1.0 * ones(length(params))), params)

or, much better:

logprior(params) = loglikelihood(Normal(2.0, 1.0), params) |
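For reference, the two priors above agree: an isotropic multivariate normal factorizes into independent univariate normals, so the `loglikelihood` form is just the simpler spelling. A small check (my own sketch; it uses a `Diagonal` covariance to stay compatible with recent Distributions versions):

```julia
using Distributions, LinearAlgebra

params = [0.3, -1.2]
n = length(params)

# Independent N(2, 1) prior written as a multivariate normal
lp_mv = logpdf(MvNormal(fill(2.0, n), Diagonal(ones(n))), params)

# The same prior as a sum of univariate log-densities
lp_uni = loglikelihood(Normal(2.0, 1.0), params)

lp_mv ≈ lp_uni  # true
```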
That's working now. Thanks. using AbstractGPs
using Distributions
using StatsFuns
using Plots
default(; legend=:outertopright, size=(700, 400))
using Random
Random.seed!(42) # setting the seed for reproducibility of this notebook
x = [
0.8658165855998895,
0.6661700880180962,
0.8049218148148531,
0.7714303440386239,
0.14790478354654835,
0.8666105548197428,
0.007044577166530286,
0.026331737288148638,
0.17188596617099916,
0.8897812990554013,
0.24323574561119998,
0.028590102134105955,
]
y = [
1.5255314337144372,
3.6434202968230003,
3.010885733911661,
3.774442382979625,
3.3687639483798324,
1.5506452040608503,
3.790447985799683,
3.8689707574953,
3.4933565751758713,
1.4284538820635841,
3.8715350915692364,
3.7045949061144983,
]
x_train = x[1:8]
y_train = y[1:8]
x_test = x[9:end]
y_test = y[9:end]
f = GP(Matern52Kernel())
fx = f(x_train, 0.1)
logpdf(fx, y_train)
p_fx = posterior(fx, y_train)
logpdf(p_fx(x_test), y_test)
function gp_loglikelihood(x, y)
function loglikelihood(params)
kernel =
softplus(params[1]) * (Matern52Kernel() ∘ ScaleTransform(softplus(params[2])))
f = GP(kernel)
fx = f(x, 0.1)
return logpdf(fx, y)
end
return loglikelihood
end
const loglik_train = gp_loglikelihood(x_train, y_train)
using DynamicHMC
using LogDensityProblems
using LinearAlgebra
n_samples = 2_000
n_adapts = 1_000
# Log joint density
function LogDensityProblems.logdensity(ℓ::typeof(loglik_train), params)
return ℓ(params) + logprior(params)
end
logprior(params) = loglikelihood(Normal(2.0, 1.0), params)
# The parameter space is two-dimensional
LogDensityProblems.dimension(::typeof(loglik_train)) = 2
# `loglik_train` does not support evaluating derivatives of
# the log-likelihood function
function LogDensityProblems.capabilities(::Type{<:typeof(loglik_train)})
return LogDensityProblems.LogDensityOrder{0}()
end
mcmc_with_warmup(
Random.GLOBAL_RNG,
ADgradient(:ForwardDiff, loglik_train),
n_samples;
reporter=NoProgressReport(),
) |
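The `LogDensityOrder{0}` declaration above says the problem supplies only log-density values; `ADgradient(:ForwardDiff, …)` then layers gradients on top via automatic differentiation. A self-contained toy analogue (my own sketch, using the API as it appears in the package versions from this thread, where `ADgradient` lives in LogDensityProblems itself):

```julia
using LogDensityProblems, ForwardDiff

struct Toy end  # a standard-normal log-density, declared as order 0
LogDensityProblems.logdensity(::Toy, x) = -sum(abs2, x) / 2
LogDensityProblems.dimension(::Toy) = 2
LogDensityProblems.capabilities(::Type{Toy}) = LogDensityProblems.LogDensityOrder{0}()

# Wrapping adds gradient support without touching the problem definition:
∇P = ADgradient(:ForwardDiff, Toy())
LogDensityProblems.logdensity_and_gradient(∇P, [1.0, 0.0])
# → (-0.5, [-1.0, 0.0])
```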
Good that you managed to solve your issue! Actually, the main problem with your initial version was that you used a quite old version of Distributions (0.25.14; the latest is 0.25.71). The error you saw was fixed last November by JuliaStats/Distributions.jl#1429, and the fix is available in Distributions >= 0.25.32. |
That's a good point. It looks like I am constrained by TemporalGPs, which has a note in its README saying
So attempting to use a newer version of Distributions fails with
|
TemporalGPs also only supports an old version of AbstractGPs. If it is not mandatory for your example (AFAICT it is not needed for the example in the initial issue), then I recommend using a separate environment with only the mandatory packages and in particular without TemporalGPs for running this example. |
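Setting up such a minimal environment is straightforward with Pkg (a sketch; the package list is the one used in this thread, without TemporalGPs, and the environment name is arbitrary):

```julia
# From the Julia REPL, press `]` to enter Pkg mode, then:
#   activate gp-example
#   add AbstractGPs DynamicHMC LogDensityProblems Distributions StatsFuns
# or equivalently, from within Julia:
using Pkg
Pkg.activate("gp-example")
Pkg.add(["AbstractGPs", "DynamicHMC", "LogDensityProblems", "Distributions", "StatsFuns"])
```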
I'm trying to learn to use this package. I copied the DynamicHMC example from the documentation at https://juliagaussianprocesses.github.io/AbstractGPs.jl/dev/examples/0-intro-1d/
It says
Julia 1.7.1
[99985d1d] AbstractGPs v0.3.9
[bbc10e6e] DynamicHMC v3.3.0
[31c24e10] Distributions v0.25.14
[4c63d2b9] StatsFuns v0.9.9