Commit 945ae43

Merge remote-tracking branch 'origin/master' into rv/gpu

rossviljoen committed Aug 13, 2021
2 parents ffa7fc5 + dd513f1
Showing 10 changed files with 40 additions and 38 deletions.
2 changes: 1 addition & 1 deletion Project.toml
@@ -1,7 +1,7 @@
 name = "AbstractGPs"
 uuid = "99985d1d-32ba-4be9-9821-2ec096f28918"
 authors = ["JuliaGaussianProcesses Team"]
-version = "0.3.8"
+version = "0.3.9"
 
 [deps]
 ChainRulesCore = "d360d2e6-b24c-11e9-a2a3-2a2ae2dbcce4"
10 changes: 5 additions & 5 deletions src/abstract_gp.jl
@@ -14,7 +14,7 @@ abstract type AbstractGP end
 """
     mean(f::AbstractGP, x::AbstractVector)
-Computes the mean vector of the multivariate Normal `f(x)`.
+Compute the mean vector of the multivariate Normal `f(x)`.
 """
 Statistics.mean(::AbstractGP, ::AbstractVector)

@@ -42,16 +42,16 @@ Statistics.cov(::AbstractGP, x::AbstractVector, y::AbstractVector)
 """
     mean_and_cov(f::AbstractGP, x::AbstractVector)
-Compute both `mean(f(x))` and `cov(f(x))`. Sometimes more efficient than separately
-computation, particularly for posteriors.
+Compute both `mean(f(x))` and `cov(f(x))`. Sometimes more efficient than
+computing them separately, particularly for posteriors.
 """
 StatsBase.mean_and_cov(f::AbstractGP, x::AbstractVector) = (mean(f, x), cov(f, x))
 
 """
     mean_and_var(f::AbstractGP, x::AbstractVector)
-Compute both `mean(f(x))` and the diagonal elements of `cov(f(x))`. Sometimes more efficient
-than separately computation, particularly for posteriors.
+Compute both `mean(f(x))` and the diagonal elements of `cov(f(x))`. Sometimes
+more efficient than computing them separately, particularly for posteriors.
 """
 StatsBase.mean_and_var(f::AbstractGP, x::AbstractVector) = (mean(f, x), var(f, x))

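For context on the `mean_and_cov`/`mean_and_var` docstrings touched above, a minimal usage sketch (not part of this commit; it assumes this version of AbstractGPs, which re-exports kernels from KernelFunctions and these StatsBase extensions):

```julia
using AbstractGPs
using LinearAlgebra: diag

f = GP(SqExponentialKernel())              # zero-mean GP prior
x = collect(range(-1.0, 1.0; length=10))

m, C = mean_and_cov(f, x)                  # mean vector and full covariance in one call
m2, v = mean_and_var(f, x)                 # mean vector and marginal variances only

@assert m == m2
@assert v ≈ diag(C)                        # var is the diagonal of cov
```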
9 changes: 5 additions & 4 deletions src/base_gp.jl
@@ -6,7 +6,6 @@ A Gaussian Process (GP) with known `mean` and `kernel`. See e.g. [1] for an intr
 # Zero Mean
 If only one argument is provided, assume the mean to be zero everywhere:
 ```jldoctest
 julia> f = GP(Matern32Kernel());
@@ -21,8 +20,9 @@ true
 ### Constant Mean
-If a `Real` is provided as the first argument, assume the mean function is constant with
-that value
+If a `Real` is provided as the first argument, assume the mean function is
+constant with that value.
 ```jldoctest
 julia> f = GP(5.0, Matern32Kernel());
@@ -38,6 +38,7 @@ true
 ### Custom Mean
 Provide an arbitrary function to compute the mean:
 ```jldoctest
 julia> f = GP(x -> sin(x) + cos(x / 2), Matern32Kernel());
@@ -64,7 +65,7 @@ GP(kernel::Kernel) = GP(ZeroMean(), kernel)
 
 # AbstractGP interface implementation.
 
-Statistics.mean(f::GP, x::AbstractVector) = _map(f.mean, x)
+Statistics.mean(f::GP, x::AbstractVector) = _map_meanfunction(f.mean, x)
 
 Statistics.cov(f::GP, x::AbstractVector) = kernelmatrix(f.kernel, x)

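The three `GP` constructor forms documented above, collected into one sketch (not from the diff; assumes `mean` is usable directly via AbstractGPs' Statistics re-export):

```julia
using AbstractGPs

f_zero   = GP(Matern32Kernel())                            # zero mean everywhere
f_const  = GP(5.0, Matern32Kernel())                       # constant mean 5.0
f_custom = GP(x -> sin(x) + cos(x / 2), Matern32Kernel())  # arbitrary mean function

x = randn(4)
mean(f_const, x)   # == fill(5.0, 4)
mean(f_custom, x)  # == sin.(x) .+ cos.(x ./ 2)
```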
10 changes: 5 additions & 5 deletions src/exact_gpr_posterior.jl
@@ -6,7 +6,7 @@ end
 """
     posterior(fx::FiniteGP, y::AbstractVector{<:Real})
-Constructs the posterior distribution over `fx.f` given observations `y` at `x` made under
+Construct the posterior distribution over `fx.f` given observations `y` at `x` made under
 noise `fx.Σy`. This is another `AbstractGP` object. See chapter 2 of [1] for a recap on
 exact inference in GPs. This posterior process has mean function
 ```julia
@@ -16,7 +16,7 @@ and kernel
 ```julia
 k_posterior(x, z) = k(x, z) - k(x, fx.x) inv(cov(fx)) k(fx.x, z)
 ```
-where `m` and `k` are the mean function and kernel of `fx.f` respectively.
+where `m` and `k` are the mean function and kernel of `fx.f`, respectively.
 """
 function posterior(fx::FiniteGP, y::AbstractVector{<:Real})
     m, C_mat = mean_and_cov(fx)
@@ -29,9 +29,9 @@ end
 """
     posterior(fx::FiniteGP{<:PosteriorGP}, y::AbstractVector{<:Real})
-Constructs the posterior distribution over `fx.f` when `f` is itself a `PosteriorGP` by
-updating the cholesky factorisation of the covariance matrix and avoiding recomputing it
-from original covariance matrix. It does this by using `update_chol` functionality.
+Construct the posterior distribution over `fx.f` when `f` is itself a `PosteriorGP` by
+updating the Cholesky factorisation of the covariance matrix and avoiding recomputing it
+from the original covariance matrix. It does this by using `update_chol` functionality.
 Other aspects are similar to a regular posterior.
 """
"""
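A short sketch of the two `posterior` methods documented above — exact conditioning first, then the incremental Cholesky-updating path when the prior is itself a `PosteriorGP` (illustrative only, with made-up data):

```julia
using AbstractGPs

f = GP(Matern52Kernel())
x1, y1 = randn(10), randn(10)

p1 = posterior(f(x1, 0.1), y1)     # exact GP regression posterior

# Conditioning a PosteriorGP dispatches to the update_chol-based method,
# extending the existing Cholesky factorisation instead of refactorising.
x2, y2 = randn(5), randn(5)
p2 = posterior(p1(x2, 0.1), y2)    # ≈ conditioning f on all 15 points at once
```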
8 changes: 3 additions & 5 deletions src/finite_gp_projection.jl
@@ -2,7 +2,7 @@
     FiniteGP{Tf<:AbstractGP, Tx<:AbstractVector, TΣy}
 The finite-dimensional projection of the AbstractGP `f` at `x`. Assumed to be observed under
-Gaussian noise with zero mean and covariance matrix `Σ`
+Gaussian noise with zero mean and covariance matrix `Σy`
 """
 struct FiniteGP{Tf<:AbstractGP,Tx<:AbstractVector,TΣ} <: AbstractMvNormal
     f::Tf
@@ -37,7 +37,6 @@ Base.length(f::FiniteGP) = length(f.x)
 Compute the mean vector of `fx`.
 ```jldoctest
 julia> f = GP(Matern52Kernel());
@@ -56,7 +55,6 @@ Compute the covariance matrix of `fx`.
 ## Noise-free observations
 ```jldoctest cov_finitegp
 julia> f = GP(Matern52Kernel());
@@ -205,7 +203,7 @@ end
     rand(rng::AbstractRNG, f::FiniteGP, N::Int=1)
 Obtain `N` independent samples from the marginals `f` using `rng`. Single-sample methods
-produce a `length(f)` vector. Multi-sample methods produce a `length(f)` x `N` `Matrix`.
+produce a `length(f)` vector. Multi-sample methods produce a `length(f)` × `N` `Matrix`.
 ```jldoctest
@@ -275,7 +273,7 @@ end
 """
     logpdf(f::FiniteGP, y::AbstractVecOrMat{<:Real})
-The logpdf of `y` under `f` if is `y isa AbstractVector`. logpdf of each column of `y` if
+The logpdf of `y` under `f` if `y isa AbstractVector`. The logpdf of each column of `y` if
 `y isa Matrix`.
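To tie the `rand` and `logpdf` docstring fixes above together, a small sketch of the `FiniteGP` sampling and density API (not part of the commit):

```julia
using AbstractGPs, Random

rng = MersenneTwister(1)
fx = GP(Matern52Kernel())(randn(rng, 10), 0.1)  # FiniteGP under noise variance 0.1

y = rand(rng, fx)     # single sample: length-10 Vector
Y = rand(rng, fx, 3)  # multi-sample: 10 × 3 Matrix

logpdf(fx, y)         # scalar logpdf of the vector
logpdf(fx, Y)         # per-column logpdfs of the matrix
```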
12 changes: 6 additions & 6 deletions src/latent_gp.jl
@@ -2,9 +2,9 @@
     LatentGP(f<:GP, lik, Σy)
 - `f` is a `AbstractGP`.
-- `lik` is the log likelihood function which maps sample from f to corresposing
-conditional likelihood distributions.
-- `Σy` is the observation noise
+- `lik` is the likelihood function which maps samples from `f` to the corresponding
+conditional likelihood distributions (i.e., `lik` must return a `Distribution` compatible with the observations).
+- `Σy` is the noise under which the latent GP is "observed"; this represents the jitter used to avoid numeric instability and should generally be small.
 """
 struct LatentGP{Tf<:AbstractGP,Tlik,TΣy}
@@ -17,8 +17,8 @@ end
     LatentFiniteGP(fx<:FiniteGP, lik)
 - `fx` is a `FiniteGP`.
-- `lik` is the log likelihood function which maps sample from f to corresposing
-conditional likelihood distributions.
+- `lik` is the likelihood function which maps samples from `f` to the corresponding
+conditional likelihood distributions (i.e., `lik` must return a `Distribution` compatible with the observations).
 """
 struct LatentFiniteGP{Tfx<:FiniteGP,Tlik}
@@ -40,7 +40,7 @@ end
 ```math
 log p(y, f; x)
 ```
-Returns the joint log density of the gaussian process output `f` and real output `y`.
+The joint log density of the Gaussian process output `f` and observation `y`.
 """
 function Distributions.logpdf(lfgp::LatentFiniteGP, y::NamedTuple{(:f, :y)})
     return logpdf(lfgp.fx, y.f) + logpdf(lfgp.lik(y.f), y.y)
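A hedged sketch of the `LatentGP` pieces described above, using a Bernoulli observation model as the (assumed) likelihood; the `1e-6` is the small jitter the new `Σy` docstring recommends:

```julia
using AbstractGPs, Distributions

# lik maps a vector of latent values to a Distribution over observations.
lik(fv) = Product(Bernoulli.(1 ./ (1 .+ exp.(-fv))))

lf  = LatentGP(GP(Matern32Kernel()), lik, 1e-6)  # small jitter on the latent process
lfx = lf(collect(range(-3.0, 3.0; length=20)))   # LatentFiniteGP

fy = rand(lfx)    # NamedTuple: fy.f latent values, fy.y Bernoulli draws
logpdf(lfx, fy)   # joint log density log p(y, f; x)
```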
13 changes: 8 additions & 5 deletions src/mean_function.jl
@@ -7,11 +7,14 @@ Returns `zero(T)` everywhere.
 """
 struct ZeroMean{T<:Real} <: MeanFunction end
 
-_map(::ZeroMean{T}, x::AbstractVector) where {T} = Zeros{T}(length(x))
+"""
+This is an AbstractGPs-internal workaround for AD issues; ideally we would just extend Base.map
+"""
+_map_meanfunction(::ZeroMean{T}, x::AbstractVector) where {T} = Zeros{T}(length(x))
 
-function ChainRulesCore.rrule(::typeof(_map), m::ZeroMean, x::AbstractVector)
+function ChainRulesCore.rrule(::typeof(_map_meanfunction), m::ZeroMean, x::AbstractVector)
     map_ZeroMean_pullback(Δ) = (NoTangent(), NoTangent(), ZeroTangent())
-    return _map(m, x), map_ZeroMean_pullback
+    return _map_meanfunction(m, x), map_ZeroMean_pullback
 end
 
 ZeroMean() = ZeroMean{Float64}()
@@ -25,7 +28,7 @@ struct ConstMean{T<:Real} <: MeanFunction
     c::T
 end
 
-_map(m::ConstMean, x::AbstractVector) = Fill(m.c, length(x))
+_map_meanfunction(m::ConstMean, x::AbstractVector) = Fill(m.c, length(x))
 
 """
     CustomMean{Tf} <: MeanFunction
@@ -37,4 +40,4 @@ struct CustomMean{Tf} <: MeanFunction
     f::Tf
 end
 
-_map(f::CustomMean, x::AbstractVector) = map(f.f, x)
+_map_meanfunction(f::CustomMean, x::AbstractVector) = map(f.f, x)
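Since `_map` is renamed to `_map_meanfunction` throughout, a quick sketch of what the internal helper computes for each mean type (these are unexported internals, hence the explicit import):

```julia
using AbstractGPs
using AbstractGPs: ZeroMean, ConstMean, CustomMean, _map_meanfunction

x = [0.5, 1.0, 1.5]
_map_meanfunction(ZeroMean(), x)       # Zeros{Float64}(3): lazy all-zero vector
_map_meanfunction(ConstMean(2.0), x)   # Fill(2.0, 3): lazy constant vector
_map_meanfunction(CustomMean(sin), x)  # sin.(x) via map
```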
2 changes: 1 addition & 1 deletion test/base_gp.jl
@@ -8,7 +8,7 @@
     x = collect(range(-1.0, 1.0; length=N))
     x′ = collect(range(-1.0, 1.0; length=N′))
 
-    @test mean(f, x) == AbstractGPs._map(m, x)
+    @test mean(f, x) == AbstractGPs._map_meanfunction(m, x)
     @test cov(f, x) == kernelmatrix(k, x)
     TestUtils.test_internal_abstractgps_interface(rng, f, x, x′)
 end
10 changes: 5 additions & 5 deletions test/mean_function.jl
@@ -9,15 +9,15 @@
     f = ZeroMean{Float64}()
 
     for x in [x]
-        @test AbstractGPs._map(f, x) == zeros(size(x))
+        @test AbstractGPs._map_meanfunction(f, x) == zeros(size(x))
         # differentiable_mean_function_tests(f, randn(rng, P), x)
     end
 
     # Manually verify the ChainRule. Really, this should employ FiniteDifferences, but
    # currently ChainRulesTestUtils isn't up to handling this, so this will have to do
     # for now.
-    y, pb = rrule(AbstractGPs._map, f, x)
-    @test y == AbstractGPs._map(f, x)
+    y, pb = rrule(AbstractGPs._map_meanfunction, f, x)
+    @test y == AbstractGPs._map_meanfunction(f, x)
     Δmap, Δf, Δx = pb(randn(P))
     @test iszero(Δmap)
     @test iszero(Δf)
@@ -31,7 +31,7 @@
     m = ConstMean(c)
 
     for x in [x]
-        @test AbstractGPs._map(m, x) == fill(c, N)
+        @test AbstractGPs._map_meanfunction(m, x) == fill(c, N)
         # differentiable_mean_function_tests(m, randn(rng, N), x)
     end
 end
@@ -41,7 +41,7 @@
     foo_mean = x -> sum(abs2, x)
     f = CustomMean(foo_mean)
 
-    @test AbstractGPs._map(f, x) == map(foo_mean, x)
+    @test AbstractGPs._map_meanfunction(f, x) == map(foo_mean, x)
     # differentiable_mean_function_tests(f, randn(rng, N), x)
 end
 end
2 changes: 1 addition & 1 deletion test/test_util.jl
@@ -73,7 +73,7 @@ end
 Test _very_ basic consistency properties of the mean function `m`.
 """
 function mean_function_tests(m::MeanFunction, x::AbstractVector)
-    @test AbstractGPs._map(m, x) isa AbstractVector
+    @test AbstractGPs._map_meanfunction(m, x) isa AbstractVector
     @test length(ew(m, x)) == length(x)
 end
