Circular kernel #4082

Merged (21 commits, Oct 10, 2020)
Changes from 18 commits
1 change: 1 addition & 0 deletions RELEASE-NOTES.md
Original file line number Diff line number Diff line change
Expand Up @@ -11,6 +11,7 @@

### New features
- `sample_posterior_predictive_w` can now feed on `xarray.Dataset` - e.g. from `InferenceData.posterior`. (see [#4042](https://github.com/pymc-devs/pymc3/pull/4042))
- Added `pymc3.gp.cov.Circular` kernel for Gaussian Processes on circular domains, e.g. the unit circle (see [#4082](https://github.com/pymc-devs/pymc3/pull/4082)).
- Add MLDA, a new stepper for multilevel sampling. MLDA can be used when a hierarchy of approximate posteriors of varying accuracy is available, offering improved sampling efficiency especially in high-dimensional problems and/or where gradients are not available (see [#3926](https://github.com/pymc-devs/pymc3/pull/3926))
- Change the SMC Metropolis kernel to an independent Metropolis kernel (see [#4115](https://github.com/pymc-devs/pymc3/pull/4115))

Expand Down
632 changes: 632 additions & 0 deletions docs/source/notebooks/GP-Circular.ipynb

Large diffs are not rendered by default.

234 changes: 182 additions & 52 deletions docs/source/notebooks/GP-MeansAndCovs.ipynb

Large diffs are not rendered by default.

55 changes: 55 additions & 0 deletions pymc3/gp/cov.py
Expand Up @@ -299,6 +299,61 @@ def full(self, X, Xs=None):
return tt.alloc(0.0, X.shape[0], Xs.shape[0])


class Circular(Covariance):
R"""
Circular Kernel.

.. math::

k_g(x, y) = W_c(\operatorname{dist}_{\mathit{geo}}(x, y)),

with

.. math::

W_c(t) = \left(1 + \tau \frac{t}{c}\right)\left(1-\frac{t}{c}\right)^\tau_+

where :math:`c` is the maximum value of :math:`t` and :math:`\tau \ge 4`.
:math:`\tau` controls the correlation strength: larger :math:`\tau` leads to less smooth functions.
See [1]_ for more explanation and use cases.

Parameters
----------
period : scalar
    defines the circular interval :math:`[0, \mathit{period})`
tau : scalar
    :math:`\tau \ge 4` defines the correlation strength: the larger
    :math:`\tau`, the weaker the correlation. The minimum value is :math:`4`

References
----------
.. [1] Espéran Padonou and O. Roustant, "Polar Gaussian Processes for Predicting on Circular Domains",
   https://hal.archives-ouvertes.fr/hal-01119942v1/document
"""

def __init__(self, input_dim, period, tau=4, active_dims=None):
super().__init__(input_dim, active_dims)
self.c = tt.as_tensor_variable(period / 2)
self.tau = tau

def dist(self, X, Xs):
if Xs is None:
Xs = tt.transpose(X)
else:
Xs = tt.transpose(Xs)
return tt.abs_((X - Xs + self.c) % (self.c * 2) - self.c)

def weinland(self, t):
return (1 + self.tau * t / self.c) * tt.clip(1 - t / self.c, 0, np.inf) ** self.tau

def full(self, X, Xs=None):
X, Xs = self._slice(X, Xs)
return self.weinland(self.dist(X, Xs))

def diag(self, X):
return tt.alloc(1.0, X.shape[0])


class Stationary(Covariance):
r"""
Base class for stationary kernels/covariance functions.
Expand Down
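For orientation, the computation the new kernel performs can be sketched in plain NumPy (a hypothetical standalone helper for illustration only; the Theano implementation in the diff above is authoritative):

```python
import numpy as np

def circular_cov(X, Xs=None, period=1.0, tau=4):
    """NumPy sketch of pm.gp.cov.Circular for 1-D column inputs."""
    c = period / 2.0  # half-period, as in Circular.__init__
    Xs = X if Xs is None else Xs
    # Geodesic distance on the circle, mirroring Circular.dist
    t = np.abs((X - Xs.T + c) % (2 * c) - c)
    # Weinland function W_c(t), mirroring Circular.weinland
    return (1 + tau * t / c) * np.clip(1 - t / c, 0, np.inf) ** tau

X = np.linspace(0, 1, 10)[:, None]
K = circular_cov(X, period=1.0, tau=4)
print(round(K[0, 1], 6))  # 0.691239, the etalon value used in the tests below
```

The diagonal is identically 1, since :math:`t = 0` gives :math:`W_c(0) = 1`, which is why `diag` can simply allocate ones.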
28 changes: 28 additions & 0 deletions pymc3/tests/test_gp.py
Original file line number Diff line number Diff line change
Expand Up @@ -1189,3 +1189,31 @@ def test_plot_gp_dist_warn_nan(self):
pm.gp.util.plot_gp_dist(ax, x=np.linspace(0, 50, X), samples=samples)
plt.close()
pass


class TestCircular:
def test_1d_tau1(self):
X = np.linspace(0, 1, 10)[:, None]
etalon = 0.600881
with pm.Model():
cov = pm.gp.cov.Circular(1, 1, tau=5)
K = theano.function([], cov(X))()
npt.assert_allclose(K[0, 1], etalon, atol=1e-3)
K = theano.function([], cov(X, X))()
npt.assert_allclose(K[0, 1], etalon, atol=1e-3)
# check diagonal
Kd = theano.function([], cov(X, diag=True))()
npt.assert_allclose(np.diag(K), Kd, atol=1e-5)

def test_1d_tau2(self):
X = np.linspace(0, 1, 10)[:, None]
etalon = 0.691239
with pm.Model():
cov = pm.gp.cov.Circular(1, 1, tau=4)
K = theano.function([], cov(X))()
npt.assert_allclose(K[0, 1], etalon, atol=1e-3)
K = theano.function([], cov(X, X))()
npt.assert_allclose(K[0, 1], etalon, atol=1e-3)
# check diagonal
Kd = theano.function([], cov(X, diag=True))()
npt.assert_allclose(np.diag(K), Kd, atol=1e-5)
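The tests above check fixed reference values; the defining property of the kernel, wrap-around on the circle, can also be checked directly with a small NumPy sketch of the geodesic distance (hypothetical helper, mirroring `Circular.dist` above):

```python
import numpy as np

def circ_dist(x, y, period=1.0):
    # Geodesic distance on a circle of circumference `period`;
    # c is the half-period, as in Circular.__init__.
    c = period / 2.0
    return np.abs((x - y + c) % (2 * c) - c)

print(round(circ_dist(0.0, 0.95), 6))  # 0.05: endpoints of [0, 1) nearly meet
print(round(circ_dist(0.0, 0.5), 6))   # 0.5: antipodal points, maximal distance
```

Points near 0 and near `period` are therefore strongly correlated under the kernel, which is exactly what distinguishes it from a stationary kernel on the real line.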