Adding DirichletProcess function #121

Open · wants to merge 1 commit into base: main
282 changes: 282 additions & 0 deletions notebooks/dirichlet-process-numpy.ipynb

Large diffs are not rendered by default.

203 changes: 203 additions & 0 deletions notebooks/dirichlet-process-pymc.ipynb

Large diffs are not rendered by default.

Binary file added notebooks/dp-example-9.png
Binary file added notebooks/dp-posterior-multiple-samples.png
Binary file added notebooks/dp-posterior-single-sample.png
Binary file added notebooks/dp-posterior.png
2 changes: 1 addition & 1 deletion pymc_experimental/__init__.py
@@ -26,6 +26,6 @@
_log.addHandler(handler)


from pymc_experimental import distributions, gp, utils
from pymc_experimental import distributions, dp, gp, utils
from pymc_experimental.inference.fit import fit
from pymc_experimental.marginal_model import MarginalModel
18 changes: 18 additions & 0 deletions pymc_experimental/dp/__init__.py
@@ -0,0 +1,18 @@
# Copyright 2020 The PyMC Developers
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.


from pymc_experimental.dp.dp import DirichletProcess

__all__ = ["DirichletProcess"]
83 changes: 83 additions & 0 deletions pymc_experimental/dp/dp.py
@@ -0,0 +1,83 @@
# Copyright 2020 The PyMC Developers
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.

import numpy as np
import pymc as pm
import pytensor.tensor as pt
from pymc.model import modelcontext

__all__ = ["DirichletProcess"]


def DirichletProcess(name, alpha, base_dist, K, observed=None, sbw_name=None, atoms_name=None):
r"""
Truncated Dirichlet Process for Bayesian Nonparametric Density Modelling

Parameters
----------
alpha: tensor_like of float
Concentration parameter (alpha > 0) controlling the size of the "sticks", i.e. the weights
generated by the stick-breaking process. Ideally, alpha should be given a prior rather than
fixed to a constant.
base_dist: single batched distribution
The base distribution for a Dirichlet Process. `base_dist` must have shape (K + 1,).

Review comment on `base_dist` (ricardoV94, Member, Mar 29, 2023):
Would it make sense to use the same API as in other distribution factories, where the user passes a .dist variable and we resize it ourselves (and in this case, register in the model as well)?
https://github.com/pymc-devs/pymc/blob/f3ce16f2606f523137c27466069f1ab737626f21/pymc/distributions/censored.py#L55-L59

Reply (Member Author):
Yes, good idea.
K: int
The truncation parameter for the number of components of the Dirichlet Process Mixture.
K should follow the Goldilocks Principle: high enough to capture all plausible clusters,
yet not so high that it induces a heavy computational burden for sampling.
"""
if sbw_name is None:
sbw_name = "sbw"

if atoms_name is None:
atoms_name = "atoms"

if observed is not None:
observed = np.asarray(observed)

if observed.ndim > 1:
raise ValueError("Multi-dimensional Dirichlet Processes are not yet supported.")

N = observed.shape[0]

try:
modelcontext(None)
except TypeError:
raise ValueError(
"PyMC Dirichlet Processes are only available under a pm.Model() context manager."
)

sbw = pm.StickBreakingWeights(sbw_name, alpha, K)
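# StickBreakingWeights yields a (K + 1)-simplex of mixture weights w_k = beta_k * prod_{j<k} (1 - beta_j)
# with beta_k ~ Beta(1, alpha); the final weight is the remaining stick length, so the weights sum to 1.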

if observed is None:
return sbw, pm.Deterministic(atoms_name, base_dist)

"""
idx samples a new atom from `base_dist` with probability alpha/(alpha + N)
and an existing atom from `observed` with probability N/(alpha + N).

If a new atom is not sampled, an atom from `observed` is sampled uniformly.
"""
idx = pm.Bernoulli("idx", p=alpha / (alpha + N), shape=(K + 1,))
atom_selection = pm.Categorical("atom_selection", p=[1 / N] * N, shape=(K + 1,))

atoms = pm.Deterministic(
atoms_name,
var=pt.stack([pt.constant(observed)[atom_selection], base_dist], axis=-1)[
pt.arange(K + 1), idx
],
)

return sbw, atoms

Review comment on the `atoms` construction (Member Author):
@ricardoV94 Following our conversation a few weeks (or months?) ago, I was able to make this work. Thanks for the ideas.
However, I believe that posterior predictive sampling would require defining a custom distribution class. I'm not so sure at this point; this would likely need some creativity and possibly revisiting the sketch that you thought about a while back.
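For orientation, a minimal usage sketch of the factory as added in this PR. Everything outside the `DirichletProcess(...)` call is an illustrative assumption rather than part of the diff: the synthetic data `y`, the truncation level `K`, the `pm.Gamma` prior on `alpha`, and the registered `pm.Normal` base distribution (whether `base_dist` should instead be an unregistered `.dist` variable is the open API question in the review thread above).

import numpy as np
import pymc as pm

from pymc_experimental.dp import DirichletProcess

y = np.random.default_rng(0).normal(loc=3.0, scale=1.0, size=50)  # hypothetical observed data
K = 20  # truncation level: high enough for plausible clusters, low enough to sample cheaply

with pm.Model():
    # Prior on the concentration, as the docstring recommends (hypothetical choice of prior).
    alpha = pm.Gamma("alpha", alpha=1.0, beta=1.0)
    # Base distribution with the required shape (K + 1,); a registered RV is assumed here.
    base_dist = pm.Normal("base_dist", mu=0.0, sigma=5.0, shape=(K + 1,))
    sbw, atoms = DirichletProcess("dp", alpha, base_dist, K, observed=y)
    idata = pm.sample()  # discrete idx/atom_selection are assigned compatible step methods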