This repository contains an implementation of a normalizing flow for conditional density estimation using Bernstein polynomials, as proposed in:
Sick, Beate, Torsten Hothorn, and Oliver Dürr: "Deep transformation models: Tackling complex regression problems with neural network based transformation models", 2020 (online).
The implementation uses the `tfp.bijectors.Bijector` interface to benefit from the powerful TensorFlow Probability framework.
Traditional regression models assume normality and homoscedasticity of the data, i.e. the residuals for each input value are expected to be normally distributed with constant variance. However, the shape of the data distribution in many real use cases is much more complex.
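In symbols, this standard assumption (not specific to this repository) reads:

$$y = \mu(x) + \varepsilon, \qquad \varepsilon \sim \mathcal{N}(0, \sigma^2),$$

i.e. $y \mid x \sim \mathcal{N}(\mu(x), \sigma^2)$ with a variance $\sigma^2$ that does not depend on $x$.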
The classical data set of waiting times between eruptions of the Old Faithful geyser in Yellowstone National Park serves as an example.
*(Figure: fit to the Old Faithful data, Gaussian model on the left vs. normalizing flow on the right.)*
As shown in the left figure, the normality assumption is clearly violated by the bimodal nature of the data. However, the proposed transformation model has the flexibility to adapt to this complexity.
To start using my code, follow these simple steps.
Pull and install it directly from git using pip:
```bash
pip install git+https://github.com/MArpogaus/TensorFlow-Probability-Bernstein-Polynomial-Bijector.git
```
Or clone this repository and install it from there:
```bash
git clone https://github.com/MArpogaus/TensorFlow-Probability-Bernstein-Polynomial-Bijector.git ./bernstein_flow
cd bernstein_flow
pip install -e .
```
Pip should take care of installing the required dependencies on its own. For completeness, these are the main packages used in the implementation:

- TensorFlow
- TensorFlow Probability
This Python package consists of four main components:

- `bernstein_flow.bijectors.BernsteinBijector`: The implementation of Bernstein polynomials using the `tfb.Bijector` interface for transformations of `tfd.Distribution` samples.
- `bernstein_flow.distributions.BernsteinFlow`: The implementation of a `tfd.TransformedDistribution` using the Bernstein polynomials as the bijector.
- `bernstein_flow.losses.BernsteinFlowLoss`: The implementation of a `tfk.losses.Loss` function to calculate the negative log-likelihood using the `BernsteinFlow` distribution.
- `bernstein_flow.util.visualization`: Some convenient helper functions for visualization.
A `tfd.TransformedDistribution` using the `BernsteinBijector` is provided in the module `bernstein_flow.distributions.BernsteinFlow`:

```python
from bernstein_flow.distributions import BernsteinFlow
```

Use it like any other distribution, e.g. as a `tfpl.DistributionLambda`.
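For illustration, here is a minimal sketch of using the distribution directly; the parameter vector `pv` is a hypothetical random stand-in for a network output (its length of 9 matches the flow model below):

```python
import tensorflow as tf
from bernstein_flow.distributions import BernsteinFlow

pv = tf.random.normal((1, 9))  # hypothetical unconstrained parameter vector
dist = BernsteinFlow(pv)       # behaves like any other tfd.Distribution
samples = dist.sample(5)       # draw samples
log_probs = dist.log_prob(samples)
```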
The two example plots shown above have been generated using the following two models.
```python
import tensorflow as tf
import tensorflow_probability as tfp
from tensorflow.keras.layers import Dense, InputLayer

tfd = tfp.distributions

gauss_model = tf.keras.Sequential()
gauss_model.add(InputLayer(input_shape=(1,)))
# Here could come a gigantic network
gauss_model.add(Dense(2))  # mean and std of the Gaussian
gauss_model.add(tfp.layers.DistributionLambda(
    lambda pv: tfd.Normal(loc=pv[:, 0],
                          scale=1e-3 + tf.math.softplus(0.05 * pv[:, 1]))))
```
```python
flow_model = tf.keras.Sequential()
flow_model.add(InputLayer(input_shape=(1,)))
# Here could come a gigantic network
flow_model.add(Dense(4 + 5))  # Bernstein coefficients and 2 times scale and shift
flow_model.add(tfp.layers.DistributionLambda(BernsteinFlow))
```
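Since both models output a distribution object, they can be trained by minimizing the negative log-likelihood. A minimal sketch, where `x_train` and `y_train` are hypothetical rank-1 training tensors; the package's `BernsteinFlowLoss` implements the same idea as a `tfk.losses.Loss`:

```python
def nll(y, dist):
    # Keras passes the distribution returned by DistributionLambda as `dist`,
    # so the observed targets can be scored directly.
    return -dist.log_prob(y)

flow_model.compile(optimizer=tf.keras.optimizers.Adam(learning_rate=0.01), loss=nll)
flow_model.fit(x_train, y_train, epochs=100)  # x_train, y_train: hypothetical data
```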
You can find two examples in the `ipynb` directory:

- `TheoreticalBackground.ipynb`: Some explanation of the theoretical fundamentals.
- `Gaussian_vs_Transformation_Model.ipynb`: The bimodal data example shown in the figures above.
If you have any technical issues or suggestions regarding my implementation, please feel free to either contact me, open an issue, or send me a pull request:
- Fork the Project
- Create your Feature Branch (`git checkout -b feature/AmazingFeature`)
- Commit your Changes (`git commit -m 'Add some AmazingFeature'`)
- Push to the Branch (`git push origin feature/AmazingFeature`)
- Open a Pull Request
Any contributions are greatly appreciated.
Distributed under the Apache License 2.0.