GradVI - Gradient Descent Variational Inference

GradVI provides tools for Bayesian variational inference using gradient descent methods. It is modular software that provides the boilerplate for variational inference: the user specifies a prior family of distributions and a task (e.g., linear regression, trend filtering), supplies observed data, and runs posterior inference. The goal is to learn the parameters of the corresponding variational posterior family.

Currently, two prior distributions are provided within the software: (1) the adaptive shrinkage (ASH) prior and (2) the point-normal prior. For any other choice, the user has to define the prior distribution following the examples provided within the framework.
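A provided prior is constructed as an object and passed to an inference class (see the Quick Start below). Here is a minimal sketch of constructing the ASH prior, matching the Quick Start; the point-normal class name in the comment is an assumption and may differ in the actual package:

import numpy
from gradvi.priors import Ash

# The ASH prior is a scale mixture parameterized by a grid of
# mixture standard deviations sk.
k = 10
sk = numpy.power(2.0, numpy.arange(k) / k) - 1
prior = Ash(sk, scaled = True)

# Hypothetical: the import below assumes the point-normal prior is
# exposed as PointNormal; check gradvi.priors for the actual name.
# from gradvi.priors import PointNormal
# prior = PointNormal()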

Reference

The theory behind GradVI is available on arXiv.

Related software

  • mr.ash.alpha: A coordinate ascent algorithm for multiple linear regression with the ASH prior.
  • mr-ash-pen: A fast FORTRAN core for GradVI multiple regression using the ASH prior.

Installation

The software can be installed directly from GitHub using pip:

pip install git+https://github.com/stephenslab/gradvi

For development, download this repository and install using the -e flag:

git clone https://github.com/stephenslab/gradvi.git # or use the SSH link
cd gradvi
pip install -e .
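Either way, a quick check that the installation succeeded is to import the package (this only verifies that the import works, nothing more):

import gradvi
print(gradvi.__file__)  # path of the installed package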

Quick Start

The software provides several classes for performing variational inference. Try running the following small example, which illustrates how to use some of these classes.

Example: Linear regression

Simulate some data:

import numpy
import matplotlib.pyplot
from gradvi.priors import Ash
from gradvi.inference import LinearRegression

n = 100       # number of samples
p = 200       # number of predictors
pcausal = 20  # number of predictors with nonzero effects
s2 = 1.4      # residual variance
k = 10        # number of mixture components in the ASH prior
sk = numpy.power(2.0, numpy.arange(k) / k) - 1  # prior mixture standard deviations
numpy.random.seed(100)

X = numpy.random.normal(0, 1, size = n * p).reshape(n, p)
b = numpy.zeros(p)
b[:pcausal] = numpy.random.normal(0, 1, size = pcausal)
err = numpy.random.normal(0, numpy.sqrt(s2), size = n)
y = numpy.dot(X, b) + err

Perform regression:

prior = Ash(sk, scaled = True)  # scaled ASH prior with mixture sds sk
gvlin = LinearRegression(debug = False, display_progress = True)
gvlin.fit(X, y, prior)
b_hat = gvlin.coef              # estimated regression coefficients
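As a quick numeric sanity check (plain NumPy, nothing GradVI-specific), compute the root mean squared error between the true and estimated coefficients:

rmse = numpy.sqrt(numpy.mean((b - b_hat) ** 2))
print(f"RMSE between true and estimated coefficients: {rmse:.4f}")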

Compare the true regression coefficients against the estimated coefficients:

matplotlib.pyplot.scatter(b, b_hat, s = 10, color = "black")
matplotlib.pyplot.axline((0, 0), slope = 1, color = "magenta", linestyle = ":")
matplotlib.pyplot.xlabel("true")
matplotlib.pyplot.ylabel("estimated")
matplotlib.pyplot.show()

Credits

The GradVI Python package was developed by Saikat Banerjee at the University of Chicago, with contributions from Peter Carbonetto and Matthew Stephens.
