
Enabling vector autoregressions #4665

Open
ckrapu opened this issue Apr 24, 2021 · 1 comment

Comments

ckrapu (Contributor) commented Apr 24, 2021

The AR distribution appears to be nearly complete for use as a true vector autoregression parameterized by p cross-series coefficient matrices, each of shape (d, d). The main required change is to use a dot product instead of elementwise multiplication here. However, I am unable to determine the role of the constant argument, or why it necessitates the calculation of eps = value[self.p :] - self.rho[0] - x when, under the AR/VAR model, eps is assumed to follow a diagonal normal distribution.
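The change described above can be sketched in standalone NumPy (a hypothetical illustration, not PyMC's actual implementation; the shapes p, d, T, the coefficient array A, and the data are all assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed shapes: p lags, d series, T time steps.
p, d, T = 2, 3, 100
A = rng.normal(scale=0.1, size=(p, d, d))  # one (d, d) coefficient matrix per lag
value = rng.normal(size=(T, d))            # observed multivariate series

# A scalar AR multiplies each lagged value elementwise by a scalar rho[i];
# the VAR generalization replaces that elementwise product with a dot
# product, so lag i contributes A[i] @ value[t - i - 1] at time t.
x = np.zeros((T - p, d))
for i in range(p):
    # Lagged slice aligned with value[p:].
    x += value[p - i - 1 : T - i - 1] @ A[i].T

# Residuals, assumed iid (diagonal) normal under the VAR model.
eps = value[p:] - x
```

Each row of `eps` has shape (d,), so the log-likelihood could then score the residuals under a diagonal normal, mirroring the scalar AR case.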

ricardoV94 (Member) commented:

There is a code example of vector autoregression here: https://www.pymc-labs.io/blog-posts/bayesian-vector-autoregression/

> However, I am unable to determine the role of the constant argument and why it necessitates the calculation of eps = value[self.p :] - self.rho[0] - x

The constant is just an intercept term: y = rho[0] + x, where x is the convolution of the lagged values with the coefficients rho[1:]. The AR was refactored to V4 in #5734.
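The intercept mechanics described above can be sketched for a scalar AR(2) with constant (a minimal illustration; the coefficient and data values are made up, not from PyMC):

```python
import numpy as np

# rho[0] is the intercept c; rho[1:] are the lag coefficients phi_1, phi_2.
rho = np.array([0.5, 0.7, -0.2])
p = len(rho) - 1
value = np.array([1.0, 0.8, 0.9, 1.1, 1.0, 0.95])

# Convolve the lagged values with rho[1:]: lag i contributes
# rho[i + 1] * value[t - i - 1] at each time t >= p.
x = sum(
    rho[i + 1] * value[p - i - 1 : len(value) - i - 1]
    for i in range(p)
)

# Residuals subtract both the intercept and the lag contribution,
# matching eps = value[self.p :] - self.rho[0] - x in the question.
eps = value[p:] - rho[0] - x
```

Without the constant, the rho[0] term would simply be dropped and every coefficient in rho would act as a lag coefficient instead.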
