
Rational Quadratic Kernel gives wrong results! #3968

Closed
tirthasheshpatel opened this issue Jun 16, 2020 · 4 comments
@tirthasheshpatel
Contributor

Here's the output of the RatQuad kernel on np.array([[1., 2.], [3., 4.]]):

>>> import numpy as np
>>> import pymc3 as pm
>>> x = np.array([[1., 2.], [3., 4.]])
>>> k = pm.gp.cov.RatQuad(1, 1., 1.)
>>> k(x, x).eval()
array([[1.        , 0.33333333],
       [0.33333333, 1.        ]])

This result is wrong. Comparing against tensorflow_probability and a custom implementation, the result should be np.array([[1., 0.2], [0.2, 1.]]):

>>> import numpy as np
>>> import tensorflow_probability as tfp
>>> x = np.array([[1., 2.], [3., 4.]])
>>> k = tfp.python.math.psd_kernels.RationalQuadratic(1., 1., 1.)
>>> k.matrix(x, x)
<tf.Tensor: shape=(2, 2), dtype=float32, numpy=
array([[1.        , 0.19999999],
       [0.19999999, 1.        ]], dtype=float32)>

I was not able to identify the problem from a brief look at the source. This came up while I was testing some PyMC4 kernels.

Versions and main components

  • PyMC3 Version: 3.8
  • Theano Version: 1.0.4
  • Python Version: 3.6.9
  • Operating system: Linux
  • How did you install PyMC3: git source
@junpenglao
Member

It should be

k = pm.gp.cov.RatQuad(input_dim=2, alpha=1., ls=1.)

otherwise the second input dimension is ignored.
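This accounts for both numbers above: the rational quadratic kernel k(x, x') = (1 + d²/(2αℓ²))^(−α), with α = ℓ = 1, gives 1/5 = 0.2 on the full 2-column input (d² = 8) and 1/3 on the first column alone (d² = 4). A quick NumPy sketch (not PyMC3 code) reproducing both values:

```python
import numpy as np

def rat_quad(X, alpha=1.0, ls=1.0):
    # Rational quadratic kernel: (1 + d^2 / (2 * alpha * ls^2))^(-alpha),
    # where d^2 is the squared Euclidean distance between rows of X.
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    return (1.0 + d2 / (2.0 * alpha * ls ** 2)) ** (-alpha)

x = np.array([[1., 2.], [3., 4.]])
print(rat_quad(x))          # both columns used -> off-diagonal 0.2
print(rat_quad(x[:, :1]))   # first column only -> off-diagonal 0.33333333
```

So with input_dim=1, PyMC3 silently drops the second column, which is exactly the 1/3 result reported above.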

@junpenglao
Member

FWIW though, I am not sure the slicing in https://github.com/pymc-devs/pymc3/blob/683faaa9d7e58701f0689b1a1fd4080151f7e057/pymc3/gp/cov.py#L396 is obvious to users - maybe a warning should be raised if part of the input is ignored?
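One possible shape of such a warning, as a hedged sketch (the helper name check_input_dim and the exact message are hypothetical, not existing PyMC3 API):

```python
import warnings

import numpy as np

def check_input_dim(X, input_dim):
    # Hypothetical helper: warn when trailing columns of X would be
    # silently dropped because input_dim is smaller than X's width.
    n_cols = np.asarray(X).shape[-1]
    if n_cols > input_dim:
        warnings.warn(
            f"Only the first {input_dim} of {n_cols} input columns will be "
            "used; increase input_dim or set active_dims explicitly."
        )

check_input_dim(np.array([[1., 2.], [3., 4.]]), input_dim=1)  # warns
```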

@tirthasheshpatel
Contributor Author

Yeah! My bad... I got confused between the active_dims and input_dim arguments!

@tirthasheshpatel
Contributor Author

Will try to work on it today!
