GPyTorch prediction for fixed hyperparameter massively underestimates variances #2535
-
Your code isn't using the likelihood noise at all when predicting. Try fixing that first and see what happens?
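For reference, a minimal sketch of what this looks like in GPyTorch (assuming `model`, `likelihood`, and `test_x` are already defined): passing the latent predictions through the likelihood is what adds the observation noise to the predictive variances.

```python
import torch
import gpytorch

model.eval()
likelihood.eval()
with torch.no_grad(), gpytorch.settings.fast_pred_var():
    f_dist = model(test_x)              # posterior over the latent function (no observation noise)
    y_dist = likelihood(model(test_x))  # predictive posterior, includes the likelihood noise
    lower, upper = y_dist.confidence_region()  # mean +/- 2 standard deviations
```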
-
The intervals from GPyTorch are still too small when adding the likelihood noise. But I got sensible CrIs in GPJax with the code at the bottom. I would like to use GPyTorch if possible because it has some really useful features for the task I'm working on. I have a feeling it is to do with the scale of the variables. The docs say to set fixed hyperparameters directly:

```python
model.likelihood.noise_covar.noise = 34.
model.covar_module.outputscale = 62.
model.covar_module.base_kernel.lengthscale = 0.01
```

GPJax code:

```python
kern = gpx.kernels.Matern52(active_dims=[0, 1], lengthscale=0.01, variance=62.0**2)
meanf = gpx.mean_functions.Zero()
prior = gpx.gps.Prior(mean_function=meanf, kernel=kern)
likelihood = gpx.likelihoods.Gaussian(num_datapoints=D.n, obs_stddev=34.0)
posterior = prior * likelihood
```
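For comparison, a rough GPyTorch equivalent of that GPJax setup might look like the following. This is only a sketch: `train_x` and `train_y` are assumed to exist, and the Matern 5/2 kernel over two input dimensions is inferred from the GPJax `active_dims=[0, 1]` above.

```python
import gpytorch

class FixedHyperGP(gpytorch.models.ExactGP):
    def __init__(self, train_x, train_y, likelihood):
        super().__init__(train_x, train_y, likelihood)
        self.mean_module = gpytorch.means.ZeroMean()
        # Matern 5/2 kernel on the two input dimensions, wrapped in a ScaleKernel
        self.covar_module = gpytorch.kernels.ScaleKernel(
            gpytorch.kernels.MaternKernel(nu=2.5, active_dims=(0, 1))
        )

    def forward(self, x):
        return gpytorch.distributions.MultivariateNormal(
            self.mean_module(x), self.covar_module(x)
        )

likelihood = gpytorch.likelihoods.GaussianLikelihood()
model = FixedHyperGP(train_x, train_y, likelihood)
```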
-
If 34 is the standard deviation, which I'm assuming from the GPJax variable name, you'll need to square it for sure: `noise` in GPyTorch refers to the variance of the distribution. It also looks like you need to be squaring the outputscale, since `ScaleKernel` computes `outputscale * k(x, x')`.
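Concretely, under the assumption that 34 and 62 are standard deviations, the fixed values from above would become (a sketch against the `model` defined earlier):

```python
# GPyTorch's `noise` and `outputscale` are variances, so square the
# standard deviations before assigning them.
model.likelihood.noise_covar.noise = 34.0 ** 2     # observation noise variance (1156)
model.covar_module.outputscale = 62.0 ** 2         # signal variance (3844)
model.covar_module.base_kernel.lengthscale = 0.01  # lengthscales are not squared
```

With those values, the GPyTorch predictive intervals should line up with the GPJax ones above, since both models then use the same kernel variance and observation noise.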