Learning distributions over hyperparameters #1508
-
I've been looking at the tutorial here for the Log-Gaussian Cox process, and I've put a lengthscale prior on my kernel. I'm interested in the posterior distribution over the lengthscale. Could someone please explain how I would go about getting this? Right now it looks to only be giving the MAP estimate, but I would like the entire distribution.
Replies: 3 comments
-
You could do fully Bayesian inference with MCMC, similar to what is done in https://github.com/cornellius-gp/gpytorch/blob/master/examples/01_Exact_GPs/GP_Regression_Fully_Bayesian.ipynb
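For reference, a condensed sketch of what that notebook does: register priors on the hyperparameters you want posteriors over, then run NUTS over a Pyro model that samples the GP from those priors. The model class, toy data, and UniformPrior bounds below are placeholders, not from this thread.

```python
import torch
import gpytorch
import pyro
from pyro.infer.mcmc import NUTS, MCMC

class ExactGPModel(gpytorch.models.ExactGP):
    def __init__(self, train_x, train_y, likelihood):
        super().__init__(train_x, train_y, likelihood)
        self.mean_module = gpytorch.means.ConstantMean()
        self.covar_module = gpytorch.kernels.ScaleKernel(gpytorch.kernels.RBFKernel())

    def forward(self, x):
        return gpytorch.distributions.MultivariateNormal(
            self.mean_module(x), self.covar_module(x))

train_x = torch.linspace(0, 1, 20)
train_y = torch.sin(6.0 * train_x)
likelihood = gpytorch.likelihoods.GaussianLikelihood()
model = ExactGPModel(train_x, train_y, likelihood)

# Register a prior on each hyperparameter you want a posterior over
# (the same can be done for outputscale, noise, mean constant, etc.).
model.covar_module.base_kernel.register_prior(
    "lengthscale_prior", gpytorch.priors.UniformPrior(0.01, 2.0), "lengthscale")

model.train()
likelihood.train()

def pyro_model(x, y):
    # Sample hyperparameters from their priors, then score the data.
    with gpytorch.settings.fast_computations(False, False, False):
        sampled_model = model.pyro_sample_from_prior()
        output = sampled_model.likelihood(sampled_model(x))
        pyro.sample("obs", output, obs=y)

mcmc = MCMC(NUTS(pyro_model), num_samples=100, warmup_steps=100)
mcmc.run(train_x, train_y)
samples = mcmc.get_samples()  # posterior draws, including the lengthscale
```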
-
Unfortunately, I think fully Bayesian inference right now only works for exact GPs... A simple (but maybe hacky) solution is to make the GPyTorch lengthscale un-learnable:

```python
model.covar_module.base_kernel.raw_lengthscale.requires_grad_(False)
```

Instead, have the model/guide functions pre-process your data by dividing by a lengthscale.

model:

```python
lengthscale = pyro.sample("lengthscale", lengthscale_prior_dist)
x = x / lengthscale
```

guide:

```python
lengthscale = pyro.sample("lengthscale", lengthscale_variational_dist)
```

In the evaluation code, you'd have to be sure to appropriately pre-process your data:

```python
lengthscale = lengthscale_variational_dist()
times = times / lengthscale
function_dist = model(times)
intensity_s…
```
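For concreteness, here is a minimal, self-contained toy version of that pattern in plain Pyro SVI (not the tutorial's LGCP model): a GP regression whose RBF kernel has a fixed unit lengthscale, so the sampled lengthscale enters only by rescaling the inputs. The LogNormal prior and guide family, the parameter names ls_loc/ls_scale, and the toy data are all illustrative assumptions.

```python
import torch
import pyro
import pyro.distributions as dist
from pyro.infer import SVI, Trace_ELBO
from pyro.optim import Adam

def rbf_kernel(x1, x2):
    # Unit-lengthscale RBF; the lengthscale acts only via input rescaling.
    sq_dist = (x1.unsqueeze(-1) - x2.unsqueeze(-2)) ** 2
    return torch.exp(-0.5 * sq_dist)

def model(x, y):
    # Prior over the lengthscale (LogNormal is an assumption).
    lengthscale = pyro.sample("lengthscale", dist.LogNormal(0.0, 1.0))
    x = x / lengthscale  # pre-process instead of learning a kernel lengthscale
    K = rbf_kernel(x, x) + 1e-2 * torch.eye(x.size(0))  # jitter/noise term
    pyro.sample("y", dist.MultivariateNormal(torch.zeros(x.size(0)),
                                             covariance_matrix=K), obs=y)

def guide(x, y):
    # Variational posterior over the lengthscale, with learnable parameters.
    loc = pyro.param("ls_loc", torch.tensor(0.0))
    scale = pyro.param("ls_scale", torch.tensor(0.1),
                       constraint=dist.constraints.positive)
    pyro.sample("lengthscale", dist.LogNormal(loc, scale))

x = torch.linspace(0, 1, 30)
y = torch.sin(6.0 * x)
svi = SVI(model, guide, Adam({"lr": 0.01}), Trace_ELBO())
for _ in range(1000):
    svi.step(x, y)
```

After fitting, the LogNormal(ls_loc, ls_scale) guide is the approximate posterior over the lengthscale, i.e. the distribution the question asks for.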
-
Thanks @gpleiss, this is exactly what I was looking for!