Hi everyone, I'm using GPyTorch for Bayesian optimization (BO) in a time-varying context. The problem with BO is that the samples/training points are highly correlated through the acquisition function, and therefore hyperparameter learning is super hard/painful since the samples are not i.i.d. To guide the search for good hyperparameters I use hyperpriors and constraints. If the problem is multidimensional, I want to specify individual lengthscale hyperpriors and constraints for each dimension. So far, I have not found a way to do this. Thank you for any help or hints! :)
So in principle this should be possible if you specify a batched torch distribution as a prior. E.g. if you have a two-dimensional problem, you should be able to specify sth like a `GammaPrior(c, r)`, where `c` and `r` are two-element tensors with values corresponding to the respective dimension. There are some known issues with multi-dimensional priors though (#1317, #1318) that may need to be fixed first for this to work fully.

As to having more complex priors (e.g. one distribution for one dimension of the lengthscale parameter, and another type of distribution for another dimension), that will probably require defining some wrapper prior that can internally apply the correct prior to the respective parameter(s).