Custom GP as linear combination of two GPs #1568
fonnesbeck asked this question in Q&A · Unanswered
Replies: 2 comments 8 replies
-
I have not made any headway on this. I assume it is not possible?
-
Rather than multiplying the MVN objects, you need to multiply the kernels together instead:

```python
import gpytorch
from gpytorch.means import ConstantMean
from gpytorch.kernels import ScaleKernel, RBFKernel, InducingPointKernel
from gpytorch.distributions import MultivariateNormal


class GPRegressionModel(gpytorch.models.ExactGP):
    def __init__(self, train_x, train_y, likelihood):
        super(GPRegressionModel, self).__init__(train_x, train_y, likelihood)
        self.mean_module = ConstantMean()
        # First GP: RBF kernel over the xy coordinates (columns 0 and 1)
        self.base_covar_module_0 = ScaleKernel(RBFKernel(ard_num_dims=2, active_dims=[0, 1]))
        self.covar_module_0 = InducingPointKernel(
            self.base_covar_module_0, inducing_points=train_x[:200, :], likelihood=likelihood
        )
        # Second GP: same structure; its contribution is masked in forward()
        self.base_covar_module_1 = ScaleKernel(RBFKernel(ard_num_dims=2, active_dims=[0, 1]))
        self.covar_module_1 = InducingPointKernel(
            self.base_covar_module_1, inducing_points=train_x[:200, :], likelihood=likelihood
        )

    def forward(self, x):
        mean_0 = self.mean_module(x)
        covar_0 = self.covar_module_0(x)
        # Multiply the second kernel elementwise by the outer product of the
        # mask column, zeroing its contribution wherever the mask is 0
        covar_1 = self.covar_module_1(x) * (x[:, 2].unsqueeze(-1) @ x[:, 2].unsqueeze(-2))
        return MultivariateNormal(mean_0, covar_0 + covar_1)
```

A slightly more efficient version would be:

```python
class GPRegressionModel(gpytorch.models.ExactGP):
    def __init__(self, train_x, train_y, likelihood):
        super(GPRegressionModel, self).__init__(train_x, train_y, likelihood)
        self.mean_module = ConstantMean()
        self.base_covar_module_0 = ScaleKernel(RBFKernel(ard_num_dims=2, active_dims=[0, 1]))
        self.covar_module_0 = InducingPointKernel(
            self.base_covar_module_0, inducing_points=train_x[:200, :], likelihood=likelihood
        )
        self.base_covar_module_1 = ScaleKernel(RBFKernel(ard_num_dims=2, active_dims=[0, 1]))
        self.covar_module_1 = InducingPointKernel(
            self.base_covar_module_1, inducing_points=train_x[:200, :], likelihood=likelihood
        )

    def forward(self, x):
        mean_0 = self.mean_module(x)
        covar_0 = self.covar_module_0(x)
        # RootLazyTensor(m) represents the outer product m @ m.T lazily,
        # avoiding materializing the dense n x n mask matrix
        covar_1 = self.covar_module_1(x) * gpytorch.lazy.RootLazyTensor(x[:, 2].unsqueeze(-1))
        return MultivariateNormal(mean_0, covar_0 + covar_1)
```
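To see why multiplying the second kernel by the mask's outer product implements the masking, here is a minimal NumPy sketch (not from the original thread; NumPy stands in for GPyTorch's lazy tensors, and the toy kernel values are illustrative):

```python
import numpy as np

# Toy RBF kernel matrix over four 1-D inputs
x = np.array([0.0, 1.0, 2.0, 3.0])
K1 = np.exp(-0.5 * (x[:, None] - x[None, :]) ** 2)

# Binary mask: the second GP is active only at the first two points
m = np.array([1.0, 1.0, 0.0, 0.0])

# Elementwise product with the mask's outer product, i.e. K1 * (m m^T)
K1_masked = K1 * np.outer(m, m)

# Rows and columns of masked-out points are exactly zero, so those points
# receive no covariance contribution from the second GP; active points keep
# their original covariance
assert np.allclose(K1_masked[2:], 0.0)
assert np.allclose(K1_masked[:, 2:], 0.0)
assert np.allclose(K1_masked[:2, :2], K1[:2, :2])
```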
-
I'm looking to model a 2D function as a sum of two Gaussian processes, where the second one is applied only to the active elements of a mask. I was hoping the following would work, where `train_x` is composed of xy coordinates in its first two columns and binary mask values in the third. In other words, some elements are modeled by the first GP alone and others by the sum of the two GPs. However, GPyTorch does not permit multiplying a `MultivariateNormal` by a `Tensor` in the last statement, so I'm clearly doing this wrong. How do I mask out the values as needed to accomplish this? Some sort of affine transformation elsewhere?
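For reference, the model described here is f(x) = f0(x) + m(x)·f1(x) with independent GPs f0 and f1, whose covariance is K0 + (m mᵀ) ⊙ K1. This is why masking belongs in the kernel rather than in the `MultivariateNormal`. A Monte Carlo sanity check of that identity, not part of the original post (toy kernels and mask are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4
x = np.linspace(0.0, 3.0, n)
K0 = np.exp(-0.5 * (x[:, None] - x[None, :]) ** 2)        # kernel of the first GP
K1 = 2.0 * np.exp(-2.0 * (x[:, None] - x[None, :]) ** 2)  # kernel of the second GP
m = np.array([1.0, 0.0, 1.0, 0.0])                        # binary mask column

# Draw many joint samples of f0 + m * f1 (jitter keeps Cholesky stable)
L0 = np.linalg.cholesky(K0 + 1e-9 * np.eye(n))
L1 = np.linalg.cholesky(K1 + 1e-9 * np.eye(n))
S = 200_000
f = L0 @ rng.standard_normal((n, S)) + m[:, None] * (L1 @ rng.standard_normal((n, S)))

# Empirical covariance of the sum matches K0 + (m m^T) * K1
emp_cov = np.cov(f)
expected = K0 + np.outer(m, m) * K1
assert np.allclose(emp_cov, expected, atol=0.05)
```

A masked point (m = 0) thus keeps exactly the first GP's variance, which is the behavior the question asks for.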