Batched independent 1D variational GPs as NN activation #2383
Unanswered · schneider-source asked this question in Q&A · Replies: 1 comment
Extending from the batch regression example, I would like to implement a 1D -> 1D variational GP activation function for later use in neural networks, so that for each node in a neural-network layer an independent variational GP activation can be learned using VI. How can this be implemented correctly? My minimal (executable) example does not work as expected - thanks for any help!
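(The asker's code block and its output were not preserved in this capture. As a stand-in, here is a minimal sketch, assuming GPyTorch's `batch_shape` mechanism from the batch regression and SVGP examples, of one way to define a batch of independent 1D -> 1D variational GPs. `BatchedGPActivation`, `num_nodes`, and `num_inducing` are illustrative names, not from the original post.)

```python
import torch
import gpytorch


class BatchedGPActivation(gpytorch.models.ApproximateGP):
    """A sketch: a batch of num_nodes independent 1D -> 1D variational GPs."""

    def __init__(self, num_nodes, num_inducing=16):
        batch_shape = torch.Size([num_nodes])
        # One set of inducing points per node: shape (num_nodes, num_inducing, 1)
        inducing_points = (
            torch.linspace(-3, 3, num_inducing).repeat(num_nodes, 1).unsqueeze(-1)
        )
        variational_distribution = gpytorch.variational.CholeskyVariationalDistribution(
            num_inducing, batch_shape=batch_shape
        )
        variational_strategy = gpytorch.variational.VariationalStrategy(
            self, inducing_points, variational_distribution,
            learn_inducing_locations=True,
        )
        super().__init__(variational_strategy)
        # Batched mean and kernel -> independent hyperparameters per node
        self.mean_module = gpytorch.means.ConstantMean(batch_shape=batch_shape)
        self.covar_module = gpytorch.kernels.ScaleKernel(
            gpytorch.kernels.RBFKernel(batch_shape=batch_shape),
            batch_shape=batch_shape,
        )

    def forward(self, x):
        mean_x = self.mean_module(x)
        covar_x = self.covar_module(x)
        return gpytorch.distributions.MultivariateNormal(mean_x, covar_x)


# Evaluating at n points per node: x of shape (num_nodes, n, 1) yields a
# MultivariateNormal with batch_shape (num_nodes,) and event_shape (n,).
model = BatchedGPActivation(num_nodes=4)
out = model(torch.randn(4, 10, 1))
```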
-

Don't use a deep GP - just use an ApproximateGP. Your model should look something like what we have in the SVI regression example: https://docs.gpytorch.ai/en/stable/examples/04_Variational_and_Approximate_GPs/SVGP_Regression_CUDA.html If you want to learn multiple independent activation functions, see this example: https://docs.gpytorch.ai/en/stable/examples/04_Variational_and_Approximate_GPs/SVGP_Multitask_GP_Regression.html
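Following the two linked examples, here is a sketch (mine, not from the thread) of the pattern the multitask SVGP example uses: wrap a batched `VariationalStrategy` in `IndependentMultitaskVariationalStrategy` so each output is an independent GP, then train with the standard variational ELBO. Class and variable names are illustrative.

```python
import torch
import gpytorch


class IndependentMultitaskSVGP(gpytorch.models.ApproximateGP):
    def __init__(self, num_tasks, num_inducing=16):
        batch_shape = torch.Size([num_tasks])
        inducing_points = torch.rand(num_tasks, num_inducing, 1)
        variational_distribution = gpytorch.variational.CholeskyVariationalDistribution(
            num_inducing, batch_shape=batch_shape
        )
        # Independent outputs: one GP per task, stacked into a multitask output
        variational_strategy = gpytorch.variational.IndependentMultitaskVariationalStrategy(
            gpytorch.variational.VariationalStrategy(
                self, inducing_points, variational_distribution,
                learn_inducing_locations=True,
            ),
            num_tasks=num_tasks,
        )
        super().__init__(variational_strategy)
        self.mean_module = gpytorch.means.ConstantMean(batch_shape=batch_shape)
        self.covar_module = gpytorch.kernels.ScaleKernel(
            gpytorch.kernels.RBFKernel(batch_shape=batch_shape),
            batch_shape=batch_shape,
        )

    def forward(self, x):
        return gpytorch.distributions.MultivariateNormal(
            self.mean_module(x), self.covar_module(x)
        )


# Toy training loop, in the style of the SVGP regression example.
num_tasks, n = 4, 100
train_x = torch.linspace(-3, 3, n).unsqueeze(-1)
train_y = torch.stack(
    [torch.sin(k * train_x.squeeze()) for k in range(1, num_tasks + 1)], dim=-1
)

model = IndependentMultitaskSVGP(num_tasks)
likelihood = gpytorch.likelihoods.MultitaskGaussianLikelihood(num_tasks=num_tasks)
mll = gpytorch.mlls.VariationalELBO(likelihood, model, num_data=n)
optimizer = torch.optim.Adam(
    [*model.parameters(), *likelihood.parameters()], lr=0.05
)

model.train()
likelihood.train()
for _ in range(200):
    optimizer.zero_grad()
    output = model(train_x)  # MultitaskMultivariateNormal over num_tasks outputs
    loss = -mll(output, train_y)
    loss.backward()
    optimizer.step()
```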