Variance seems to be off by 1 #2499
ToennisStef asked this question in Q&A
Answered by Balandat on Mar 24, 2024
Replies: 1 comment, 3 replies
Answer selected by jacobrgardner
The issue is the very last line here: what it does is add the observation noise variance to the posterior. So you're computing the posterior predictive over `y`, not the posterior over `f`. `model(test_x)` will return the posterior MVN that you're looking for. Calling `likelihood()` on that will add the noise variance to the diagonal, which is why you're getting the offset. If you want the posterior over `f`, just use `model(test_x)`.
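To make the distinction concrete, here is a minimal GPyTorch sketch (a toy, untrained model for illustration, not the code from the original question; the `ToyGP` class and the data are assumptions):

```python
import math

import torch
import gpytorch


class ToyGP(gpytorch.models.ExactGP):
    """Minimal exact GP; hyperparameter training is omitted for brevity."""

    def __init__(self, train_x, train_y, likelihood):
        super().__init__(train_x, train_y, likelihood)
        self.mean_module = gpytorch.means.ConstantMean()
        self.covar_module = gpytorch.kernels.ScaleKernel(gpytorch.kernels.RBFKernel())

    def forward(self, x):
        return gpytorch.distributions.MultivariateNormal(
            self.mean_module(x), self.covar_module(x)
        )


train_x = torch.linspace(0, 1, 20)
train_y = torch.sin(2 * math.pi * train_x) + 0.1 * torch.randn(20)

likelihood = gpytorch.likelihoods.GaussianLikelihood()
model = ToyGP(train_x, train_y, likelihood)

model.eval()
likelihood.eval()

test_x = torch.linspace(0, 1, 5)
with torch.no_grad(), gpytorch.settings.fast_pred_var():
    f_post = model(test_x)              # posterior over the latent function f
    y_post = likelihood(model(test_x))  # posterior predictive over observations y

# The predictive variance is the latent variance plus the likelihood's
# observation noise, which is the constant offset described above.
print(f_post.variance)
print(y_post.variance)
print(y_post.variance - f_post.variance)  # equals likelihood.noise at each point
```

Here `y_post.variance` exceeds `f_post.variance` by exactly the likelihood's noise term at every test point, which is the offset the answer describes; only `model(test_x)` gives the posterior over `f` itself.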