I thank reviewer 3 for raising this issue (rephrased by AHL):
The normalised curvature matrix [used to find the optimal value of alpha] is constructed numerically from chi2 using a forward-difference scheme (numerical differentiation). Working out the second derivative of chi2 by numerical differentiation can be sensitive to various numerical instabilities.
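For reference, a minimal sketch of the kind of forward-difference construction being described, and why it is step-size sensitive. The `chi2` callable, parameter vector `a`, and step `h` here are placeholders, not the package's actual implementation:

```python
import numpy as np

def curvature_element(chi2, a, i, j, h=1e-5):
    """Forward-difference estimate of d^2(chi2)/(da_i da_j)."""
    ei = np.zeros_like(a, dtype=float)
    ej = np.zeros_like(a, dtype=float)
    ei[i] = h
    ej[j] = h
    # Truncation error shrinks as h -> 0, but round-off error in the
    # differences grows like eps / h**2, so the estimate depends strongly
    # on the choice of h -- the instability referred to above.
    return (chi2(a + ei + ej) - chi2(a + ei) - chi2(a + ej) + chi2(a)) / h**2
```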
An addition to the above issue, also from reviewer 3:
I would then suggest writing a function for the log-prior and a function for the log-likelihood (i.e. chi2), then combining them to make a log-posterior function (a minimal sketch follows the list below). If you write them in the correct manner you can use these straightforwardly for automatic differentiation. This has several knock-on effects:
a. You can work out these curvature matrices easily.
b. It's then straightforward to use most Bayesian analysis packages (pymc/emcee/dynesty).
c. Automatic differentiation is very powerful when performing MCMC sampling using e.g. pymc, whose MCMC sampler NUTS derives posterior probability distributions; this is greatly sped up if you have an easy way to calculate gradients/Hessians.
I would strongly suggest that you look into automatic differentiation (e.g. jax,
https://jax.readthedocs.io/en/latest/jax-101/04-advanced-autodiff.html,
https://jax.readthedocs.io/en/latest/notebooks/autodiff_cookbook.html).
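As a hedged sketch of what that might look like for the curvature matrix itself (the `chi2` below is a toy placeholder, not the package's function), jax returns the Hessian to machine precision with no finite-difference step size to tune:

```python
import jax
import jax.numpy as jnp

def chi2(params):
    # Toy placeholder objective standing in for the package's chi2(alpha, ...).
    return jnp.sum((params - jnp.array([1.0, 2.0])) ** 2)

params = jnp.array([0.5, 1.5])
curvature = jax.hessian(chi2)(params)   # exact 2x2 Hessian, no step-size tuning
gradient = jax.grad(chi2)(params)       # gradients for e.g. NUTS come the same way
```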