
Add sampler to iteratively refine an initial coarse posterior #4341

Merged
5 commits merged into gwastro:master on Aug 9, 2023

Conversation

@ahnitz (Member) commented Apr 26, 2023

No description provided.

@ahnitz (Member, Author) commented May 3, 2023

This adds a simple sampler that builds an internal KDE from samples and evolves it iteratively. There is no MCMC involved, just weighted Monte Carlo draws. This is useful for finalizing a posterior that has already identified the key modes (or where the mode structure is known ahead of time) but has not fully converged. It parallelizes trivially. The goal is also to be able to slightly change prior / model choices in a way that will converge very quickly on the right answer.

It includes checks on the convergence of the internal KDE representation by monitoring the JS divergence between iterations and the value of the evidence.

One failure mode (you'll get few actually independent samples back) occurs when the shape of the posterior is badly smoothed by the default KDE. I am exploring some alternative KDEs that may do better there; I think the awkde package used by @tdent may work well.
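The loop described above (fit a KDE to the current samples, draw weighted Monte Carlo proposals from it, resample, and compare successive KDE representations with the JS divergence) can be sketched as follows. This is a toy illustration against a 1-D Gaussian target, not the PR's actual implementation; all function names here are made up for the example:

```python
import numpy as np
from scipy.stats import gaussian_kde


def log_posterior(x):
    """Toy target: unnormalized log density of a standard normal."""
    return -0.5 * np.sum(x ** 2, axis=0)


def js_divergence(p, q):
    """Jensen-Shannon divergence between two discretized densities."""
    p = p / p.sum()
    q = q / q.sum()
    m = 0.5 * (p + q)

    def kl(a, b):
        mask = a > 0
        return np.sum(a[mask] * np.log(a[mask] / b[mask]))

    return 0.5 * kl(p, m) + 0.5 * kl(q, m)


def refine(samples, niter=20, ndraw=2000, tol=1e-3, seed=0):
    """Iteratively refine samples via KDE proposals and importance weights."""
    rng = np.random.default_rng(seed)
    grid = np.linspace(-6.0, 6.0, 256)[None, :]  # evaluation grid for JS check
    prev = None
    for _ in range(niter):
        kde = gaussian_kde(samples)            # internal KDE of current samples
        draws = kde.resample(ndraw, seed=rng)  # proposals from the KDE
        # Importance weights: target density over proposal (KDE) density
        logwt = log_posterior(draws) - kde.logpdf(draws)
        wt = np.exp(logwt - logwt.max())
        idx = rng.choice(ndraw, size=ndraw, p=wt / wt.sum())
        samples = draws[:, idx]                # weighted Monte Carlo resample
        cur = kde(grid)
        if prev is not None and js_divergence(prev, cur) < tol:
            break                              # successive KDEs agree
        prev = cur
    return samples
```

Starting from a deliberately biased set of draws, a few iterations pull the samples onto the target, and the JS check stops the loop once successive KDE representations stop changing.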

@spxiwh (Contributor) left a comment

No comment on the code. This is a new model and doesn't touch anything else, so it's hard to say anything meaningful without having tried it out ... It definitely won't break anything else!

It would be good to have some documentation of this. Collin has set a high standard for inference docs, so I think he would ask for the things I suggest here. I'm pressing approve and leaving this to your conscience, however!

models._global_instance.loglikelihood)


def resample_equal(samples, logwt, seed=0):
Contributor review comment:

Please add a docstring for this function; I think it's important enough that we should know what it does. (Sorry these comments come late. I wrote them while GitHub was misbehaving, and it looks like they got swallowed!)
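Since the body of `resample_equal` isn't shown in the diff excerpt, here is a guess at what a documented version could look like: standard multinomial resampling from log weights into an equal-weight sample set. This is a sketch of the idea, not the PR's actual code:

```python
import numpy as np


def resample_equal(samples, logwt, seed=0):
    """Convert weighted samples into an equal-weight set.

    Draws ``len(logwt)`` indices with probability proportional to
    ``exp(logwt)`` and returns the corresponding samples, so downstream
    code can treat the result as unweighted posterior draws.

    Parameters
    ----------
    samples : numpy.ndarray
        Array of samples with shape ``(nsamples, ...)``.
    logwt : numpy.ndarray
        Log weight of each sample, shape ``(nsamples,)``.
    seed : int
        Seed for the random number generator.

    Returns
    -------
    numpy.ndarray
        Resampled, equally weighted samples of the same shape.
    """
    rng = np.random.default_rng(seed)
    # Normalize in log space first to avoid overflow/underflow
    wt = np.exp(logwt - logwt.max())
    wt /= wt.sum()
    idx = rng.choice(len(logwt), size=len(logwt), p=wt)
    return samples[idx]
```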


class RefineSampler(DummySampler):
"""Sampler for kde drawn refinement of existing posterior estimate

Contributor review comment:

Can we provide an example of how this might work in the code/documentation? I assume we generate a normal output, and then run this. Is it valid for both nested sampling and MCMC?
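One shape such documentation might take is a configuration sketch for a second `pycbc_inference` run. Everything below is an assumption to be checked against the merged module, including the sampler name `refine`; it is an illustrative fragment, not the confirmed interface:

```ini
; HYPOTHETICAL sketch only: section and option names are assumptions.
; Step 1: run pycbc_inference normally (e.g. a nested sampler or MCMC)
;         to produce a coarse posterior file.
; Step 2: rerun pycbc_inference with the refinement sampler, seeding it
;         from that earlier result.
[sampler]
name = refine
```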

@spxiwh (Contributor) commented May 15, 2023

Definitely some of the CodeClimate comments should be addressed. I do think calling things like models._global_instance feels ugly.

@ahnitz ahnitz merged commit fbc30dd into gwastro:master Aug 9, 2023
36 checks passed
PRAVEEN-mnl pushed a commit to PRAVEEN-mnl/pycbc that referenced this pull request Nov 3, 2023
…4341)

* add kde refinement sampler

* cc

* more logging

* cc

* cc
bhooshan-gadre pushed a commit to bhooshan-gadre/pycbc that referenced this pull request Mar 4, 2024
acorreia61201 pushed a commit to acorreia61201/pycbc that referenced this pull request Apr 4, 2024

3 participants