
Request for changes on the differential evolution convergence criteria in bilby/gw/likelihood/relative.py #921

Open
Vincent-Juste opened this issue Feb 24, 2025 · 2 comments
Labels
enhancement New feature or request likelihood

Comments

@Vincent-Juste

Hi, I would like to suggest two code changes in the relative binning likelihood code, both related to the differential evolution step and its convergence:

  • The current convergence criterion is `if new_fiducial_ln_likelihood - old_fiducial_ln_likelihood < 0.1:`. I've encountered several cases where, for whatever reason, the new_fiducial_ln_likelihood is smaller than the old_fiducial_ln_likelihood, making the difference negative and stopping the search prematurely. I believe taking the absolute value of this difference would be a better choice.
  • The threshold in this criterion is hard-coded as 0.1. Would it be possible to make it a tunable parameter passed through the extra-likelihood-kwargs key in bilby_pipe, to give more control over this specific step?
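To make the suggestion concrete, here is a minimal sketch of what the proposed check could look like. This is not the actual bilby implementation; `optimization_converged` and `maximization_epsilon` are hypothetical names for the helper and the tunable threshold suggested above.

```python
def optimization_converged(new_fiducial_ln_likelihood,
                           old_fiducial_ln_likelihood,
                           maximization_epsilon=0.1):
    """Return True when successive fiducial log-likelihoods agree to
    within the threshold.

    Using the absolute difference means a (spurious) decrease in the
    log-likelihood between iterations does not trigger convergence,
    unlike the current signed comparison.
    """
    return abs(new_fiducial_ln_likelihood - old_fiducial_ln_likelihood) \
        < maximization_epsilon
```

With the signed version, `new - old = -0.5` would count as converged; with the absolute value, a change of 0.5 in either direction keeps the optimization running.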

Thanks !

@ColmTalbot
Collaborator

Hi @Vincent-Juste, I'd be happy for these to be more flexible. If you have proposals for new default settings, I'd like to see some systematic tests looking at, for example, how long the optimization takes and how close it gets to the true maximum on injections.

@ColmTalbot ColmTalbot added enhancement New feature or request likelihood labels Feb 24, 2025
@Vincent-Juste
Author

Thanks @ColmTalbot ! I'm investigating PE on SSM injections and I've played a bit with the parameters of the likelihood, namely maxiter, popsize, tol, and init, but so far none of my changes has had a significant impact on the final PE results (although I haven't closely compared the log-likelihood at the end of the optimization).
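For reference, maxiter, popsize, tol, and init are standard keyword arguments of `scipy.optimize.differential_evolution`, which is what this tuning ultimately reaches. A standalone toy example of the knobs in question (the quadratic "negative log-likelihood" below is just a stand-in, not a GW likelihood):

```python
from scipy.optimize import differential_evolution


def negative_ln_likelihood(x):
    # Simple unimodal surrogate with its maximum likelihood at x = (1, -2);
    # differential_evolution minimizes, so we pass the negative log-likelihood.
    return (x[0] - 1.0) ** 2 + (x[1] + 2.0) ** 2


result = differential_evolution(
    negative_ln_likelihood,
    bounds=[(-5, 5), (-5, 5)],
    maxiter=200,            # cap on the number of generations
    popsize=20,             # population-size multiplier per parameter
    tol=0.01,               # relative tolerance on the population energies
    init="latinhypercube",  # strategy for the initial population
    seed=42,                # fixed seed for reproducibility
)
print(result.x, result.fun)
```

`result.x` should land close to (1, -2) with `result.fun` near zero; tightening tol or raising popsize trades runtime for a closer approach to the true maximum, which is the kind of systematic comparison suggested above.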

In terms of running speed it is a matter of a few minutes, and it usually stops after a handful of iterations (~2 to 5). As for how close it gets to the injected parameters, I am unsure, as I haven't been able to find that information in the logs. Since it's so fast, I was thinking it could be worth spending a bit more time on that step to perhaps ease the burden on the subsequent sampling, especially as PE failed on a few injections even though their SNRs were rather high.

You can find a bunch of injections on which I ran at CIT under /home/vincent.juste/timeshifted_analysis/SSM_PE/injections_on_gaussian_noise/test_100inj and their summary pages on my CIT public webpages under .../SSM_PE/test_100inj/
