GSL integration bugfix (#300)
* Fixing a bug that caused incorrect O3 optimization when using Intel compilers on Intel CPUs.

Co-authored-by: Evert Rol <[email protected]>

* Better commenting and docstrings

Added comments and missing docstrings to the likelihood calculation and background marginalisation.

* Changelog and version number updated.

* Minor typo fixed

* Version date updated in the Changelog.

---------

Co-authored-by: Evert Rol <[email protected]>
thjsal and evertrol authored Apr 25, 2023
1 parent e5a1b2b commit aaae64e
Showing 5 changed files with 63 additions and 13 deletions.
20 changes: 16 additions & 4 deletions CHANGELOG.rst
@@ -34,16 +34,28 @@ and this project adheres to
.. ^^^^^^^^^^^
-[pre-v2.0.1] - 2023-03-06
-~~~~~~~~~~~~~~~~~~~~~~~~~
+[v2.0.1] - 2023-04-25
+~~~~~~~~~~~~~~~~~~~~~

Summary
^^^^^^^

* Numerical problems in the likelihood computation were fixed for cases with zero counts. These fixes also prevent the code from being incorrectly optimized on Intel CPUs when using Intel compilers, which had produced incorrect GSL integration results and likelihoods. For the tested cases, the effect of these fixes appears undetectable on systems where the optimization was already working correctly. In addition, a likelihood check was added to the continuous integration tests.
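The marginalisation underlying this fix integrates the likelihood over the background rate after subtracting a maximum log-likelihood estimate, so the exponential stays within floating-point range. A minimal Python sketch of that max-subtraction trick, under stated assumptions: the function name and the trapezoidal quadrature are illustrative only, not the GSL CQUAD routine X-PSI actually uses.

```python
import math

def log_marginal_likelihood(loglike, grid):
    """Integrate exp(loglike(B)) over a uniform grid of background values B
    and return the log of the integral.  The maximum log-likelihood A is
    subtracted inside the exponential to avoid overflow/underflow, then
    added back at the end."""
    logs = [loglike(B) for B in grid]
    A = max(logs)  # analogue of the "A" estimate subtracted in the integrand
    vals = [math.exp(l - A) for l in logs]
    h = (grid[-1] - grid[0]) / (len(grid) - 1)  # uniform grid spacing
    # trapezoidal rule
    integral = h * (0.5 * vals[0] + sum(vals[1:-1]) + 0.5 * vals[-1])
    return A + math.log(integral)
```

For a standard-normal log-integrand, ``loglike(B) = -B**2 / 2``, the result approaches ``log(sqrt(2 * pi))`` as the grid is refined.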

Fixed
^^^^^

* Treatment of the special cases in the likelihood computation in ``xpsi/likelihoods/default_background_marginalisation.pyx`` was changed so that the logarithm of zero is no longer taken. Previously, that could happen if the modelled counts were zero but the observed counts were not. In addition, when both are zero, we now add 0 (i.e., log(1)) to the log-likelihood, instead of the 1 added before. (T.S., E.R., M.H.)
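For illustration, the per-bin rule this entry describes can be sketched in plain Python (the helper name is hypothetical; the actual change is in the Cython file ``xpsi/likelihoods/default_background_marginalisation.pyx``):

```python
import math

def poisson_logterm(c, k):
    """Per-bin log-likelihood contribution for k observed counts given
    c expected counts (dropping the model-independent term -log(k!)).
    Mirrors the special-case rules described above:
      c > 0           -> k*log(c) - c
      c == 0, k == 0  -> 0.0   (i.e. log(1), not the 1.0 added previously)
      c == 0, k > 0   -> -inf  (log(0): the model cannot yield the data)
    """
    if c > 0.0:
        return k * math.log(c) - c
    elif c == 0.0 and k == 0.0:
        return 0.0
    else:
        return float('-inf')
```

A bin with zero modelled and zero observed counts now contributes nothing to the log-likelihood, while a bin with zero modelled but nonzero observed counts drives the likelihood to zero.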

Added
^^^^^
-* Continuous integration test for checking the likelihood
+* Continuous integration test for checking the likelihood (T.S.)

Attribution
^^^^^^^^^^^
-Tuomo Salmi
+Tuomo Salmi,
+Evert Rol,
+Martin Heemskerk


[v2.0.0] - 2023-02-16
2 changes: 1 addition & 1 deletion docs/source/conf.py
@@ -80,7 +80,7 @@
# The short X.Y version.
version = '2.0'
# The full version, including alpha/beta/rc tags.
-release = '2.0.0'
+release = '2.0.1'

# The language for content autogenerated by Sphinx. Refer to documentation
# for a list of supported languages.
2 changes: 1 addition & 1 deletion setup.py
@@ -289,7 +289,7 @@ def EXTENSION(modname):

setup(
name = 'xpsi',
-version = '2.0.0',
+version = '2.0.1',
author = 'The X-PSI Core Team',
author_email = '[email protected]',
url = 'https://github.com/xpsi-group/xpsi',
2 changes: 1 addition & 1 deletion xpsi/__init__.py
@@ -4,7 +4,7 @@
"""

-__version__ = "2.0.0"
+__version__ = "2.0.1"
__author__ = "The X-PSI Core Team"

try:
50 changes: 44 additions & 6 deletions xpsi/likelihoods/default_background_marginalisation.pyx
@@ -78,17 +78,48 @@ ctypedef struct args:
double A

cdef double marginal_integrand(double B, void *params) nogil:
""" Compute the integrand for background marginalisation (for a given energy channel).

This function corresponds to the integrand in the second-to-last expression
of Equation B.19 in Riley (2019). See Appendix B:
https://hdl.handle.net/11245.1/aa86fcf3-2437-4bc2-810e-cf9f30a98f7a

:param double B:
    Background count rate, which is the integration variable.

:param void *params:
    Additional parameters, which include:
    "n": Number of phase bins.
    "SCALE": Scaling from modelled count rate to number of counts.
    "star": Modelled count rate binned in phases.
    "data": Observed number of counts binned in phases.
    "A": Estimated maximum log-likelihood within the integration limits.

:returns:
    A `double` with the probability value for a given background variable.
    This value is the exponential of the Poisson log-probabilities summed
    over the phase bins, minus the maximum log-likelihood estimate.
    If the modelled counts are zero in any phase bin, a probability of zero
    is returned, unless the corresponding number of observed counts is also
    zero. In the latter case, the log-probability of that phase bin is
    assessed to be 0.0, and the sum over the log-probabilities of the
    remaining phase bins continues.
"""

cdef size_t j
cdef double c, x = 0.0
cdef args *a = <args*> params

for j in range(a.n):
c = a.SCALE * (a.star[j] + B)
-        if c == 0.0 and a.data[j] == 0.0:
-            x += 1.0
-        else:
+        if c > 0.0:
             x += a.data[j] * log(c) - c
+        elif c == 0.0 and a.data[j] == 0.0:
+            #x += log(1) = 0
+            pass
+        else:
+            #x += log(0) -> return exp(-inf) = 0
+            return 0.0

#with gil:
# if x - a.A > 0.0:
@@ -152,6 +183,9 @@ def eval_marginal_likelihood(double exposure_time,
:param double[::1] phases:
A :class:`numpy.ndarray` of phase interval edges in cycles.
:param double[:,::1] counts:
A :class:`numpy.ndarray` of observed number of counts in energy and phase bins.
:param tuple components:
Component signals, each a C-contiguous :class:`numpy.ndarray` of
signal count rates where phase increases with column number.
@@ -495,10 +529,14 @@ def eval_marginal_likelihood(double exposure_time,
a.A = 0.0
for j in range(a.n):
c = a.SCALE * (a.star[j] + B_for_integrand)
-            if c == 0.0 and a.data[j] == 0.0:
-                a.A += 1.0
-            else:
+            if c > 0.0:
                 a.A += a.data[j] * log(c) - c
+            elif c == 0.0 and a.data[j] == 0.0:
+                #a.A += log(1)
+                pass
+            else:
+                #a.A += log(0)
+                a.A += llzero

gsl_integration_cquad(&f, lower, upper,
epsabs, epsrel,
