[WIP] RNG test fixes and improvements #513
Conversation
I'll take a look at the failing test
… (0.0, 1.0] to [0.0, 1.0)
Summary of latest 3 commits:
I went over all distributions again (not just the ones you changed). Things I noticed:
- For Bernoulli / ScaledBernoulli, I'm almost sure the calculation is simply wrong: the probability parameter usually represents the chance of success, but with `res > params.prob` this would mean that for a probability of 40% we get success 60% of the time. So I guess here probability represents non-success. But if `params.prob` is 0, we can still get non-success if `res == 0`. So I'm almost certain that it should actually be `res < params.prob`: for `params.prob == 1` we always get success, for `params.prob == 0` we always get non-success, and everything in between should work fine.
- In Laplace, there is something similar to the above, but it is correct in the code. I would still add a comment that since `res` is in `[1, 2^24 - 1] / 2^24`, we will have `2^23` numbers in the first branch and `2^23 - 1` numbers in the second one. The special value `2^23 / 2^24 == oneHalf` would give the same result anyway in both branches, so it doesn't matter where it goes. This is probably not obvious at first sight for someone going over this code.

Apart from that, everything else looks good to me.
Changes LGTM from my side
LGTM
@gpucibot merge
@MatthiasKohl @teju85 Please take a look.