Discrete Probability Example (Try 3) #558
base: main
Conversation
We found a Contributor License Agreement for you (the sender of this pull request), but were unable to find agreements for all the commit author(s) or co-authors. If you authored these, maybe you used a different email address in the git commits than was used to sign the CLA (login here to double check)? If these were authored by someone else, then they will need to sign a CLA as well, and confirm that they're okay with these being contributed to Google. ℹ️ Googlers: Go here for more info.
I haven't been able to follow until the very end, but this looks super solid. We should start a reading group where you can explain how all this works! I left a few comments with suggestions, but I'm also happy to merge this without modifications.
coin : Coin => Float = [0.2, 0.8]
dice_1 : Dice => Float = for i. 1.0 / 6.0
dice_2 : Dice => Float = [0.5, 0.1, 0.1, 0.1, 0.1, 0.1]
nit: `dice_1` -> `fair_dice`, `dice_2` -> `unfair_dice`?
' Distributions are easy to create. Here are a couple of simple ones.
def normalize (x: m=>Float) : Dist m = AsDist for i. x.i / sum x
def uniform : Dist m = normalize for i. 1.0
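For instance, a fair coin and a flat die can be built directly from these (a small usage sketch, assuming `Coin` and `Dice` are the `Fin` index sets used in the diff above):

fair_coin : Dist Coin = normalize [1.0, 1.0]
flat_dice : Dist Dice = uniform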
nit: slightly faster to just make it `for i. (1.0 / size m)`
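For reference, that variant might look like the following (a sketch, untested; since `size` returns an `Int` in Dex, a conversion such as `IToF` is likely needed, and the primed name is just to avoid a clash):

def uniform' : Dist m = AsDist $ for i. 1.0 / IToF (size m)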
concat $ for i. select (x.i > 0.0) (AsList 1 [(i, x.i)]) mempty
instance [Show m] Show (Dist m)
  show = \a.
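The body of `show` is elided in this excerpt. Read together with the `concat` line above, a plausible shape is a helper that collects the nonzero support as (index, probability) pairs, along these lines (a sketch, not the PR's actual code):

def support (d: Dist m) : List (m & Float) =
  (AsDist x) = d
  -- Keep only the entries with positive mass.
  concat $ for i. select (x.i > 0.0) (AsList 1 [(i, x.i)]) mempty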
nit: the indentation seems really deep?
roll = \i. (i - 1)@Dice
None = Fin 1
nil = 0@None
You might be able to use `Unit` and `()` instead of `None` and `nil`, but that's up to you. I kind of like this too.
def Var (a:Type) : Type = a => Float
' These can either be observed or latent.
If a random variable is observed, then we use an indicator.
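Concretely, the indicator for an observed value is a table that is 1.0 at that value and 0.0 everywhere else; a minimal sketch (the helper name `observed` is assumed here, not taken from the PR):

def observed (x: a) : Var a =
  -- 1.0 exactly at the observed value, 0.0 elsewhere.
  for i. select (ordinal i == ordinal x) 1.0 0.0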
It was a little unclear to me initially that by "indicator" you meant a table that is 1 for exactly one event. It's a bit confusing that "indicator variable" is defined in terms of an "indicator". But it might just be me not having this nomenclature in my head.
' We also define a Markov version of our sample function.
Instead of summing out over the usage of its result,
it constructs a matrix instead of a vector.
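A sketch of the contrast, under assumed names (`unDist`, `sample`, and `sample_markov` are illustrative, not the PR's definitions): the exact `sample` multiplies in the continuation and sums the source index out, yielding a vector, while the Markov version keeps one row per source state, i.e. a transition matrix.

def unDist (d: Dist m) : m => Float =
  (AsDist x) = d
  x

-- Exact version: sums over the source index i, producing a vector.
def sample (d: Dist m) (k: m -> Dist n) : Dist n =
  AsDist $ for j. sum $ for i. (unDist d).i * (unDist (k i)).j

-- Markov version: keeps i free, producing a transition matrix.
def sample_markov (k: m -> Dist n) : m => n => Float =
  for i. unDist (k i)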
I can't follow this
Will make this clearer. This part was me trying to follow @duvenaud's particle filter and rewrite it as an associative Accum.
Oh bummer. I was hoping that this would be easier to read. Was it the indicator part where I lost you? I'll try to add some better explanations and use the more standard …
@srush would you like to rephrase this, or should we just fix the minor things and merge this?
I'll send an updated version soon. Been a bit busy the last couple of weeks.
There's no hurry from our side, take your time. I just came from vacation and I'm trying to make sure that we didn't drop the ball on any of your excellent PRs! Please ping us if you think that we forgot about any of them.
(This is somewhere between an example and a blog post, but it uses a ton of Dex features, so I thought I would submit it for comments.)
Implements an exact PPL for typed, discrete probability. This allows for concisely solving many elementary probability brain teasers in Dex.
Features:
- Embedded modeling language using "table continuations". A sketch of what a Bayes net looks like in this style follows the list below.
- Clean separation of the model from explicit observations.
- Differentiable posterior inference, i.e. using automatic differentiation for exact posterior calculation.
- A small set of combinators for discrete probability (expectations, printing, `arb`).
- Examples for a bunch of undergrad-level brain teasers (coin flips, Monty Hall, wet grass, HMM).
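As a rough illustration of the table-continuation style and the autodiff idea mentioned above (all names, probabilities, and signatures here are illustrative sketches, not the PR's actual code):

data Dist m = AsDist (m => Float)

def unDist (d: Dist m) : m => Float =
  (AsDist x) = d
  x

-- Exact `sample`: multiply in the continuation and sum the source out.
def sample (d: Dist m) (k: m -> Dist n) : Dist n =
  AsDist $ for j. sum $ for i. (unDist d).i * (unDist (k i)).j

-- A two-node Bayes net: rain, and a sprinkler that depends on rain.
rain : Dist (Fin 2) = AsDist [0.8, 0.2]

def sprinkler (r: Fin 2) : Dist (Fin 2) =
  cpt = [[0.6, 0.4], [0.99, 0.01]]
  AsDist cpt.r

-- Marginal over the sprinkler, with rain summed out exactly.
sprinkler_marginal : Dist (Fin 2) = sample rain \r. sprinkler r

-- The differentiable-inference idea in one line: if a model is an
-- expectation that is linear in an indicator table, `grad` at the
-- all-ones table reads the (unnormalized) marginals back out.
def marginals (model: (a => Float) -> Float) : a => Float =
  grad model (for i. 1.0)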