git clone git://git.jtobin.io/deanie.git

# deanie

[![MIT License](https://img.shields.io/badge/license-MIT-blue.svg)](https://github.com/jtobin/deanie/blob/master/LICENSE)

*deanie* is an embedded probabilistic programming language.  It can be used to
denote, sample from, and perform inference on probabilistic programs.

## Usage

Programs are written in a straightforward monadic style:

``` haskell
mixture :: Double -> Double -> Program Double
mixture a b = do
  p      <- beta a b
  accept <- bernoulli p
  if   accept
  then gaussian (negate 2) 0.5
  else gaussian 2 0.5
```

You can sample from a program by first converting it into an *RVar* from
[random-fu][rafu]:

```
> sample (rvar (mixture 1 3))
```

Sample many times from a model using standard monadic combinators like
`replicateM`:

```
> replicateM 1000 (sample (rvar (mixture 1 3)))
```

![](assets/mixture.png)

Or convert them to measures using a built-in interpreter:

```
> let nu = measure (mixture 1 3)
> let f = cdf nu
```
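
Since `f` is then just an ordinary Haskell function, you can query the CDF
pointwise in GHCi.  A sketch (the exact values returned will depend on the
measure in question):

```
> f 0
> fmap f [negate 5, 0, 5]
```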

![](assets/mixture_cdf.png)

You can perform inference on models using rejection or importance sampling, or
a simple, stateful Metropolis backend.  Here's a beta-Bernoulli model, plus
some observations to condition on:

``` haskell
betaBernoulli :: Double -> Double -> Program Bool
betaBernoulli a b = do
  p <- beta a b
  bernoulli p

observations :: [Bool]
observations = [True, True, False, True, False, False, True, True, True]
```

Here's one way to encode a posterior via rejection sampling:

``` haskell
rposterior :: Double -> Double -> Program Double
rposterior a b =
    grejection
      (\xs ys -> count xs == count ys)
      observations (beta a b) bernoulli
  where
    count = length . filter id
```
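
Sampling from this posterior works just like sampling from any other program.
For instance, following the same pattern as above:

```
> replicateM 1000 (sample (rvar (rposterior 1 1)))
```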

![](assets/bb_rejection.png)

Here's another, via importance sampling:

``` haskell
iposterior :: Double -> Double -> Program (Double, Double)
iposterior a b =
  importance observations (beta a b) logDensityBernoulli
```

There are also some Monte Carlo convenience functions provided, such as a
weighted average for weighted samples returned via importance sampling:

```
> samples <- replicateM 1000 (sample (rvar (iposterior 1 1)))
> print (mcw samples)
0.6369246537796793
```
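
For intuition, a weighted average of this kind can be written in a few lines
of plain Haskell.  This is only a sketch of the idea, not deanie's actual
`mcw` implementation — in particular, it assumes each sample is paired with a
plain (unnormalised) weight, whereas an importance sampler may return
log-weights that need exponentiating first:

``` haskell
-- Sketch of a normalised weighted average over (value, weight) pairs.
-- Hypothetical helper for illustration; deanie's 'mcw' may differ in detail.
weightedAverage :: [(Double, Double)] -> Double
weightedAverage samples = total / mass
  where
    total = sum (fmap (\(x, w) -> x * w) samples)
    mass  = sum (fmap snd samples)
```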

## Background

You can read about some of the theory and ideas behind this kind of language in
some blog posts I've written:

* [Encoding Statistical Independence, Statically][enco]
* [A Simple Embedded Probabilistic Programming Language][sppl]
* [Comonadic MCMC][como]
* [Foundations of the Giry Monad][gifo]
* [Implementing the Giry Monad][gimp]
* [The Applicative Structure of the Giry Monad][giap]

[giap]: https://jtobin.io/giry-monad-applicative
[gimp]: https://jtobin.io/giry-monad-implementation
[gifo]: https://jtobin.io/giry-monad-foundations
[enco]: https://jtobin.io/encoding-independence-statically
[sppl]: https://jtobin.io/simple-probabilistic-programming
[como]: https://jtobin.io/comonadic-mcmc
[rafu]: https://hackage.haskell.org/package/random-fu