DIY Markov Chains

commit f32cdeb070f72279b689d76ff3a842a0e0e0bdb7
parent 18eef87251dd0a4ca2f40c76cc0c06403b063f2c
Author: Jared Tobin <>
Date:   Tue, 13 Oct 2015 23:18:20 +1300

Docs update.

Diffstat:
  M  | 8 ++++----
  M declarative.cabal | 17 ++++++++++++-----
2 files changed, 16 insertions(+), 9 deletions(-)

diff --git a/ b/
@@ -62,11 +62,11 @@
 Installing is best done via
 everything you might need (including GHC). You'll want to use the [Stackage nightly
-resolver]( until the next LTS version picks
-up these libraries.
-
-With that out of the way it's just a matter of
+resolver]( for now, until the next LTS version
+is released. But with that out of the way it's just a matter of
 
 ```
 $ stack install declarative
 ```
+
+See the test suite for some example usage.
diff --git a/declarative.cabal b/declarative.cabal
@@ -1,5 +1,5 @@
 name:                declarative
-version:
+version:             0.1.1
 synopsis:            DIY Markov Chains.
 homepage:
 license:             MIT
@@ -10,12 +10,19 @@
 category:            Math
 build-type:          Simple
 cabal-version:       >=1.10
 description:
-  DIY Markov Chains.
+  This package presents a simple combinator language for Markov transition
+  operators that are useful in MCMC.
   .
-  Build composite Markov transition operators from existing ones for fun and
-  profit.
+  Any transition operators sharing the same stationary distribution and obeying
+  the Markov and reversibility properties can be combined in a couple of ways,
+  such that the resulting operator preserves the stationary distribution and
+  desirable properties amenable for MCMC.
   .
-  A useful strategy is to hedge one's sampling risk by occasionally
+  We can deterministically concatenate operators end-to-end, or sample from
+  a collection of them according to some probability distribution. See
+  < Geyer, 2005> for details.
+  .
+  A useful strategy is to hedge one's 'sampling risk' by occasionally
   interleaving a computationally-expensive transition (such as a gradient-based
   algorithm like Hamiltonian Monte Carlo or NUTS) with cheap Metropolis transitions.
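The new package description claims that transition operators sharing a stationary distribution can be concatenated end-to-end or mixed probabilistically without losing that stationary distribution. As a language-agnostic numerical sketch of that claim (this is not the declarative library's API; the target distribution, proposal matrices, and all function names below are made up for illustration), here are two discrete Metropolis kernels with the same stationary distribution, combined both ways:

```python
def metropolis_kernel(pi, proposal):
    """Metropolis transition matrix for target pi and a symmetric proposal."""
    n = len(pi)
    P = [[0.0] * n for _ in range(n)]
    for i in range(n):
        for j in range(n):
            if i != j:
                P[i][j] = proposal[i][j] * min(1.0, pi[j] / pi[i])
        P[i][i] = 1.0 - sum(P[i])  # leftover mass: rejected proposals stay put
    return P

def apply_kernel(pi, P):
    """One step of the chain: row vector pi times transition matrix P."""
    n = len(pi)
    return [sum(pi[i] * P[i][j] for i in range(n)) for j in range(n)]

def concat(P1, P2):
    """Deterministic concatenation: run P1, then P2 (matrix product)."""
    n = len(P1)
    return [[sum(P1[i][k] * P2[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def mixture(w, P1, P2):
    """Probabilistic choice: with probability w apply P1, otherwise P2."""
    n = len(P1)
    return [[w * P1[i][j] + (1 - w) * P2[i][j] for j in range(n)]
            for i in range(n)]

pi = [0.2, 0.3, 0.5]           # illustrative target distribution
walk = [[0.0, 0.5, 0.0],       # symmetric nearest-neighbour proposal
        [0.5, 0.0, 0.5],
        [0.0, 0.5, 0.0]]
uniform = [[0.0, 0.5, 0.5],    # symmetric "jump anywhere" proposal
           [0.5, 0.0, 0.5],
           [0.5, 0.5, 0.0]]

P1 = metropolis_kernel(pi, walk)
P2 = metropolis_kernel(pi, uniform)

# pi is invariant under each kernel, their concatenation, and their mixture.
for P in (P1, P2, concat(P1, P2), mixture(0.3, P1, P2)):
    out = apply_kernel(pi, P)
    assert all(abs(a - b) < 1e-12 for a, b in zip(out, pi))
```

The same reasoning motivates the hedging strategy in the description: a mixture that mostly applies a cheap Metropolis kernel but occasionally applies an expensive gradient-based one still targets the same distribution.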