Markov Chain Monte Carlo (MCMC)
Metropolis-Hastings Sampling
1: Initialize the state of the Markov chain: $latex X_0 = x_0$
2: Repeat for $latex t = 0, 1, 2, \dots$:
- At time $latex t$, the state of the chain is $latex X_t = x_t$; draw a candidate $latex y \sim q(x \mid x_t)$
- Draw $latex u \sim \mathrm{Uniform}[0, 1]$
- If $latex u < \alpha(x_t, y) = \min\left\{ \frac{p(y)\, q(x_t \mid y)}{p(x_t)\, q(y \mid x_t)},\ 1 \right\}$, accept the transition $latex x_t \to y$, i.e., $latex X_{t+1} = y$
- Otherwise, reject the transition, i.e., $latex X_{t+1} = x_t$
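The steps above can be sketched in Python. This is my own minimal illustration, not code from the original: the target $latex p$ is a standard normal (an assumed example), and the proposal $latex q(y \mid x)$ is a Gaussian centered at $latex x$. Because this proposal is symmetric, $latex q(x_t \mid y) = q(y \mid x_t)$ and the acceptance ratio simplifies to $latex p(y)/p(x_t)$.

```python
import math
import random

def metropolis_hastings(log_p, x0, n_samples, step=1.0, seed=0):
    """Metropolis-Hastings with a symmetric Gaussian proposal q(y|x) = N(x, step^2).

    With a symmetric proposal the q terms cancel, so the acceptance
    probability is min{1, p(y) / p(x_t)}; comparing in log space avoids
    underflow for small densities.
    """
    rng = random.Random(seed)
    x = x0
    samples = []
    for _ in range(n_samples):
        y = rng.gauss(x, step)                 # candidate y ~ q(. | x_t)
        u = rng.random()                       # u ~ Uniform[0, 1)
        if math.log(max(u, 1e-300)) < log_p(y) - log_p(x):
            x = y                              # accept: X_{t+1} = y
        samples.append(x)                      # reject keeps X_{t+1} = x_t
    return samples

# Assumed target: standard normal, log p(x) = -x^2/2 up to a constant.
samples = metropolis_hastings(lambda x: -0.5 * x * x, x0=0.0, n_samples=20000)
```

Discarding an initial burn-in portion of `samples` before computing statistics is the usual practice, since the chain needs time to reach its stationary distribution.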
Gibbs Sampling
1: Randomly initialize $latex X_0 = x_0,\ Y_0 = y_0$
2: Repeat for $latex t = 0, 1, 2, \dots$:
- $latex y_{t+1} \sim p(y \mid x_t)$
- $latex x_{t+1} \sim p(x \mid y_{t+1})$
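As a concrete sketch of the two-step loop above (my own example, not from the original): for a bivariate standard normal with correlation $latex \rho$, both full conditionals are themselves Gaussian, $latex y \mid x \sim N(\rho x,\ 1 - \rho^2)$ and $latex x \mid y \sim N(\rho y,\ 1 - \rho^2)$, so each Gibbs step is a single draw from a normal distribution. The value $latex \rho = 0.8$ is an illustrative choice.

```python
import math
import random

def gibbs_bivariate_normal(rho, n_samples, seed=0):
    """Gibbs sampler for a bivariate standard normal with correlation rho.

    Alternates the two conditional draws from the algorithm:
      y_{t+1} ~ p(y | x_t)     = N(rho * x_t,     1 - rho^2)
      x_{t+1} ~ p(x | y_{t+1}) = N(rho * y_{t+1}, 1 - rho^2)
    """
    rng = random.Random(seed)
    sd = math.sqrt(1.0 - rho * rho)   # conditional standard deviation
    x, y = 0.0, 0.0
    out = []
    for _ in range(n_samples):
        y = rng.gauss(rho * x, sd)    # y_{t+1} ~ p(y | x_t)
        x = rng.gauss(rho * y, sd)    # x_{t+1} ~ p(x | y_{t+1})
        out.append((x, y))
    return out

pairs = gibbs_bivariate_normal(rho=0.8, n_samples=20000)
```

After burn-in, the empirical correlation of the `(x, y)` pairs should approach the chosen $latex \rho$.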