Approximation Methods For Bayesian Inference
Apr 16th, 2013

There are two families of methods for approximating intractable posterior distributions: deterministic methods, such as variational inference, and stochastic methods, such as sampling.
Deterministic methods
Variational methods come from the calculus of variations, the branch of mathematics concerned with finding the function that optimizes a given quantity. One common variational method is variational Expectation-Maximization (EM). Standard EM is used to get Maximum Likelihood estimates of the model parameters, and its E-step requires computing the posterior distribution over the hidden variables. But when the model gets complicated, computing this posterior exactly becomes intractable, so variational EM approximates it with the closest member of a simpler, tractable family of distributions.
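To make the E-step connection concrete, here is the standard decomposition that both EM and variational EM rest on (this sketch, and the notation x for observed data, z for hidden variables, θ for parameters, and q for an arbitrary distribution over z, is added here for illustration):

\[
\ln p(x \mid \theta)
  = \underbrace{\sum_z q(z) \ln \frac{p(x, z \mid \theta)}{q(z)}}_{\mathcal{L}(q,\,\theta)}
  + \underbrace{\mathrm{KL}\big(q(z) \,\|\, p(z \mid x, \theta)\big)}_{\geq\, 0}
\]

EM's E-step sets q(z) = p(z | x, θ), which drives the KL term to zero; variational EM instead restricts q to a tractable family and minimizes the KL term within that family, thereby maximizing the lower bound L(q, θ).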
Stochastic methods
Gibbs sampling is a Markov chain Monte Carlo algorithm [Geman84] that repeatedly picks one hidden variable and resamples it from its conditional distribution given all the other variables and the observed data:
initialize z randomly
repeat until convergence:
    pick i at random
    draw z_i from p(z_i | z_-i, x)
We refer to convergence loosely here: the chain itself never settles on a single state, and it will eventually visit every possible state, possibly only after an exponential number of iterations; what converges is the distribution of the samples it produces. In practice we usually assume convergence when the state stops changing much over a reasonable number of iterations.
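As a concrete illustration (not from the original post), here is a minimal Gibbs sampler in Python for a toy model whose conditionals are available in closed form: a zero-mean bivariate Gaussian with unit variances and correlation rho. Each conditional p(z_i | z_-i) is itself Gaussian, so the inner step of the loop above reduces to a single normal draw.

import math
import random

def gibbs_bivariate_gaussian(rho, n_iters=10000, burn_in=1000):
    # Gibbs sampler for a zero-mean bivariate Gaussian with correlation rho.
    # The conditionals are Gaussian: z_1 | z_2 ~ N(rho * z_2, 1 - rho^2),
    # and symmetrically for z_2 | z_1.
    z = [0.0, 0.0]                  # arbitrary initialization
    sd = math.sqrt(1.0 - rho ** 2)  # conditional standard deviation
    samples = []
    for it in range(n_iters):
        i = random.randrange(2)     # pick a coordinate at random
        z[i] = random.gauss(rho * z[1 - i], sd)  # draw z_i from p(z_i | z_-i)
        if it >= burn_in:           # discard early, pre-"convergence" samples
            samples.append(tuple(z))
    return samples

samples = gibbs_bivariate_gaussian(rho=0.8)
corr = sum(a * b for a, b in samples) / len(samples)
print("empirical E[z_1 * z_2] (should be near 0.8):", corr)

In real models the conditionals p(z_i | z_-i, x) are rarely this simple, but as long as each one can be sampled from, the same loop applies unchanged.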