By Gibbs Sampling

Click the thumbnail to the left to download a 3-page note on inference in finite mixture models using Gibbs sampling. The note uses the example of estimating a mixture of “circles” in images, which I thought of on my subway ride to the office. The LaTeX source of the note is here.

Indeed, the posterior distribution of the latent variables is easy to compute, so there is no real need for approximate inference by Gibbs sampling. (This is also why people usually learn finite mixture models via EM.) What I wrote in this note is a preliminary step toward understanding inference in infinite mixture models using Gibbs sampling.
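The finite-mixture Gibbs sweep the note describes can be sketched in code. This is a minimal sketch under assumptions of my own choosing (a 1-D Gaussian mixture with known unit variance, a symmetric Dirichlet prior on the weights, and an N(0, 10²) prior on the means), not the "circles in images" model from the note; the function name `gibbs_mixture` is hypothetical.

```python
import numpy as np

def gibbs_mixture(x, K, n_iter=200, seed=0):
    """Gibbs sampler for a 1-D Gaussian mixture with unit variance.
    Alternates: sample z | theta, mu  then  theta, mu | z."""
    rng = np.random.default_rng(seed)
    N = len(x)
    z = rng.integers(K, size=N)          # random initial assignments
    mu = rng.normal(size=K)              # component means
    theta = np.full(K, 1.0 / K)          # mixing weights
    for _ in range(n_iter):
        # sample each z_i from p(z_i = k) ∝ theta_k N(x_i | mu_k, 1)
        logp = np.log(theta) - 0.5 * (x[:, None] - mu[None, :]) ** 2
        p = np.exp(logp - logp.max(axis=1, keepdims=True))
        p /= p.sum(axis=1, keepdims=True)
        z = np.array([rng.choice(K, p=pi) for pi in p])
        # sample theta | z from a Dirichlet (symmetric prior, alpha = 1)
        counts = np.bincount(z, minlength=K)
        theta = rng.dirichlet(1.0 + counts)
        # sample mu_k | z, x from its Gaussian posterior (prior N(0, 10^2))
        for k in range(K):
            xk = x[z == k]
            prec = 1.0 / 100 + len(xk)           # posterior precision
            mu[k] = rng.normal(xk.sum() / prec, 1.0 / np.sqrt(prec))
    return z, theta, mu
```

On well-separated data the assignments settle quickly; with weakly separated components one would want more iterations and several chains.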

By Sum-Product

Denote the likelihood of a mixture model by

P(x) = Σ_{k=1}^{K} θ_{k} P_{k}(x),

where θ={θ_{1}, …, θ_{K}} is the prior over the latent component indicator, and P_{k}(x) is the k-th component.
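As a concrete instance of this likelihood, here is a short sketch that takes the components P_{k} to be unit-variance Gaussians (my assumption for illustration; the note's components are different):

```python
import math

def mixture_likelihood(x, theta, mus, sigma=1.0):
    """P(x) = sum_k theta_k * P_k(x), with P_k taken to be N(mu_k, sigma^2)."""
    return sum(
        t * math.exp(-0.5 * ((x - m) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))
        for t, m in zip(theta, mus)
    )
```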

Given N data points X={x_{1}, …, x_{N}} with latent component indicators Z={z_{1}, …, z_{N}}, the factor graph contains N disjoint sub-graphs, each a chain R_{i}–z_{i}–L_{i}, where R_{i} is a factor representing the prior of z_{i}: R_{i}(z_{i})=θ_{zi}; and L_{i} is a factor representing the likelihood given z_{i}: L_{i}(z_{i})=P_{zi}(x_{i}).
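On each chain the sum-product computation is trivial: the message from R_{i} is the prior θ, the message from L_{i} is the vector of likelihoods [P_{k}(x_{i})]_{k}, and the marginal of z_{i} is their normalized product. A minimal sketch, with the component densities passed in as callables (the function name `posterior_z` and the Gaussian components in the test are my own choices):

```python
import numpy as np

def posterior_z(x_i, theta, component_pdfs):
    """Sum-product on the chain R_i - z_i - L_i.
    The marginal of z_i is the normalized product of the two messages."""
    msg_R = np.asarray(theta, dtype=float)              # R_i(z_i) = theta_{z_i}
    msg_L = np.array([p(x_i) for p in component_pdfs])  # L_i(z_i) = P_{z_i}(x_i)
    post = msg_R * msg_L
    return post / post.sum()
```

Because the N chains are disjoint, the posteriors over the z_{i} factorize and each one can be computed independently like this.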