A Monte Carlo sampling algorithm guided by Belief Propagation
Sampling from a probability distribution is a common task in statistical physics, Bayesian inference, and many other fields. Monte Carlo is one of the usual tools for sampling complex probability distributions. Unfortunately, traditional MC methods fail when the structure of the distribution makes it hard to sample. I will present an algorithm that respects detailed balance and combines the perfect-sampling ability of Belief Propagation on trees with the traditional heat-bath strategy of Markov-chain Monte Carlo approaches. I will finish by presenting examples of applications of this algorithm.
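To fix ideas, here is a minimal sketch of the heat-bath building block mentioned above: a Gibbs sampler for a one-dimensional Ising chain, where each spin is redrawn from its exact conditional distribution given its neighbours, so detailed balance holds by construction. This is only an illustration of the standard heat-bath strategy; the BP-guided combination described in the talk is not reproduced here, and all function and parameter names are my own.

```python
import math
import random

def heat_bath_ising_chain(n, beta, sweeps, seed=0):
    """Heat-bath (Gibbs) sampling of a 1D Ising chain with free boundaries.

    At each step, spin i is drawn from its exact conditional
    P(s_i = +1 | neighbours) = 1 / (1 + exp(-2 * beta * h_i)),
    where h_i is the local field from the neighbouring spins.
    """
    rng = random.Random(seed)
    s = [rng.choice([-1, 1]) for _ in range(n)]  # random initial configuration
    for _ in range(sweeps):
        for i in range(n):
            # local field from the (at most two) neighbours
            h = (s[i - 1] if i > 0 else 0) + (s[i + 1] if i < n - 1 else 0)
            p_up = 1.0 / (1.0 + math.exp(-2.0 * beta * h))
            s[i] = 1 if rng.random() < p_up else -1
    return s

spins = heat_bath_ising_chain(n=50, beta=0.5, sweeps=200)
```

On a tree (of which the chain is a special case), Belief Propagation computes the marginals exactly, which is what allows the talk's algorithm to replace such local updates with perfect sampling of whole subtrees.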