Channel: Machine Learning

MCMC Convergence Explanation

Hi, can somebody please explain what it means for a Markov chain to have converged? Is it the situation where the transition probabilities satisfy detailed balance?

How does one know when the chain has converged?
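One common heuristic for this (not mentioned in the post, offered as an assumption about standard practice) is the Gelman-Rubin diagnostic: run several independent chains from dispersed starting points and compare between-chain to within-chain variance. A minimal sketch, assuming the chains are stored as rows of a NumPy array:

```python
import numpy as np

def gelman_rubin(chains):
    """Potential scale reduction factor (R-hat) for m chains of length n.

    chains: array of shape (m, n). Values near 1.0 suggest the chains
    have mixed; values well above 1.0 suggest they have not converged.
    """
    m, n = chains.shape
    chain_means = chains.mean(axis=1)
    B = n * chain_means.var(ddof=1)           # between-chain variance
    W = chains.var(axis=1, ddof=1).mean()     # within-chain variance
    var_hat = (n - 1) / n * W + B / n         # pooled variance estimate
    return np.sqrt(var_hat / W)

# Illustrative example: two well-mixed chains drawn from the same
# distribution should give R-hat close to 1.0.
rng = np.random.default_rng(0)
chains = rng.normal(size=(2, 5000))
r_hat = gelman_rubin(chains)
```

This is only a diagnostic, not a proof of convergence: it can fail to detect chains that are all stuck in the same mode.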

Suppose I want to find the expectation of a function under some probability distribution whose partition function is hard to compute. I know it can be approximated using samples from some MCMC method. What I don't understand is how to go about collecting samples from the chain. Say I need N samples to get a good approximation. When do I start collecting the N samples? Is it after the burn-in period? If so, can I collect N consecutive samples once the chain has converged?
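The procedure described above (discard a burn-in period, then collect N samples and average) can be sketched with a random-walk Metropolis sampler. This is an illustrative assumption about one standard MCMC method, not the only option; note the partition function cancels in the acceptance ratio, so only an unnormalized log-density is needed:

```python
import numpy as np

def metropolis(log_p, x0, n_samples, burn_in=1000, step=1.0, seed=0):
    """Random-walk Metropolis: discard burn_in draws, keep n_samples.

    log_p may be unnormalized -- the normalizing constant (partition
    function) cancels in the acceptance ratio.
    """
    rng = np.random.default_rng(seed)
    x = x0
    samples = []
    for i in range(burn_in + n_samples):
        proposal = x + step * rng.normal()
        # Accept with probability min(1, p(proposal) / p(x)).
        if np.log(rng.uniform()) < log_p(proposal) - log_p(x):
            x = proposal
        if i >= burn_in:           # start collecting after burn-in
            samples.append(x)
    return np.array(samples)

# Illustrative target: unnormalized standard normal.
# Estimate E[x^2], whose true value is 1.
log_p = lambda x: -0.5 * x**2
draws = metropolis(log_p, x0=0.0, n_samples=20000)
estimate = (draws ** 2).mean()
```

Consecutive post-burn-in samples are valid for the average, but they are correlated, so the effective sample size is smaller than N; some people thin the chain, though for estimating an expectation keeping all samples is usually at least as good.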

Thanks in advance.

submitted by vittal
