Metropolis-Hastings

Metropolis-Hastings is an important algorithm for approximate inference in statistics. It is a Markov chain Monte Carlo (MCMC) algorithm that allows for sampling from a probability distribution where direct sampling is difficult, typically because the distribution's normalizing constant is an intractable integral.

How Metropolis-Hastings works

Metropolis-Hastings uses a proposal distribution, denoted q(θ'|θ), to draw a candidate parameter value θ' given the current value θ. To decide whether θ' is accepted or rejected, we then calculate the acceptance ratio:

$$ \alpha = \min\left(1,\ \frac{p\left(\theta'\mid D\right)\,q\left(\theta\mid\theta'\right)}{p\left(\theta\mid D\right)\,q\left(\theta'\mid\theta\right)}\right) $$

When the proposal distribution is symmetric, so that q(θ'|θ) = q(θ|θ'), the proposal terms cancel and the ratio reduces to p(θ'|D) / p(θ|D). Crucially, the intractable normalizing constant p(D) appears in both the numerator and the denominator, so it cancels: the ratio can be computed even though the posterior itself cannot be evaluated directly.

We then draw a uniform random number between 0 and 1. If the number is below the acceptance ratio, we accept the new value and set the next state θi+1 equal to θ'; otherwise we reject θ' and keep θi+1 equal to θi. We then repeat the process from the new state.

By the end of the process, we have a sample of θ values from which we can compute summaries of the approximate posterior, such as expectations and uncertainty bounds. Throughout the process, we typically have a period of tuning the proposal distribution to achieve an acceptable acceptance rate. We also discard a warmup (burn-in) period at the start of the chain to reduce bias from the initialization values.
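The steps above can be sketched in a short random-walk sampler. This is a minimal illustration, not a production implementation: the target density (a standard normal known only up to a constant), the Gaussian proposal, the step size, and the warmup length are all assumptions chosen for the example. Because the proposal is symmetric, the q terms cancel and only the ratio of target densities is needed.

```python
import numpy as np

def log_target(theta):
    # Unnormalized log posterior log p(theta | D) + const.
    # A standard normal target is assumed here purely for illustration.
    return -0.5 * theta ** 2

def metropolis_hastings(log_target, theta0, n_samples, step=1.0, seed=None):
    rng = np.random.default_rng(seed)
    theta = theta0
    samples = np.empty(n_samples)
    accepted = 0
    for i in range(n_samples):
        # Draw a candidate theta' from a symmetric Gaussian random-walk
        # proposal q(theta'|theta); the proposal terms then cancel.
        proposal = theta + step * rng.normal()
        # Accept with probability min(1, p(theta'|D) / p(theta|D)),
        # computed on the log scale for numerical stability.
        if np.log(rng.uniform()) < log_target(proposal) - log_target(theta):
            theta = proposal
            accepted += 1
        samples[i] = theta
    return samples, accepted / n_samples

samples, accept_rate = metropolis_hastings(
    log_target, theta0=5.0, n_samples=20_000, step=1.0, seed=0
)
# Discard the warmup draws to reduce bias from the initialization value,
# then summarize the approximate posterior.
posterior = samples[2_000:]
print("mean:", posterior.mean())
print("95% interval:", np.percentile(posterior, [2.5, 97.5]))
```

In practice, the step size would be tuned during warmup until the acceptance rate falls in a reasonable range, rather than fixed in advance as it is here.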

The benefits of using Metropolis-Hastings

Metropolis-Hastings is particularly useful because it allows us to sample from probability distributions where direct sampling is difficult. This is typically due to the presence of an intractable integral, such as the normalizing constant of a posterior distribution. Because it approximates the posterior by sampling, Metropolis-Hastings is used when the posterior of a statistical model cannot be computed directly.

This is particularly important in Bayesian analysis, where we need to generate posterior probability distributions from prior distributions and likelihood functions. Additionally, because Metropolis-Hastings is so widely used in approximate inference, many probabilistic programs use this algorithm as their default inference method.

Limitations of Metropolis-Hastings

Despite its usefulness, Metropolis-Hastings does have some limitations. One such limitation is that the algorithm can be quite slow to converge on accurate parameter values in high-dimensional models. In addition, the proposal distribution has to be carefully tuned to achieve a good acceptance rate, which can lead to long run times or unstable results if the tuning is not done properly.

Another limitation is that the method can require substantial computing power to run. This is because the sampling process involves evaluating the unnormalized posterior density at every proposed point, which means many likelihood evaluations over the course of a chain. Depending on the complexity of the model, this can require a significant amount of processing power and computing time.

Despite its limitations, Metropolis-Hastings is a useful tool for approximating probability distributions and posterior estimates in Bayesian analysis. It allows us to sample from probability distributions that would otherwise be difficult to sample from directly. Although it can be slow to converge in high-dimensional models, it is still a widely-used algorithm in probabilistic programming and approximate inference.
