As we saw in a previous article (https://eye.kohei-kevin.com/2024/04/20/review-map-mle-bayes-estimation/), MCMC algorithms are used to perform Bayesian estimation. There are two common MCMC approaches to Bayesian estimation: Metropolis-Hastings and Gibbs sampling. I will summarize these two methods and their differences.
The differences
Proposal Distribution: In Metropolis-Hastings, any proposal distribution can be used, whereas in Gibbs sampling, the proposal distribution is automatically the conditional distribution of the component being sampled.
Acceptance Probability: Metropolis-Hastings requires an acceptance step, but in Gibbs sampling, every proposal is accepted with probability 1 because it follows the conditional distribution directly.
Metropolis-Hastings
The Metropolis-Hastings algorithm uses a proposal distribution q to suggest the next state, then accepts or rejects that proposal with a certain acceptance probability. The algorithm can be expressed in the following steps:
1. Proposal step: from the current state x(t), generate a candidate state x′ from the proposal distribution q(x′ ∣ x(t)).
2. Acceptance step: compute the acceptance probability α(x(t), x′) = min(1, [p(x′) q(x(t) ∣ x′)] / [p(x(t)) q(x′ ∣ x(t))]), where p(x) is the target distribution.
3. Update step: draw a uniform random number u ∼ Uniform(0, 1); if u ≤ α(x(t), x′), set x(t+1) = x′; otherwise, set x(t+1) = x(t).
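The steps above can be sketched in Python. As an illustrative assumption (not from the article), I use a symmetric Gaussian random-walk proposal, so the q terms cancel in the acceptance ratio and α reduces to min(1, p(x′)/p(x(t))); the target here is a standard normal:

```python
import math
import random

def metropolis_hastings(log_p, x0, n_samples, step=1.0):
    """Random-walk Metropolis-Hastings sampler.

    log_p: log of the (possibly unnormalized) target density p(x).
    The proposal q(x' | x) = N(x, step^2) is symmetric, so the q terms
    cancel and the acceptance ratio is just p(x') / p(x).
    """
    x = x0
    samples = []
    for _ in range(n_samples):
        # 1. Proposal step: draw a candidate from q(x' | x)
        x_new = x + random.gauss(0.0, step)
        # 2. Acceptance step: log alpha = log p(x') - log p(x)
        log_alpha = log_p(x_new) - log_p(x)
        # 3. Update step: accept with probability min(1, alpha)
        if math.log(random.random()) < log_alpha:
            x = x_new
        samples.append(x)
    return samples

# Example target: standard normal, p(x) ∝ exp(-x^2 / 2)
random.seed(0)
samples = metropolis_hastings(lambda x: -0.5 * x * x, x0=0.0, n_samples=20000)
burned = samples[5000:]  # discard burn-in
mean = sum(burned) / len(burned)
var = sum((s - mean) ** 2 for s in burned) / len(burned)
```

Working in log space avoids numerical underflow when p(x) is tiny; after burn-in, the sample mean and variance should be close to 0 and 1 for this target.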
Gibbs Sampling
Gibbs sampling is a method for sequentially sampling each component of a multi-dimensional random variable from its conditional distributions.
Suppose you have a multi-dimensional random variable
x=(x1,x2,…,xn)
In Gibbs sampling, each component is updated sequentially. For the i-th component, fix all other components x−i and sample xi from the conditional distribution:
xi∼p(xi∣x−i)
This process is repeated for each component, cycling through the vector x repeatedly to generate a sequence of samples approximating the joint distribution.
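As a concrete sketch of this cycle, here is a minimal Gibbs sampler for a bivariate standard normal with correlation ρ (a stock textbook example, not taken from the article). Its full conditionals are themselves normal, x1 ∣ x2 ∼ N(ρ·x2, 1 − ρ²) and symmetrically for x2, so every draw is used directly with no accept/reject step:

```python
import math
import random

def gibbs_bivariate_normal(rho, n_samples, burn_in=1000):
    """Gibbs sampler for a bivariate standard normal with correlation rho.

    Each full conditional is normal:
        x1 | x2 ~ N(rho * x2, 1 - rho^2)
        x2 | x1 ~ N(rho * x1, 1 - rho^2)
    so each component is sampled exactly and always accepted.
    """
    sd = math.sqrt(1.0 - rho * rho)
    x1, x2 = 0.0, 0.0
    samples = []
    for t in range(n_samples + burn_in):
        # Update each component in turn, fixing the other one
        x1 = random.gauss(rho * x2, sd)
        x2 = random.gauss(rho * x1, sd)
        if t >= burn_in:
            samples.append((x1, x2))
    return samples

# The empirical correlation of the draws should approach rho
random.seed(0)
rho = 0.8
draws = gibbs_bivariate_normal(rho, n_samples=20000)
m1 = sum(a for a, _ in draws) / len(draws)
m2 = sum(b for _, b in draws) / len(draws)
cov = sum((a - m1) * (b - m2) for a, b in draws) / len(draws)
s1 = math.sqrt(sum((a - m1) ** 2 for a, _ in draws) / len(draws))
s2 = math.sqrt(sum((b - m2) ** 2 for _, b in draws) / len(draws))
corr = cov / (s1 * s2)
```

Note the contrast with Metropolis-Hastings: there is no proposal tuning and no acceptance probability to compute, but the method requires that each conditional distribution can be sampled from exactly.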