Say Theta = (theta_1, ..., theta_K) is a vector of model parameters representing a discrete distribution:

theta_k >= 0 for all k, and sum_k theta_k = 1.

I have some data X. I know how to compute P(X|Theta) and P(Theta).

I want to sample from P(Theta|X) using MCMC moves. I start with a random initialization of Theta, and in each step I propose a new Theta':

Theta' ~ Dirichlet(mu*Theta), i.e. the proposal density is T(Theta -> Theta') = Dir(Theta'; mu*Theta),

where mu is a constant pseudo-sample size controlling how concentrated the proposal is around the current Theta.

I accept with probability min(1, r), where

r = [P(Theta') P(X|Theta') Dir(Theta; mu*Theta')] / [P(Theta) P(X|Theta) Dir(Theta'; mu*Theta)]

For some reason this doesn't work (the likelihood drops rapidly even when I initialize at the correct state).
1) Is there a chance I am missing a Jacobian term?
2) Is there a more standard proposal distribution to use in this case (other than sampling from the prior, of course)?

asked Dec 17 '10 at 20:16


joni

Is this rejecting too much? Since you're doing standard MH steps, the likelihood shouldn't really drop. Could it be a normalization or numerical error?

(Dec 18 '10 at 02:49) Alexandre Passos ♦

If X is drawn from Theta under a standard multinomial model, the posterior is easy to compute analytically (due to conjugacy). That could be a lot better than approximating it with MCMC...

(Dec 19 '10 at 01:56) Joseph Austerweil
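
For reference, the conjugacy Joseph mentions gives the posterior in closed form in that special case: with a Dirichlet(alpha_1, ..., alpha_K) prior on Theta and a multinomial likelihood, if n_k is the number of times outcome k appears in X, then

P(Theta | X) = Dirichlet(alpha_1 + n_1, ..., alpha_K + n_K),

so no MCMC is needed when that model applies.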

One Answer:

Thanks, guys. I indeed had a bug. The above proposal actually works fine; a Jacobian term is not needed. I apologize for posting the question. Joseph: I know about the conjugacy, but it does not apply in my case.
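
For anyone reading later, here is a minimal sketch of the proposal/acceptance step described in the question. It is not the original code; `log_prior` and `log_likelihood` are hypothetical callables standing in for log P(Theta) and log P(X|Theta).

```python
import numpy as np
from scipy.stats import dirichlet

def mh_step(theta, log_prior, log_likelihood, mu, rng):
    """One Metropolis-Hastings step with a Dirichlet(mu * theta) proposal.

    log_prior and log_likelihood are assumed to return log P(Theta) and
    log P(X | Theta); theta must be strictly positive so that the proposal
    parameters mu * theta are valid.
    """
    # Propose Theta' ~ Dirichlet(mu * Theta)
    theta_prop = rng.dirichlet(mu * theta)

    # Log of the MH ratio:
    #   P(Theta') P(X|Theta') Dir(Theta; mu*Theta')
    #   -------------------------------------------
    #   P(Theta)  P(X|Theta)  Dir(Theta'; mu*Theta)
    log_r = (
        log_prior(theta_prop) + log_likelihood(theta_prop)
        + dirichlet.logpdf(theta, mu * theta_prop)   # reverse proposal density
        - log_prior(theta) - log_likelihood(theta)
        - dirichlet.logpdf(theta_prop, mu * theta)   # forward proposal density
    )

    # Accept with probability min(1, r)
    if np.log(rng.uniform()) < log_r:
        return theta_prop
    return theta

# Usage sketch (log_prior, log_likelihood, K, mu defined by the user):
# rng = np.random.default_rng(0)
# theta = rng.dirichlet(np.ones(K))
# for _ in range(10000):
#     theta = mh_step(theta, log_prior, log_likelihood, mu=100.0, rng=rng)
```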

answered Dec 19 '10 at 04:14


joni
