I'm currently trying to evaluate an integral with respect to an MCMC kernel, and I'd like someone else to read what I `proved' to check whether it's really correct. It just doesn't feel right.

Suppose I have an MCMC kernel $K(a'|a)$ with stationary distribution $p(a)$. I want to take the expectation of a function $f(a)$, given a point $a'$, through the kernel, like so:

$$\int_{a} f(a)\, K(a \mid a')\, da$$

I use detailed balance, $K(a \mid a')\, p(a') = K(a' \mid a)\, p(a)$:

$$= \int_{a} f(a)\, \frac{K(a' \mid a)\, p(a)}{p(a')}\, da = \mathbb{E}_{p(a)}\!\left[ f(a)\, K(a' \mid a) \right] \frac{1}{p(a')}$$

Independence (at least I think they should be independent?)

$$= \mathbb{E}_{p(a)}\!\left[ f(a) \right] \mathbb{E}_{p(a)}\!\left[ K(a' \mid a) \right] \frac{1}{p(a')}$$

And detailed balance once again

$$= \mathbb{E}_{p(a)}\!\left[ f(a) \right] p(a')\, \frac{1}{p(a')} = \mathbb{E}_{p(a)}\!\left[ f(a) \right]$$

Now this just seems incorrect. If I interpret it correctly, it says that if I take a large number of samples from $K(a \mid a')$, from any starting point $a'$, and integrate a function with respect to them, I might as well have integrated against the original distribution $p(a)$. I must have done something wrong.
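A quick numerical sanity check (my own sketch, not part of the original question): if the derivation were correct, the one-step expectation $\int f(a)\, K(a \mid a')\, da$ would equal $\mathbb{E}_{p}[f]$ for every starting point $a'$. With a random-walk Metropolis kernel targeting $p(a) = \mathcal{N}(0,1)$ and $f(a) = a^2$ (the kernel, step size, and sample sizes are arbitrary choices for illustration), the estimate clearly depends on $a'$:

```python
# Sketch (not from the original post): if the derivation held, the one-step
# expectation of f under K(.|a') would equal E_p[f] = 1 for every a'.
# A random-walk Metropolis kernel targeting p(a) = N(0, 1) with f(a) = a^2
# shows that it does not.
import numpy as np

rng = np.random.default_rng(0)

def log_p(a):
    return -0.5 * a ** 2            # standard normal target, up to a constant

def mh_step(a_prime, step=1.0):
    """One Metropolis-Hastings transition from a', i.e. one draw from K(a | a')."""
    proposal = a_prime + step * rng.standard_normal()
    if np.log(rng.uniform()) < log_p(proposal) - log_p(a_prime):
        return proposal             # accept the proposal
    return a_prime                  # reject: the chain stays at a'

def one_step_expectation(f, a_prime, n=100_000):
    """Monte Carlo estimate of the integral of f(a) K(a | a') over a."""
    samples = np.array([mh_step(a_prime) for _ in range(n)])
    return f(samples).mean()

f = lambda a: a ** 2                # E_p[f] = Var(a) = 1 under N(0, 1)
for a_prime in (0.0, 2.0, 5.0):
    print(a_prime, one_step_expectation(f, a_prime))
# The estimates change with a' (and are far from 1 for a' = 5), so one step
# of the kernel is not the same thing as a draw from p(a).
```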

asked Nov 22 '11 at 01:33

Daniel Duckwoth


One Answer:

I don't think you can do the independence trick, as both expectations are over the same variable $a$, and the expectation of a product is not the product of expectations unless the factors are independent (for example, when one of them is a constant). Using it here amounts to assuming something about the covariance of $f(a)$ and $K(a' \mid a)$ (both functions of $a$), and I don't see how you can make that assumption.
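To make this concrete (my own sketch, not part of the answer), here is a numerical check that the two sides of the independence step differ. It uses a kernel with an explicit density that is reversible with respect to $p(a) = \mathcal{N}(0,1)$, namely $K(a' \mid a) = \mathcal{N}(a';\, \rho a,\, 1-\rho^2)$; the choices of $\rho$, $a'$, and $f$ are arbitrary:

```python
# Sketch (my addition): check numerically that
#   E_p[ f(a) K(a'|a) ]  !=  E_p[ f(a) ] * E_p[ K(a'|a) ]
# using the AR(1) kernel K(a'|a) = N(a'; rho * a, 1 - rho^2), which is
# reversible with respect to p(a) = N(0, 1).
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)
rho, a_prime = 0.9, 1.5                 # arbitrary kernel parameter and point
f = lambda a: a ** 2

a = rng.standard_normal(1_000_000)      # samples a ~ p(a) = N(0, 1)
K = norm.pdf(a_prime, loc=rho * a, scale=np.sqrt(1.0 - rho ** 2))  # K(a'|a)

lhs = np.mean(f(a) * K)                 # E_p[ f(a) K(a'|a) ]
rhs = np.mean(f(a)) * np.mean(K)        # E_p[ f(a) ] * E_p[ K(a'|a) ]
print(lhs, rhs)                         # the two values clearly differ:
# f(a) and K(a'|a) are functions of the same random variable a, so their
# covariance under p(a) is generally nonzero and the product rule fails.
```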

answered Nov 22 '11 at 08:27

Alexandre Passos ♦

If two functions are independent, isn't E[a(x)b(x)] = E[a(x)]E[b(x)]?

(Nov 22 '11 at 13:17) Daniel Duckwoth

Only if they are expectations over different, independent variables: if x and y are independent, then E[f(x)g(y)] = E[f(x)]E[g(y)].

(Nov 22 '11 at 13:19) Alexandre Passos ♦

If you have two functions of the same variable (a here), there is no way they can be said to be independent, as they both depend on that same variable (unless one of them is a constant).

(Nov 22 '11 at 13:20) Alexandre Passos ♦
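A concrete counterexample (my addition, not from the thread): take $X$ uniform on $\{-1, +1\}$ and $g(x) = h(x) = x$. Then $\mathbb{E}[g(X)h(X)] = \mathbb{E}[X^2] = 1$, while $\mathbb{E}[g(X)]\,\mathbb{E}[h(X)] = 0 \cdot 0 = 0$, so the product rule fails as soon as the two functions share the same random variable.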

More formally, see Theorem 5.9 in Wackerly et al.'s Mathematical Statistics with Applications: http://books.google.com/books?id=ZvPKTemPsY4C&lpg=PP1&dq=wackerly%20mathematical%20statistics&pg=PA259#v=onepage&q&f=false

You can follow the proof to see why it wouldn't be the case when the two functions share the same random variable.

(Nov 22 '11 at 19:17) Chris Simokat