I would like to compute the expected value, over x2, of a function f(x1, x2): R^n x R^n -> [0, 1] for a fixed x1, where f has the form f(x1, x2) = |f2(x1) - f2(x2)| for some f2: R^n -> [0, 1]. This will be done for thousands of different values of x1 in a context where speed is important. Three problems:
I can see a few approaches:
Any other ideas? Could someone familiar with the sampling literature recommend papers that deal with this kind of problem?
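To make the setup concrete, here is a minimal sketch of a plain sampling approach. Everything here is illustrative: the sigmoid `f2` and the Gaussian distribution for x2 are placeholders for whatever your actual function and distribution are. The one structural point it exploits is that f touches x2 only through the scalar f2(x2), so a single sample of x2 can be pushed through f2 once and then reused for every x1:

```python
import numpy as np

rng = np.random.default_rng(0)

def f2(x):
    # Hypothetical stand-in for the real f2: R^n -> [0, 1].
    return 1.0 / (1.0 + np.exp(-x.sum(axis=-1)))

# f(x1, x2) = |f2(x1) - f2(x2)| depends on x2 only through the scalar
# f2(x2), so we evaluate f2 on the x2 sample once up front; each new x1
# then costs a single f2 evaluation plus a vectorized mean.
sample_vals = f2(rng.standard_normal((10_000, 5)))  # f2(x2) for x2 ~ N(0, I)

def expected_f(x1):
    """Sample-average estimate of E[|f2(x1) - f2(x2)|] for a fixed x1."""
    return np.abs(f2(x1) - sample_vals).mean()

# Thousands of x1 values reuse the same precomputed sample.
estimates = [expected_f(x1) for x1 in rng.standard_normal((1_000, 5))]
```

Whether this naive reuse is adequate depends on which of the approaches below you end up taking, but the factoring through f2(x2) applies to all of them.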
I think the only reasonable approach is 4. Approach 1, as you mentioned, would be too slow, and for 2 and 3 you have no guarantee of convergence, or even that the approach is sensible. There are many sampling methods; a good starting point for learning their properties and when you can use them is the Wikipedia page on Monte Carlo integration. Regardless of the specific method, you should be able to bound the variance of the Monte Carlo estimate of the expected value you're after, and this is your main way of making sure that you have the right value.
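The variance bound mentioned above is easy to get in this problem: since f takes values in [0, 1], the variance of a single draw is at most 1/4, so the standard error of an m-sample estimate is at most 1/(2*sqrt(m)). A sketch, with a hypothetical `f2` standing in for the real one:

```python
import numpy as np

rng = np.random.default_rng(0)

def f2(x):
    # Hypothetical stand-in for the real f2: R^n -> [0, 1].
    return 1.0 / (1.0 + np.exp(-x.sum(axis=-1)))

def mc_estimate(x1, sample_x2):
    """Monte Carlo estimate of E[|f2(x1) - f2(x2)|] and its standard error."""
    vals = np.abs(f2(x1) - f2(sample_x2))  # one draw of f per sample, shape (m,)
    est = vals.mean()
    # Empirical standard error; because vals lies in [0, 1] it can never
    # exceed the worst-case bound 1 / (2 * sqrt(m)).
    se = vals.std(ddof=1) / np.sqrt(len(vals))
    return est, se

# Example: x2 ~ N(0, I) in R^3, m = 10_000 samples.
x1 = np.zeros(3)
sample_x2 = rng.standard_normal((10_000, 3))
est, se = mc_estimate(x1, sample_x2)
```

Checking `se` (or the worst-case 1/(2*sqrt(m))) against your accuracy target tells you how many samples you need before you spend time on anything fancier.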
Which approach to use depends a lot on what you know about the probability distribution over which you want to compute expectations. From strongest to weakest assumptions:
A good introductory reference on this subject is chapters 29 and 30 of David MacKay's book Information Theory, Inference, and Learning Algorithms, which is freely available online; individual chapters can be downloaded here.