I have been hunting through the literature for places where collapsed Gibbs sampling is used to accelerate sampling, and haven't found much. It seems it's used mostly for LDA, for DP mixtures behind the scenes, and maybe for some data imputation.

Are there any other places this technique has been successfully applied?

asked Sep 26 '12 at 15:05

zaxtax ♦


3 Answers:

http://www.uoguelph.ca/~wdarling/research/papers/TM.pdf

Here's another example, dedicated to a naive Bayes approach to binary classification (i.e. spam detection). It also covers collapsing the Dirichlet/Multinomial conjugate pair, so I'm not sure whether it will be interesting for you.
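Not from the linked paper, but a minimal sketch of what that Dirichlet/Multinomial collapsing looks like in a naive Bayes spam filter (the counts, vocabulary, and equal class priors here are all made up for illustration):

```python
import numpy as np

# With the per-class word distribution integrated out, the collapsed
# predictive probability of word k is (n_k + alpha_k) / (n + sum(alpha)).
def collapsed_word_predictive(word_counts, alpha):
    counts = np.asarray(word_counts, dtype=float)
    alpha = np.asarray(alpha, dtype=float)
    return (counts + alpha) / (counts.sum() + alpha.sum())

# Hypothetical training counts for a 3-word vocabulary.
spam_counts = np.array([40, 5, 55])   # e.g. "free", "meeting", "offer"
ham_counts  = np.array([3, 70, 2])
alpha = np.ones(3)                    # symmetric Dirichlet prior

# Log-odds that a new message (word counts below) is spam,
# assuming equal class priors.
doc = np.array([2, 0, 1])
log_odds = doc @ np.log(collapsed_word_predictive(spam_counts, alpha)) \
         - doc @ np.log(collapsed_word_predictive(ham_counts, alpha))
print(log_odds)  # positive => classified as spam
```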

answered Sep 26 '12 at 15:10

Sergey Bartunov

Anything that isn't LDA is interesting to me!

(Sep 26 '12 at 15:15) zaxtax ♦

I think mixtures of Gaussians are also often sampled with collapsed models, because there's no reason not to (in low dimensions, or if the covariance is known), though I don't have papers to cite on that.
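To make that concrete, here's a minimal sketch (my own, not from any particular paper) of collapsed Gibbs for a 1-D, K-component Gaussian mixture with known observation standard deviation: the Dirichlet mixing weights and the Normal component means are both integrated out, so only the assignments z are sampled.

```python
import numpy as np
from scipy.stats import norm

def collapsed_gibbs_gmm(x, K, alpha=1.0, mu0=0.0, tau0=10.0, sigma=1.0,
                        n_iters=100, rng=None):
    # alpha: symmetric Dirichlet concentration on the mixing weights
    # mu0, tau0: prior mean and std on each component mean
    # sigma: known observation std
    rng = np.random.default_rng() if rng is None else rng
    x = np.asarray(x, dtype=float)
    z = rng.integers(K, size=len(x))
    counts = np.bincount(z, minlength=K).astype(float)
    sums = np.array([x[z == k].sum() for k in range(K)])
    for _ in range(n_iters):
        for i in range(len(x)):
            counts[z[i]] -= 1          # remove point i from its component
            sums[z[i]] -= x[i]
            # posterior over each component mean given the remaining points
            prec = 1.0 / tau0**2 + counts / sigma**2
            post_mean = (mu0 / tau0**2 + sums / sigma**2) / prec
            pred_std = np.sqrt(sigma**2 + 1.0 / prec)
            # collapsed conditional: count-based prior term
            # times the posterior predictive density
            logp = np.log(counts + alpha / K) \
                 + norm.logpdf(x[i], post_mean, pred_std)
            p = np.exp(logp - logp.max())
            z[i] = rng.choice(K, p=p / p.sum())
            counts[z[i]] += 1
            sums[z[i]] += x[i]
    return z

# Toy usage: two well-separated clusters should get mostly separated labels.
data = np.concatenate([np.random.normal(-5, 1, 50), np.random.normal(5, 1, 50)])
print(collapsed_gibbs_gmm(data, K=2))
```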

answered Sep 26 '12 at 18:26

Alexandre Passos ♦

This I actually know about, but it's more of a niche.

Zoubin and the Cambridge ML group are strong advocates of collapsed samplers. I particularly recommend the papers by Finale et al. and by Knowles and Zoubin, where collapsing is applied to combinations of the DP, the IBP, and the Pitman-Yor process.

I think it is fair to say that even with collapsed samplers, nonparametric Bayes methods are still slower than most of the alternatives.

There was recently a paper on Bayesian quadrature that actually offers some nice results.

answered Sep 27 '12 at 06:17

Leon Palafox ♦


