I have been hunting through the literature for places where collapsed Gibbs sampling is used to accelerate sampling, and haven't found much. It seems to be used mostly for LDA, for DP mixtures behind the scenes, and perhaps for some data imputation. Are there any other places this technique has been successfully applied?
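For concreteness, here is a minimal sketch of the collapsed Gibbs sampler I mean, in the LDA case: the topic proportions and topic-word distributions are integrated out, leaving only count tables. The toy corpus and hyperparameters below are made up purely for illustration.

    import numpy as np

    rng = np.random.default_rng(0)

    docs = [[0, 1, 2, 1], [2, 3, 3, 0], [1, 1, 2, 3]]  # word ids per document
    V, K, alpha, beta = 4, 2, 0.1, 0.01  # vocab size, topics, Dirichlet hyperparameters

    # Count tables: topics per document, words per topic, totals per topic.
    n_dk = np.zeros((len(docs), K))
    n_kw = np.zeros((K, V))
    n_k = np.zeros(K)
    z = [[0] * len(doc) for doc in docs]

    # Random initialization of topic assignments.
    for d, doc in enumerate(docs):
        for i, w in enumerate(doc):
            k = rng.integers(K)
            z[d][i] = k
            n_dk[d, k] += 1; n_kw[k, w] += 1; n_k[k] += 1

    for _ in range(100):  # Gibbs sweeps
        for d, doc in enumerate(docs):
            for i, w in enumerate(doc):
                k = z[d][i]
                # Remove the current assignment from the counts...
                n_dk[d, k] -= 1; n_kw[k, w] -= 1; n_k[k] -= 1
                # ...and resample from the collapsed conditional:
                # p(z=k | rest) is proportional to (n_dk + alpha) * (n_kw + beta) / (n_k + V*beta)
                p = (n_dk[d] + alpha) * (n_kw[:, w] + beta) / (n_k + V * beta)
                k = rng.choice(K, p=p / p.sum())
                z[d][i] = k
                n_dk[d, k] += 1; n_kw[k, w] += 1; n_k[k] += 1

After burn-in, point estimates of the topic-word and document-topic distributions can be read off by normalizing the smoothed count tables.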
http://www.uoguelph.ca/~wdarling/research/papers/TM.pdf Here's another example, this one dedicated to a naive Bayes approach for binary classification (i.e. spam detection); it also considers Dirichlet/Multinomial conjugate collapsing, so I'm not sure whether it would be interesting for you.

Anything that isn't LDA is interesting to me!
(Sep 26 '12 at 15:15)
zaxtax ♦
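For reference, the Dirichlet/Multinomial collapsing in that setting amounts to integrating the class-conditional word distributions out against a symmetric Dirichlet(beta) prior, which leaves a Polya-urn predictive over the training counts. A minimal sketch, with a made-up count table and smoothing parameter:

    import numpy as np

    V, beta = 5, 0.5
    # n_cw[c, w]: count of word w in training documents of class c (0 = ham, 1 = spam).
    n_cw = np.array([[10.0, 2.0, 1.0, 0.0, 3.0],
                     [1.0, 8.0, 7.0, 4.0, 0.0]])
    n_docs = np.array([6.0, 4.0])  # training documents per class
    log_prior = np.log(n_docs / n_docs.sum())

    def class_posterior(doc):
        """Collapsed class posterior for a document given as a list of word ids."""
        scores = log_prior.copy()
        for c in range(2):
            counts, total = n_cw[c].copy(), n_cw[c].sum()
            for w in doc:
                # Dirichlet-multinomial predictive for the next word...
                scores[c] += np.log(counts[w] + beta) - np.log(total + V * beta)
                # ...then condition on it (the Polya-urn step).
                counts[w] += 1
                total += 1
        s = np.exp(scores - scores.max())
        return s / s.sum()

    print(class_posterior([1, 2, 2, 3]))  # leans toward class 1 under these toy counts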
I think mixtures of Gaussians are also often sampled with collapsed models, since there's little reason not to (in low dimensions, or when the covariance is known), though I don't have papers to cite for that.
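Concretely: with a known observation variance and a conjugate Normal prior on the component means, both the mixture weights and the means can be integrated out, so the sampler only walks over assignments. A hedged sketch on synthetic data (all hyperparameters illustrative):

    import numpy as np

    rng = np.random.default_rng(1)
    x = np.concatenate([rng.normal(-3, 1, 20), rng.normal(3, 1, 20)])
    N, K, alpha = len(x), 2, 1.0
    sigma2, tau2 = 1.0, 10.0  # known likelihood variance; prior variance of the means

    z = rng.integers(K, size=N)                        # initial assignments
    n = np.bincount(z, minlength=K).astype(float)      # cluster sizes
    s = np.array([x[z == k].sum() for k in range(K)])  # per-cluster sums

    def normal_logpdf(v, mean, var):
        return -0.5 * (np.log(2 * np.pi * var) + (v - mean) ** 2 / var)

    for _ in range(50):
        for i in range(N):
            k = z[i]
            n[k] -= 1; s[k] -= x[i]  # remove point i from its cluster
            # Posterior predictive of x[i] under each cluster, means collapsed:
            post_var = 1.0 / (1.0 / tau2 + n / sigma2)
            post_mean = post_var * s / sigma2
            logp = np.log(n + alpha / K) + normal_logpdf(x[i], post_mean, sigma2 + post_var)
            p = np.exp(logp - logp.max())
            k = rng.choice(K, p=p / p.sum())
            z[i] = k
            n[k] += 1; s[k] += x[i]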
This I actually know about, but it is more of a niche. Zoubin and the Cambridge ML group are strong advocates of collapsed samplers; I particularly recommend the papers by Finale et al. and by Knowles and Zoubin, which apply collapsing to combinations of the DP, the IBP, and the Pitman-Yor process (the shared predictive is sketched below). I think it is fair to say that even with collapsed samplers, nonparametric Bayesian methods are still slower than most of the alternatives. There was also a recent paper on Bayesian quadrature that offers some nice results.
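For reference, the collapsed conditional those samplers share is the Chinese-restaurant-style predictive obtained by integrating out the random measure; in the Pitman-Yor case, with concentration $\alpha$ and discount $d$ (set $d = 0$ to recover the DP):

$$
p(z_i = k \mid z_{-i}) \propto
\begin{cases}
n_k^{-i} - d, & \text{existing cluster } k,\\
\alpha + d\,K^{-i}, & \text{new cluster,}
\end{cases}
$$

where $n_k^{-i}$ is the size of cluster $k$ and $K^{-i}$ is the number of occupied clusters, both with point $i$ removed.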