Hi,

Is there any proven result concerning the universal approximation ability of a mixture of Gaussians? If so, could you provide a reference?
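
To make the question concrete, the setting I have in mind is the following (the notation below is only illustrative, not taken from any particular reference):

    % Finite Gaussian mixture with K components and mixing weights pi_k:
    p(x) \;=\; \sum_{k=1}^{K} \pi_k \, \mathcal{N}(x \mid \mu_k, \sigma_k^2),
    \qquad \pi_k \ge 0, \quad \sum_{k=1}^{K} \pi_k = 1.

The question is whether, as K grows, such sums can approximate an arbitrary target to any desired accuracy, and in what sense (which class of targets, which norm).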

Thanks.

asked Dec 08 '11 at 12:12


Lucian Sasu

edited Dec 08 '11 at 12:12


2 Answers:

I have been told, and have found on Wikipedia (with no further source), that centroid-based clustering (GMMs are centroid-based) is NP-hard. This applies when you do not have labels but do know the number of clusters.
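
Not part of the hardness claim above, but to illustrate the setting it describes (unlabeled data, known number of components), here is a minimal sketch assuming scikit-learn's GaussianMixture is available; in practice EM only converges to a local optimum, so several random restarts are used rather than solving the problem exactly:

    # Minimal sketch (assumes NumPy and scikit-learn are installed):
    # fit a 2-component GMM by EM to unlabeled 1-D data. EM finds a local
    # optimum only, so several random restarts (n_init) are used.
    import numpy as np
    from sklearn.mixture import GaussianMixture

    rng = np.random.default_rng(0)
    # Synthetic unlabeled data drawn from two Gaussians (labels discarded).
    X = np.concatenate([rng.normal(-2.0, 0.5, 500),
                        rng.normal(3.0, 1.0, 500)]).reshape(-1, 1)

    gmm = GaussianMixture(n_components=2, n_init=10, random_state=0).fit(X)
    print("weights:", gmm.weights_)
    print("means:  ", gmm.means_.ravel())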

answered Dec 09 '11 at 11:28


Travis Wolfe

Assuming I'm interpreting your question correctly, a Google search reveals that this question also appears on Math Overflow with a few answers: http://mathoverflow.net/questions/27589/mixtures-of-gaussian-distributions-dense-in-distributions

answered Dec 09 '11 at 11:38


Kevin Canini

edited Dec 09 '11 at 11:39

Thanks, I wasn't aware of that question. The two do seem similar, but the MO question is about approximating a probability distribution, whereas universal approximation is about approximating continuous real functions by neural networks, as in http://en.wikipedia.org/wiki/Universal_approximation_theorem.
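
To spell out the contrast (the notation below is mine, purely illustrative): the MO question concerns approximating a density by a convex combination of Gaussian densities, whereas the universal approximation question concerns approximating a continuous function by a sum of Gaussian units with unconstrained signed weights, roughly

    f(x) \;\approx\; \sum_{i=1}^{N} w_i \,
    \exp\!\left(-\frac{\lVert x - c_i \rVert^2}{2\sigma_i^2}\right),
    \qquad w_i \in \mathbb{R},

i.e. a Gaussian radial-basis-function expansion rather than a mixture density.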

(Dec 10 '11 at 00:18) Lucian Sasu