Say I'm doing kernel regression with $n$ datapoints $x_1, x_2, \dots, x_n$ in $\mathbb{R}^d$. I want to evaluate basis function expansions of the form $f(x) = \sum_i a_i k(x, x_i)$. What techniques can I use to approximately evaluate this expansion faster? For example, with a squared-exponential kernel $k(x, y) = \exp(-\|x - y\|^2 / (2\sigma^2))$, I could use a nearest-neighbor data structure and ignore the terms $k(x, x_i)$ for which $\|x - x_i\| \gg \sigma$, since those kernel values are negligibly small.

What other tricks are there for approximately evaluating the basis function expansion?
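For concreteness, here is a minimal sketch of the nearest-neighbor truncation trick I have in mind, using SciPy's `cKDTree`. The function name `truncated_kernel_sum` and the `cutoff` parameter (how many multiples of $\sigma$ to keep) are my own choices, not from any particular library:

```python
import numpy as np
from scipy.spatial import cKDTree

def truncated_kernel_sum(x, X, a, sigma, cutoff=5.0):
    """Approximate f(x) = sum_i a_i k(x, x_i) for the squared-exponential
    kernel by keeping only centers within cutoff * sigma of x.
    Excluded terms are bounded by exp(-cutoff**2 / 2), so they are
    negligible for moderate cutoff values."""
    tree = cKDTree(X)                       # O(n log n) build, reusable across queries
    idx = tree.query_ball_point(x, r=cutoff * sigma)
    diffs = X[idx] - x
    k = np.exp(-np.sum(diffs**2, axis=1) / (2 * sigma**2))
    return np.dot(np.asarray(a)[idx], k)
```

In practice you would build the tree once and reuse it for many query points $x$; each query then touches only the handful of centers within the cutoff radius instead of all $n$.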

asked Sep 06 '13 at 11:32


joschu

