Say I'm doing kernel regression with $n$ datapoints $x_1, x_2, \dots, x_n$ in $\mathbb{R}^d$. I want to evaluate basis function expansions of the form $f(x) = \sum_i a_i\, k(x, x_i)$. What techniques can I use to approximately evaluate this expansion faster? For example, if I use a squared exponential kernel $k(x, y) = \exp\!\left(-\|x - y\|^2 / 2\sigma^2\right)$, then I could use a nearest-neighbor data structure and ignore the terms $k(x, x_i)$ for which $\|x - x_i\| \gg \sigma$, since those kernel values are negligibly small. What other tricks are there for approximately evaluating the basis function expansion?
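To make the question concrete, here is a minimal sketch of the truncation idea described above, assuming SciPy is available; the function name and the cutoff of five bandwidths are my own illustrative choices, not a standard API:

```python
import numpy as np
from scipy.spatial import cKDTree

def truncated_kernel_sum(queries, X, a, sigma, radius_mult=5.0):
    """Approximate f(x) = sum_i a_i * exp(-||x - x_i||^2 / (2 sigma^2))
    by summing only over datapoints within radius_mult * sigma of x.
    Terms beyond that radius are at most exp(-radius_mult**2 / 2),
    so they are dropped. (Sketch: names and cutoff are illustrative.)"""
    tree = cKDTree(X)               # KD-tree over the n datapoints
    r = radius_mult * sigma         # truncation radius, a multiple of the bandwidth
    out = np.empty(len(queries))
    for j, x in enumerate(queries):
        idx = tree.query_ball_point(x, r)   # indices of nearby datapoints
        if idx:
            d2 = np.sum((X[idx] - x) ** 2, axis=1)
            out[j] = np.dot(a[idx], np.exp(-d2 / (2.0 * sigma**2)))
        else:
            out[j] = 0.0            # no datapoints within the radius
    return out
```

The cost per query drops from $O(n)$ to roughly the number of points within $5\sigma$ of $x$, at the price of an error bounded by the sum of the dropped (exponentially small) terms.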