I don't know of any, but I think in principle the algorithm is quite simple to compute. Which version do you want? There is one with a locality constraint, one with an additional tangent constraint, and one with a positivity constraint. The authors used map-reduce to speed up the algorithm, but otherwise it is L1-penalized gradient descent. Which paper has the positivity constraint? I don't remember reading about that. For the other two, I don't think there are any constraints other than an optional constraint that the coefficients must sum to 1. The optimization problem can be solved by alternating between two convex problems, one quadratic and the other quadratic + weighted L1. I'm thinking of doing the former with conjugate gradient descent (since with large dictionaries it would be too expensive to compute the Hessian and jump straight to the solution) and the latter with Honglak Lee's feature-sign search algorithm. I don't think gradient descent would be a very efficient way of solving either of these problems, and map-reduce doesn't do anything to change that. Is there a paper that says that's what the authors did? I didn't see anything about gradient descent or map-reduce in the paper that introduced LCC, the longer tech report, or the paper on local tangents. The local tangent features are just learned with locally weighted PCA after the anchor points are learned; they don't change the primary optimization problem at all.
(Apr 05 '11 at 09:28)
Ian Goodfellow
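
For concreteness, here is a minimal sketch of the alternating scheme discussed above, for the LCC objective sum_i ||x_i - V'g_i||^2 + mu * sum_{i,j} |g_ij| * ||v_j - x_i||^2. It is not feature-sign search or conjugate gradient: the weighted-L1 coefficient step is folded into a plain lasso by rescaling the dictionary columns, and the anchor step is solved in closed form, which works because the objective is an unconstrained quadratic in the anchors once the codes are fixed. The function names and the sklearn dependency are illustrative choices, not anything from the papers.

    import numpy as np
    from sklearn.linear_model import Lasso

    def lcc_codes(X, V, mu):
        """Coefficient step: for each x, minimize
        ||x - V'g||^2 + mu * sum_j |g_j| * ||v_j - x||^2.
        A per-coefficient L1 weight w_j folds into a plain lasso by
        rescaling dictionary column j by 1/w_j and unscaling after."""
        n, d = X.shape
        k = V.shape[0]
        G = np.zeros((n, k))
        for i in range(n):
            w = np.sum((V - X[i]) ** 2, axis=1) + 1e-8  # locality weights ||v_j - x||^2
            D = V.T / w                                  # rescaled dictionary, (d, k)
            # sklearn's Lasso minimizes (1/2m)||y - Ab||^2 + alpha*||b||_1 over m rows,
            # so alpha = mu / (2d) recovers ||x - Db||^2 + mu*||b||_1
            model = Lasso(alpha=mu / (2.0 * d), fit_intercept=False, max_iter=5000)
            model.fit(D, X[i])
            G[i] = model.coef_ / w                       # undo the rescaling
        return G

    def lcc_anchors(X, G, mu):
        """Anchor step: with codes fixed, the objective is quadratic in V,
        so stationarity gives (G'G + diag(a)) V = G'X + mu*|G|'X in closed
        form, with a_j = mu * sum_i |g_ij|."""
        k = G.shape[1]
        a = mu * np.abs(G).sum(axis=0)
        lhs = G.T @ G + np.diag(a) + 1e-8 * np.eye(k)   # small ridge for safety
        rhs = G.T @ X + mu * np.abs(G).T @ X
        return np.linalg.solve(lhs, rhs)

Alternating the two functions for a few rounds from, say, a k-means initialization of V is the obvious way to wire them together.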
The one with the positivity constraint is from the last NIPS, called Deep Coding Networks. What they are doing for optimization is alternating between two convex problems, as you said. This is probably the most reasonable approach. How they are doing that, I am not sure. Maybe I shouldn't have said gradient descent but "some good convex solver" ;) About map-reduce: I think this is from their presentation/paper on the ImageNet challenge. I also talked with one of the authors at NIPS, so maybe I got it from there. But the parallelism is pretty obvious, so I don't think there is any magic behind that.
(Apr 05 '11 at 09:37)
Andreas Mueller
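
If positivity is the only added constraint, the coefficient step arguably gets simpler rather than harder: with g >= 0 the weighted L1 penalty mu * sum_j |g_j| * ||v_j - x||^2 becomes a linear term, so the step is a nonnegative quadratic program. The projected-gradient sketch below is one guess at "some good convex solver"; it is not taken from the Deep Coding Networks paper, and the function name is made up.

    import numpy as np

    def nonneg_code(x, V, mu, n_iter=500):
        """Coefficient step under a positivity constraint:
        minimize ||x - V'g||^2 + mu * sum_j g_j * ||v_j - x||^2  s.t. g >= 0.
        Solved by projected gradient with fixed step 1/L."""
        k = V.shape[0]
        w = np.sum((V - x) ** 2, axis=1)            # locality weights
        g = np.zeros(k)
        # Lipschitz constant of the smooth part's gradient: 2 * lambda_max(V V')
        L = 2.0 * np.linalg.norm(V @ V.T, 2) + 1e-8
        for _ in range(n_iter):
            grad = 2.0 * V @ (V.T @ g - x) + mu * w
            g = np.maximum(0.0, g - grad / L)       # project onto g >= 0
        return g

Projected gradient is just the simplest thing that is correct here; a real implementation would likely prefer an accelerated or active-set method.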
I've released my own implementation of it now as part of pylearn2: https://github.com/lisa-lab/pylearn/blob/master/pylearn2/models/local_coordinate_coding.py