Predictive Sparse Decomposition was developed after sparse autoencoders, but to me it looks very similar to them. Does PSD have any advantages over sparse autoencoders? Is there any empirical evidence?

asked Nov 01 '13 at 12:11

Max

This is the corrected link to the paper: http://yann.lecun.com/exdb/publis/pdf/koray-psd-08.pdf

(May 29 '14 at 10:34) Christian Hudon

One Answer:

If I recall correctly, PSD actually does sparse coding and at the same time approximates it with a feed-forward encoding step. Sparse autoencoders only have the feed-forward encoding step. Especially for smaller datasets, "smarter" encoding procedures such as sparse coding tend to result in better features. Adam Coates has done some very interesting work on this topic, particularly his ICML 2011 paper on the importance of the encoding procedure.
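To make the distinction concrete, here is a minimal numpy sketch of the PSD idea as I understand it (all names — `D`, `W`, `lam`, `alpha`, the step counts — are my own illustrative choices, not code from the paper): codes are inferred by iterative sparse coding (ISTA), while a feed-forward encoder is trained to predict those codes in a single pass.

```python
import numpy as np

rng = np.random.default_rng(0)
n_in, n_code = 16, 32
D = rng.standard_normal((n_in, n_code)) * 0.1   # decoder (dictionary)
W = rng.standard_normal((n_code, n_in)) * 0.1   # feed-forward encoder weights

def soft_threshold(v, t):
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def encode_feedforward(x):
    # One-pass approximation: no iteration, no lateral interactions.
    return soft_threshold(W @ x, 0.1)

def encode_ista(x, lam=0.1, n_steps=50):
    # Iterative sparse coding inference (ISTA): refine the code step by step.
    L = np.linalg.norm(D, 2) ** 2          # Lipschitz constant of the gradient
    z = np.zeros(n_code)
    for _ in range(n_steps):
        grad = D.T @ (D @ z - x)           # gradient of the reconstruction term
        z = soft_threshold(z - grad / L, lam / L)
    return z

def psd_loss(x, z, lam=0.1, alpha=1.0):
    recon = np.sum((x - D @ z) ** 2)                               # reconstruction
    sparsity = lam * np.sum(np.abs(z))                             # L1 penalty
    prediction = alpha * np.sum((z - encode_feedforward(x)) ** 2)  # encoder fit
    return recon + sparsity + prediction

x = rng.standard_normal(n_in)
z = encode_ista(x)
print(psd_loss(x, z))
```

In a plain sparse autoencoder only `encode_feedforward` exists; PSD's prediction term is what ties the cheap one-pass encoder to the "smarter" iteratively inferred codes.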

answered Nov 01 '13 at 17:52

Sander Dieleman

"Sparse autoencoders only have the feed-forward encoding step" -- nope. Just like PSD (and unlike sparse coding), an autoencoder has a one-step encoder and a one-step decoder. The rather esoteric model without a decoder is called Sparse Filtering.

(Nov 03 '13 at 01:43) Max

PS: ICA also doesn't have a decoder

(Nov 03 '13 at 02:30) Max

Sorry, I didn't express myself clearly :) I meant that the encoding step is "just feed-forward", i.e. not iterative (so there are no lateral interactions). I didn't mean to say there is no decoder.

(Nov 04 '13 at 08:32) Sander Dieleman