I wanted to know if there is any work on training feed-forward artificial neural networks (single hidden layer) where a subset of the weights is adjusted (using gradient descent and its variants, or evolutionary methods) while the others are kept constant, iterating this way over groups. For example, a set of weights A is held fixed while the other set B is changed; in the next iteration A is changed and B is fixed, and the process proceeds until a stopping criterion is met.
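To make the procedure concrete, here is a minimal sketch of the alternating scheme described above on a single-hidden-layer network trained with plain gradient descent (NumPy; the toy task, network size, learning rate, and the particular split into groups A and B are illustrative assumptions, not taken from any specific paper):

```python
import numpy as np

# Toy data: 1-D regression, y = sin(x)
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(200, 1))
y = np.sin(X)

# Single hidden layer with tanh activation (sizes are illustrative)
n_hidden, lr = 20, 0.05
W1 = rng.normal(scale=0.5, size=(1, n_hidden)); b1 = np.zeros(n_hidden)
W2 = rng.normal(scale=0.5, size=(n_hidden, 1)); b2 = np.zeros(1)

def forward(X):
    H = np.tanh(X @ W1 + b1)           # hidden activations
    return H, H @ W2 + b2              # prediction

for epoch in range(2000):
    H, Y_hat = forward(X)
    err = Y_hat - y                    # gradient of 0.5*MSE w.r.t. Y_hat

    # Gradients for both weight groups
    gW2 = H.T @ err / len(X)
    gb2 = err.mean(axis=0)
    dH  = (err @ W2.T) * (1 - H ** 2)  # back-prop through tanh
    gW1 = X.T @ dH / len(X)
    gb1 = dH.mean(axis=0)

    # Alternate: even epochs update group A (input->hidden) only,
    # odd epochs update group B (hidden->output) only; the other group is frozen.
    if epoch % 2 == 0:
        W1 -= lr * gW1; b1 -= lr * gb1
    else:
        W2 -= lr * gW2; b2 -= lr * gb2

_, Y_hat = forward(X)
print("final MSE:", float(np.mean((Y_hat - y) ** 2)))
```

Updating one group of parameters while the others are held fixed, and cycling through the groups, is what the optimization literature calls block coordinate descent.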
There is dropout: on each training case it randomly omits a subset of the hidden units (and hence the weights attached to them). See "Improving neural networks by preventing co-adaptation of feature detectors" (Hinton et al., 2012). I don't know if it is exactly what you're looking for, but have a look.

Looks like it is. I need to get into it first to comment on it. – phoxis (Aug 15 '13 at 08:59)
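For illustration, a minimal sketch of what dropout does to the hidden layer during training (NumPy; the drop probability and the "inverted dropout" rescaling are common conventions, not details from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)
p_drop = 0.5                                  # probability of omitting a hidden unit

def hidden_forward(X, W1, b1, train=True):
    """Hidden layer of a single-hidden-layer net, with dropout at train time."""
    H = np.tanh(X @ W1 + b1)
    if train:
        # Drop each hidden unit independently per example with prob p_drop,
        # and rescale the survivors ("inverted dropout") so the expected
        # activation matches the plain forward pass used at test time.
        mask = rng.random(H.shape) >= p_drop
        H = H * mask / (1.0 - p_drop)
    return H

# Example: 5 inputs of dimension 4, 3 hidden units (shapes are arbitrary)
X  = rng.normal(size=(5, 4))
W1 = rng.normal(size=(4, 3))
b1 = np.zeros(3)
print(hidden_forward(X, W1, b1, train=True))
```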
You could also take a look at DropConnect: http://cs.nyu.edu/~wanli/dropc/ – alfa (Aug 21 '13 at 08:08)
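For contrast with dropout, DropConnect masks individual weights rather than whole hidden units, which is closer to fixing/zeroing subsets of weights. A minimal sketch under the same assumptions as above (the per-mini-batch mask and the rescaling are simplifications; the paper draws masks per example and uses a different approximation at test time):

```python
import numpy as np

rng = np.random.default_rng(0)
p_drop = 0.5                                  # probability of dropping each weight

def hidden_forward_dropconnect(X, W1, b1, train=True):
    """Hidden layer with DropConnect: mask individual weights, not whole units."""
    if train:
        # One mask per mini-batch here for simplicity.
        mask = rng.random(W1.shape) >= p_drop
        W1 = W1 * mask / (1.0 - p_drop)
    return np.tanh(X @ W1 + b1)

X  = rng.normal(size=(5, 4))
W1 = rng.normal(size=(4, 3))
b1 = np.zeros(3)
print(hidden_forward_dropconnect(X, W1, b1, train=True))
```

Note that both dropout and DropConnect zero out randomly chosen parts of the network per training case, whereas the question asks about deterministically freezing a fixed group of weights per iteration; the two ideas are related but not the same.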
Sounds a bit like coordinate descent.
@alto: thanks, good suggestion.