I wanted to know if there is any work on training feed-forward artificial neural networks (with a single hidden layer) in which only a subset of the weights is adjusted at a time (using gradient descent and its relatives, or evolutionary methods) while the remaining weights are held constant, iterating over the groups in this way.

For example, a set of weights A is held fixed while another set B is updated; in the next iteration B is fixed and A is updated, and the process alternates like this until a stopping criterion is met.
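To make the idea concrete, here is a minimal numpy sketch of the alternating scheme I have in mind (the toy data, names, and hyperparameters are purely illustrative): the input-to-hidden weights W1 and hidden-to-output weights W2 are treated as the two groups, and only one group is updated per iteration while the other is frozen.

    import numpy as np

    rng = np.random.default_rng(0)

    # Toy data, purely illustrative
    X = rng.normal(size=(100, 4))
    y = (X.sum(axis=1, keepdims=True) > 0).astype(float)

    H = 8                                        # hidden units
    W1 = rng.normal(scale=0.1, size=(4, H))      # group A: input -> hidden
    W2 = rng.normal(scale=0.1, size=(H, 1))      # group B: hidden -> output
    lr = 0.1

    def forward(X, W1, W2):
        h = np.tanh(X @ W1)                      # hidden activations
        out = 1.0 / (1.0 + np.exp(-(h @ W2)))    # sigmoid output
        return h, out

    for it in range(200):
        h, out = forward(X, W1, W2)
        # Gradients of mean squared error through the sigmoid and tanh layers
        grad_z2 = (out - y) * out * (1.0 - out) / len(X)
        grad_W2 = h.T @ grad_z2
        grad_h = grad_z2 @ W2.T * (1.0 - h ** 2)
        grad_W1 = X.T @ grad_h

        # Alternate: even iterations update only group B, odd iterations only group A
        if it % 2 == 0:
            W2 -= lr * grad_W2
        else:
            W1 -= lr * grad_W1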

asked Aug 11 '13 at 08:41

phoxis

Sounds a bit like coordinate descent.

(Aug 11 '13 at 12:16) alto

@alto: thanks, good suggestion.

(Aug 19 '13 at 09:17) phoxis

2 Answers:

There is dropout, which randomly omits a subset of the hidden units (and hence their associated weights) for each training case; see "Improving neural networks by preventing co-adaptation of feature detectors". I don't know if it is exactly what you're looking for, but have a look.
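Roughly, the idea is the following (this is just a sketch of the general technique, not the authors' code; the keep probability and names are illustrative): during training each hidden unit's output is zeroed with some probability for every training case, and at test time the full network is used with activations scaled down to compensate.

    import numpy as np

    rng = np.random.default_rng(0)
    p_keep = 0.5                                  # illustrative keep probability

    def hidden_forward_train(x, W1):
        h = np.tanh(x @ W1)
        mask = rng.random(h.shape) < p_keep       # fresh mask per training case/batch
        return h * mask                           # dropped units contribute nothing

    def hidden_forward_test(x, W1):
        return np.tanh(x @ W1) * p_keep           # scale to match expected training output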

answered Aug 11 '13 at 13:18

edersantana

Looks like it. I'll need to read into it before I can comment further.

(Aug 15 '13 at 08:59) phoxis

You could also take a look at DropConnect: http://cs.nyu.edu/~wanli/dropc/

(Aug 21 '13 at 08:08) alfa

Dropout, from Hinton's group, disables the output of a neuron, but you may want to look into DropConnect (the paper linked above), which drops individual connections instead. CUDA code examples are also available there.
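To contrast with dropout, here is a rough sketch of the DropConnect idea (not the reference implementation from the linked page; the keep probability and names are illustrative): instead of zeroing unit outputs, a random mask is applied to the weight matrix itself, so individual connections are dropped per training case.

    import numpy as np

    rng = np.random.default_rng(0)
    p_keep = 0.5                                  # illustrative keep probability

    def dropconnect_forward_train(x, W1):
        mask = rng.random(W1.shape) < p_keep      # drop individual connections (weights)
        return np.tanh(x @ (W1 * mask))

    def dropconnect_forward_test(x, W1):
        return np.tanh(x @ (W1 * p_keep))         # crude expectation-based approximation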

answered Aug 21 '13 at 09:41

Arun Kumar

This is more appropriate for what I need.

(Aug 23 '13 at 10:26) phoxis