I need a dataset on which steepest-descent back-propagation gets stuck in a local minimum. I am working on an algorithm that attempts to escape such local minima, and I am looking for a concrete case, i.e. an initial weight configuration for which regular steepest-descent back-propagation gets stuck, so that from the same starting point my algorithm can escape and descend to an adjacent, better local minimum (with weights within some bounds such as [-5, +5]). The algorithm is not fully ready, so I cannot run a proper benchmark yet and have to do a lot of things manually; that is why I first need to know whether the algorithm will actually escape the local minimum.

I have tried the breast cancer, machine CPU and 4-bit parity datasets with different weight-initialization ranges (such as [0, 1], [-3, +3], [-5, +5], [-10, +10]), using a feed-forward artificial neural network with a single hidden layer and various numbers of hidden units. All activation functions are sigmoids with outputs in [0, 1]. I expected to find some local minima, since they should exist for these datasets, but by inspecting the outputs I never found one: at most I hit a plateau that eventually converged, after a huge number of iterations, to a good error value. Can anyone help me out?
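
For concreteness, here is a minimal sketch of the kind of training run I am doing, written in plain NumPy (the variable names, learning rate and epoch counts are only illustrative, not my actual settings): one hidden layer, sigmoids everywhere, full-batch steepest descent on 4-bit parity, with the error logged so that a plateau or a stuck run is visible.

```python
import numpy as np

rng = np.random.default_rng(0)

# 4-bit parity: 16 input patterns, target is 1 if the number of ones is odd.
X = np.array([[int(b) for b in format(i, "04b")] for i in range(16)], dtype=float)
y = (X.sum(axis=1) % 2).reshape(-1, 1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

n_hidden = 4
init_range = 5.0                      # weights drawn from [-5, +5], as described above
W1 = rng.uniform(-init_range, init_range, (4, n_hidden))
b1 = rng.uniform(-init_range, init_range, n_hidden)
W2 = rng.uniform(-init_range, init_range, (n_hidden, 1))
b2 = rng.uniform(-init_range, init_range, 1)

lr = 0.5
for epoch in range(200_000):
    # forward pass
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)
    err = 0.5 * np.mean((out - y) ** 2)

    # backward pass: plain steepest descent, no momentum or adaptive steps
    d_out = (out - y) * out * (1 - out) / len(X)
    d_h = (d_out @ W2.T) * h * (1 - h)
    W2 -= lr * h.T @ d_out
    b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * X.T @ d_h
    b1 -= lr * d_h.sum(axis=0)

    if epoch % 10_000 == 0:
        print(f"epoch {epoch:7d}  MSE {err:.6f}")

# If the printed MSE flatlines at a value well above the best achievable error,
# the run is stuck (or on a very long plateau) -- the behaviour I am hunting for.
```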