I'm trying to fit a model whose fitness function turns out to have loads of local minima, and simple gradient methods do not seem to come up with any good solutions (I know from simpler models fit to the same problem that good solutions must exist). After doing some reading, it seems my only options are various stochastic methods such as evolutionary algorithms or Monte Carlo optimization. My question is: what are the state-of-the-art optimization algorithms known to handle many local optima well? Ideally, I would like to combine the stochastic search with gradient information, so that the first- and second-order derivatives of the model are exploited.
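One standard hybrid of the kind asked about here is basin hopping, which alternates random perturbations with gradient-based local minimization, so first-order derivative information is exploited on every hop. A minimal sketch using SciPy's implementation; the Rastrigin-style test function and its gradient below are hypothetical stand-ins for the actual model's fitness function:

```python
import numpy as np
from scipy.optimize import basinhopping

# Rastrigin-style test function: a hypothetical stand-in for the
# real fitness function, chosen because it has many local minima.
def fitness(p):
    return 10 * p.size + np.sum(p ** 2 - 10 * np.cos(2 * np.pi * p))

def fitness_grad(p):
    return 2 * p + 20 * np.pi * np.sin(2 * np.pi * p)

x0 = np.random.uniform(-5, 5, size=10)
result = basinhopping(
    fitness, x0,
    niter=200,                 # number of random hops
    stepsize=1.0,              # scale of each random perturbation
    minimizer_kwargs={"method": "L-BFGS-B", "jac": fitness_grad},
)
print(result.x, result.fun)   # true global minimum is at the origin, fun = 0
```

Since second-order derivatives are available too, the local step could instead use method="Newton-CG" with a hess= callable in minimizer_kwargs.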
If you are looking for non-gradient-based methods for 1-D functions, try Brent's optimization method. The method repeatedly fits a parabola through three points on the curve, evaluates the function at the parabola's minimum, picks the next set of three points, and so on. The advantage of this method is that it does not require derivatives of the function being optimized (no gradients to compute) and can be implemented in a few lines. Has something similar to Brent's method been applied to multivariate functions?
(Jul 01 '10 at 18:19)
Janto
Not as far as I know.
(Jul 06 '10 at 11:13)
Delip Rao
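For reference, the 1-D Brent's method described in the exchange above is available off the shelf. A minimal sketch using SciPy's minimize_scalar; the quadratic-plus-linear objective is just a hypothetical placeholder:

```python
from scipy.optimize import minimize_scalar

# Hypothetical 1-D objective; Brent's method needs no derivatives.
def f(x):
    return (x - 2.0) ** 2 + 0.5 * x

res = minimize_scalar(f, method="brent")
print(res.x, res.fun)  # minimum at x = 1.75
```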
Could you tell us more about the nature of the data?
The data consists of 1800 pairs of images (30x30 pixels) and the corresponding responses of a neuron to those images. I then have a model of the neuron (inspired by the known anatomy of the LGN and visual cortex), whose response is a sum of the dot products of a small number (<20) of difference-of-Gaussians-like linear filters with the input image, passed through a sigmoid non-linearity.
The free parameters of the model are the weights and positions (centers) of the difference-of-Gaussians filters within the 30x30-pixel input space, and the parameters of the sigmoid.
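A minimal sketch of how such a model might be coded, to make the parameterization concrete. The exact form (per-filter center/surround widths, a single output sigmoid applied to the weighted sum) is an assumption; the description above leaves these details open:

```python
import numpy as np

def dog_filter(cx, cy, sigma_c, sigma_s, size=30):
    """Difference-of-Gaussians filter centered at (cx, cy).

    sigma_c / sigma_s are the center / surround widths (assumed to be
    per-filter parameters; not specified in the original post).
    """
    y, x = np.mgrid[0:size, 0:size]
    d2 = (x - cx) ** 2 + (y - cy) ** 2
    center = np.exp(-d2 / (2 * sigma_c ** 2)) / (2 * np.pi * sigma_c ** 2)
    surround = np.exp(-d2 / (2 * sigma_s ** 2)) / (2 * np.pi * sigma_s ** 2)
    return center - surround

def model_response(image, filters, weights, a, b):
    """Weighted sum of filter-image dot products, through a sigmoid.

    `a` and `b` are the sigmoid's slope and offset (the 'parameters
    of the sigmoid' mentioned above).
    """
    drive = sum(w * np.dot(f.ravel(), image.ravel())
                for w, f in zip(weights, filters))
    return 1.0 / (1.0 + np.exp(-(a * drive + b)))

# Example usage with one filter and a random image:
filters = [dog_filter(15, 15, 1.5, 3.0)]
r = model_response(np.random.rand(30, 30), filters, [1.0], a=1.0, b=0.0)
```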