I know the basics of gradient methods for optimizing a function iteratively, but in this case I have an equality constraint $\sum_{i=1}^N x_i = 1$ with the objective $\max x^T Q x$. How should I apply a descent method that projects my solution onto this constraint set? The first idea that comes to mind is to apply plain gradient descent but normalize the new $x^*$ at each iteration. I guess this will not work.

asked Mar 02 '14 at 17:06

Eren Golge


One Answer:

Normalizing by dividing by the sum is not equivalent to projecting onto the simplex. The actual projection is slightly more involved: you sort the entries of the vector in decreasing order, find the threshold which, when subtracted from every entry (clipping negatives at zero), makes the result sum to one, and then subtract it.
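As a concrete illustration, here is a minimal NumPy sketch of that sort-and-threshold projection (the scheme described by Duchi et al., 2008), together with a projected gradient ascent loop for the question's objective. The function names, step size, and iteration count are illustrative assumptions, not part of the original answer:

```python
import numpy as np

def project_to_simplex(v):
    """Euclidean projection of v onto {x : x >= 0, sum(x) = 1}.

    Sort the entries in decreasing order, find the threshold tau
    whose subtraction (with clipping at zero) makes the result
    sum to one, and subtract it.
    """
    n = len(v)
    u = np.sort(v)[::-1]                             # entries in decreasing order
    css = np.cumsum(u)
    # largest index rho with u[rho] + (1 - css[rho]) / (rho + 1) > 0
    rho = np.nonzero(u + (1.0 - css) / np.arange(1, n + 1) > 0)[0][-1]
    tau = (css[rho] - 1.0) / (rho + 1.0)             # threshold to subtract
    return np.maximum(v - tau, 0.0)

def projected_gradient_ascent(Q, steps=1000, lr=1e-2):
    """Locally maximize x^T Q x over the simplex: take a gradient
    step, then project back onto the feasible set."""
    n = Q.shape[0]
    x = np.full(n, 1.0 / n)                          # feasible starting point
    for _ in range(steps):
        grad = (Q + Q.T) @ x                         # gradient of x^T Q x
        x = project_to_simplex(x + lr * grad)
    return x
```

Note that maximizing a quadratic form over the simplex is non-convex in general, so this (like any gradient scheme) only converges to a stationary point or local optimum.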

However, it might be easier for you to work with a different parametrization: instead of storing x such that sum(x) = 1, store theta such that x(i) = exp(theta(i)) / sum(exp(theta)). This might or might not be convenient, depending on how easy it is to compute the gradient in the new parametrization.
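A sketch of that reparametrization, again in NumPy; the gradient follows from the chain rule through the softmax Jacobian diag(x) - x x^T (the helper names are my own, for illustration only):

```python
import numpy as np

def softmax(theta):
    """x(i) = exp(theta(i)) / sum(exp(theta)); the shift by
    max(theta) is only for numerical stability."""
    z = np.exp(theta - np.max(theta))
    return z / z.sum()

def grad_wrt_theta(theta, Q):
    """Gradient of f(theta) = x(theta)^T Q x(theta) w.r.t. theta."""
    x = softmax(theta)
    g = (Q + Q.T) @ x                  # df/dx
    return x * (g - x @ g)             # chain rule: (diag(x) - x x^T) g

# Plain, unconstrained gradient ascent on theta:
# theta = np.zeros(n)
# for _ in range(1000):
#     theta += 1e-2 * grad_wrt_theta(theta, Q)
# x_star = softmax(theta)
```

With this parametrization every theta maps to a feasible x, so no projection step is needed.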

answered Mar 03 '14 at 09:41

Alexandre Passos ♦

