|
I know the basics of gradient approaches for optimizing a function iteratively, but in this case I have an equality constraint $\sum_{i=1}^N x_i = 1$ with the objective $\max\, x^T Q x$. How should I apply a descent method while keeping my solution on this constraint set? The first idea that comes to mind is to apply plain gradient descent and then normalize the new $x^*$ at each iteration, but I suspect this will not work.
|
Normalizing by dividing by the sum is not equivalent to projecting onto the simplex. The actual projection algorithm is slightly more involved: you sort the entries of the vector in decreasing order, find the threshold which, when subtracted from the entries (clipping anything negative to zero), makes them sum to one, and then apply that shift. Alternatively, it might be easier to work with another parametrization: instead of storing $x$ such that $\sum_i x_i = 1$, store $\theta$ and set $x_i = \exp(\theta_i) / \sum_j \exp(\theta_j)$, so that the constraint holds by construction and you can optimize $\theta$ without any projection. Whether this is convenient depends on how easy it is to compute the gradient with respect to $\theta$.
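To make both ideas concrete, here is a minimal NumPy sketch (not from the original post). `project_to_simplex` implements the sort-and-threshold procedure described above; note that it projects onto the full probability simplex, so it also enforces $x_i \ge 0$. The loop below it runs projected gradient ascent on $x^T Q x$. The matrix `Q`, step size `eta`, and iteration counts are made-up placeholders.

```python
import numpy as np

def project_to_simplex(v):
    """Euclidean projection of v onto {x : x_i >= 0, sum(x) = 1} via sort-and-threshold."""
    n = len(v)
    u = np.sort(v)[::-1]                        # sort entries in decreasing order
    css = np.cumsum(u)
    # largest index rho with u[rho] - (css[rho] - 1) / (rho + 1) > 0
    rho = np.nonzero(u - (css - 1) / np.arange(1, n + 1) > 0)[0][-1]
    tau = (css[rho] - 1) / (rho + 1)            # the shift to subtract from every entry
    return np.maximum(v - tau, 0.0)

# Projected gradient ascent on x^T Q x (Q, eta, iteration count are illustrative only)
Q = np.array([[2.0, 0.5],
              [0.5, 1.0]])
eta = 0.1
x = np.full(2, 0.5)                             # feasible starting point
for _ in range(200):
    grad = (Q + Q.T) @ x                        # gradient of x^T Q x with respect to x
    x = project_to_simplex(x + eta * grad)      # ascent step, then project back onto the simplex
```

And a sketch of the softmax reparametrization under the same assumptions: the gradient with respect to $\theta$ follows from the chain rule through the softmax Jacobian $\operatorname{diag}(x) - x x^T$.

```python
def softmax(theta):
    z = np.exp(theta - theta.max())             # subtract the max for numerical stability
    return z / z.sum()

theta = np.zeros(2)
for _ in range(500):
    x = softmax(theta)
    g = (Q + Q.T) @ x                           # dF/dx for F(x) = x^T Q x
    grad_theta = x * (g - x @ g)                # chain rule through the softmax Jacobian
    theta += eta * grad_theta
x = softmax(theta)                              # satisfies sum(x) = 1 and x > 0 by construction
```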