I am currently working on some optimization problems whose objective functions contain absolute value terms. For example, I want to minimize a function like this:

min f(x) = |x1-x2| + |x1-a| + x1^2 + x2^2 + exp(x1)

where x1 and x2 are variables and a is a constant.

Is it possible to use a derivative-based method such as Newton's method to solve it? I know that the function is not differentiable when x1 = x2 or x1 = a, but can I just pick any subgradient at those points? For example, a subgradient of |x1-x2| with respect to x1 is sign(x1-x2), and a subgradient of |x1-a| is sign(x1-a); at the kinks, any value in [-1, 1] is a valid subgradient.
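To make my idea concrete, here is a minimal sketch of plain subgradient descent on this objective. The constant a = 1.0, the starting point, the step-size rule, and the iteration count are illustrative choices of mine, not part of the problem:

    import math

    def sign(t):
        # the subdifferential of |t| is {sign(t)} for t != 0 and the
        # interval [-1, 1] at t == 0; picking 0 at the kink is valid
        return (t > 0) - (t < 0)

    def f(x1, x2, a):
        return abs(x1 - x2) + abs(x1 - a) + x1**2 + x2**2 + math.exp(x1)

    def subgrad(x1, x2, a):
        # one subgradient of f: sign terms from the absolute values,
        # ordinary partial derivatives from the smooth terms
        g1 = sign(x1 - x2) + sign(x1 - a) + 2 * x1 + math.exp(x1)
        g2 = -sign(x1 - x2) + 2 * x2
        return g1, g2

    a = 1.0                      # illustrative constant
    x1, x2 = 0.0, 0.0            # illustrative starting point
    best_val, best_pt = f(x1, x2, a), (x1, x2)
    for k in range(1, 10001):
        g1, g2 = subgrad(x1, x2, a)
        step = 0.1 / math.sqrt(k)    # diminishing step size
        x1, x2 = x1 - step * g1, x2 - step * g2
        val = f(x1, x2, a)
        if val < best_val:           # a subgradient step need not decrease f,
            best_val, best_pt = val, (x1, x2)  # so track the best iterate
    print(best_pt, best_val)

My understanding is that tracking the best iterate is necessary because an individual subgradient step is not guaranteed to decrease f.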

I have heard many people say that a problem becomes difficult when the objective function contains absolute values. The common reason given is that such a function is not differentiable. But I think it is still solvable by using subgradients for the terms |x1-x2| and |x1-a|. Am I overlooking something? If the problem can be solved with subgradients, why do people say it is hard?

Thank you very much.

asked Jul 08 '14 at 02:09

whitedog
