Does it make sense to compare support vector regression (which uses an epsilon-insensitive loss) with the lasso (which uses a squared, ell_2 loss)? And why is it OK to compare two classification algorithms that optimize different loss functions, e.g., logistic regression vs. SVM, or LS-SVM vs. SVM? Is it because the 0/1 loss is used to compute accuracy in both cases?
|
The term "loss function" is overloaded here, and it helps to separate its two meanings.

First, there is the actual loss: the real-world quantity you care about in the end, in the decision-theoretic sense. In classification this is usually assumed to be accuracy, but sometimes it is something more complex, such as F1, AUC, or some ranking criterion.

Second, there is the surrogate loss function (also called the objective function): whatever quantity your algorithm minimizes to choose its parameters. In classification this is usually something like the squared loss, hinge loss, or logistic loss. In Bayesian decision theory the first would be called the loss function and the second the likelihood function, since it encodes some (not necessarily realistic) assumption about how your data is generated.

For regression, then, you can train your predictor however you want, but you should evaluate it with something that actually resembles what you care about. Once you do, it is fine to compare different "loss functions": they become just different estimators of a parameter vector that answers the kind of question you are asking. Coming up with these "true losses" is, however, hard and problem-specific, and might even convince you not to use regression at all.

Thanks for making that clear! In the last paragraph, are you saying that regression is not a "natural" question arising from our problems, and that we have other tools that give better answers? This discussion reminds me of the post by John Langford: http://hunch.net/?p=211
(Dec 10 '11 at 15:15)
Pardis
No, I meant that regression is often natural, but sometimes a slight variant might be more appropriate.
(Dec 13 '11 at 03:11)
Alexandre Passos ♦
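As a concrete illustration of the surrogate-vs-true-loss distinction in the answer above, here is a small sketch (toy numbers, not from any real dataset): several classification surrogates computed on the same margins, then evaluated with the 0/1 loss that accuracy actually reports, plus the regression analogue where epsilon-insensitive (SVR-style) and squared (lasso-style) training losses are both scored with a common evaluation metric such as mean absolute error.

```python
import numpy as np

# --- Classification: different surrogates, one "true" loss ---
y = np.array([+1, +1, -1, -1, +1])          # true labels (toy example)
f = np.array([2.0, 0.3, -1.5, 0.4, -0.2])   # hypothetical classifier scores
m = y * f                                    # margins y * f(x)

hinge    = np.maximum(0.0, 1.0 - m)          # SVM surrogate
logistic = np.log1p(np.exp(-m))              # logistic-regression surrogate
squared  = (1.0 - m) ** 2                    # LS-SVM surrogate
zero_one = (m <= 0).astype(float)            # the loss accuracy is based on

accuracy = 1.0 - zero_one.mean()             # what we actually report

# --- Regression: epsilon-insensitive vs squared training losses ---
y_true = np.array([1.0, 2.0, 3.0])           # toy targets
y_pred = np.array([1.1, 1.7, 3.4])           # hypothetical predictions
eps = 0.2                                    # SVR tube width (assumed)
resid = y_true - y_pred

eps_insensitive = np.maximum(0.0, np.abs(resid) - eps)  # SVR surrogate
squared_err     = resid ** 2                            # lasso's data term

# A common "true" evaluation loss makes the two predictors comparable:
mae = np.abs(resid).mean()
```

The surrogates disagree about how badly each point is scored, but as long as both models are evaluated with the same final metric (accuracy or MAE here), the comparison between them is meaningful.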