I was reading a paper where they used Platt's calibration for a somewhat straightforward classification task.

My question is this:

Why is a calibration step needed if you are not doing any boosting at all? Since the hypothesis space is only over one classifier, the problem with additive logistic regression classifiers seems non-existent, which would appear to remove the need for calibration.

Perhaps I'm confused; a bit of insight on these topics would be great.

Thanks a lot

asked Aug 23 '12 at 04:48

Leon Palafox ♦


One Answer:

Empirically, other classification models can benefit from post-learning calibration:

Niculescu-Mizil & Caruana (2005). Predicting good probabilities with supervised learning. In ICML'05.

Probably the paper authors read that paper, or heard a high-level summary of it, and decided post-calibration couldn't hurt. If you read it, though, you'll see there are learning algorithms that don't normally need post-calibration, and others for which isotonic regression works better than Platt scaling.
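To make the idea concrete, here is a minimal sketch of what such a post-learning calibration step typically looks like, comparing Platt scaling ("sigmoid") with isotonic regression on a single classifier. The choice of LinearSVC and the synthetic dataset are my own assumptions for illustration, not the setup used in the paper you read.

```python
# Sketch: post-learning calibration of one classifier with scikit-learn.
# LinearSVC and the synthetic data are illustrative assumptions only.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.svm import LinearSVC
from sklearn.calibration import CalibratedClassifierCV
from sklearn.metrics import brier_score_loss

X, y = make_classification(n_samples=2000, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Platt scaling ("sigmoid") fits a logistic curve to the classifier's scores;
# isotonic regression fits a non-decreasing step function instead.
for method in ("sigmoid", "isotonic"):
    clf = CalibratedClassifierCV(LinearSVC(), method=method, cv=5)
    clf.fit(X_train, y_train)
    prob = clf.predict_proba(X_test)[:, 1]
    print(method, "Brier score:", brier_score_loss(y_test, prob))
```

A lower Brier score indicates better-calibrated probabilities; on a given dataset either method may win, which is part of the point of the Niculescu-Mizil & Caruana comparison.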

answered Aug 26 '12 at 12:58

Art Munson
