|
I was reading a paper where the authors used Platt's calibration for a fairly straightforward classification task. My question is this: why is calibration needed if you are not doing any boosting at all? Since the hypothesis space consists of only one classifier, the probability-distortion problem of additive logistic regression classifiers seems non-existent, thus obviating the need for a calibration step. Perhaps I'm confused; a bit of insight on these topics would be great. Thanks a lot.
|
Empirically, other classification models can also benefit from post-learning calibration. Probably the paper's authors read the above paper, or heard a high-level summary of it, and decided post-calibration couldn't hurt. But if you read the above paper, you'll see that some learning algorithms don't normally need post-calibration, and that for others isotonic regression works better than Platt scaling.
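
In case it helps to see what Platt scaling actually does, here is a minimal sketch using scikit-learn and a synthetic dataset (the dataset, the SVM base classifier, and the train/calibration split are my assumptions for illustration, not the paper's setup): fit the base classifier, then fit a sigmoid (a one-dimensional logistic regression) mapping its raw scores to probabilities on held-out data.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.svm import LinearSVC
from sklearn.linear_model import LogisticRegression

# Hypothetical data: any binary classification task works here.
X, y = make_classification(n_samples=2000, random_state=0)
X_train, X_cal, y_train, y_cal = train_test_split(
    X, y, test_size=0.5, random_state=0
)

# 1. Fit the base classifier. SVMs are a classic example of a model
#    whose raw decision scores are not calibrated probabilities.
svm = LinearSVC().fit(X_train, y_train)

# 2. Platt scaling: fit a sigmoid on held-out data, mapping the
#    classifier's raw scores to probabilities.
cal_scores = svm.decision_function(X_cal).reshape(-1, 1)
platt = LogisticRegression().fit(cal_scores, y_cal)

# Calibrated probability estimates for new points:
new_scores = svm.decision_function(X_cal).reshape(-1, 1)
probs = platt.predict_proba(new_scores)[:, 1]
```

scikit-learn also packages this as `CalibratedClassifierCV(method="sigmoid")`; passing `method="isotonic"` instead gives the isotonic-regression alternative mentioned above.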