What's the difference between these two boosting algorithms? When would I use one instead of the other?

asked Jan 23 '12 at 16:55

grautur


In my experience, Gradient Boosting Machine (the R package gbm) gives better prediction accuracy and is more flexible (e.g., it supports multiple loss functions) than either of the aforementioned algorithms (judging by their R implementations, at least).

(Feb 16 '12 at 16:23) Yevgeny

2 Answers:

AdaBoost minimizes the exponential loss; LogitBoost minimizes the logistic loss. Because the exponential loss grows exponentially in the negative margin while the logistic loss grows only roughly linearly, LogitBoost places less emphasis on examples that are very badly classified.

Which works better is an empirical question, but the intuition is that LogitBoost is more appropriate when there is noise in the labels. Without knowing anything about the data, my prior is that it will have label noise, so you should minimize the logistic loss.
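The difference between the two losses is easy to see numerically. A minimal sketch (NumPy assumed available; the margin values are illustrative) evaluating both losses as a function of the margin m = y·f(x), where a negative margin means the example is misclassified:

```python
import numpy as np

# Margin m = y * f(x): positive means correctly classified, negative means misclassified.
margins = np.array([2.0, 0.5, -1.0, -4.0])

exp_loss = np.exp(-margins)            # AdaBoost's exponential loss
log_loss = np.log1p(np.exp(-margins))  # LogitBoost's logistic loss, log(1 + e^{-m})

# For the badly misclassified point (margin -4), the exponential loss is
# exp(4) ~ 54.6, while the logistic loss is only log(1 + exp(4)) ~ 4.02.
# So a single outlier can dominate AdaBoost's objective far more than LogitBoost's.
print(exp_loss)
print(log_loss)
```

This is why label noise (mislabeled points that end up with large negative margins) tends to hurt AdaBoost more.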

answered Jan 26 '12 at 04:14

Joseph Turian ♦♦

Please look at this and this. You should probably use cross-validation to determine which algorithm is best for your application.
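As a hedged sketch of what that comparison could look like, here is a scikit-learn version (an assumption on my part; the thread discusses R implementations). GradientBoostingClassifier uses the logistic loss by default, so it is close in spirit to LogitBoost, while AdaBoostClassifier corresponds to the exponential loss. The dataset and settings are purely illustrative:

```python
# Compare two boosting variants by cross-validated accuracy.
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier, GradientBoostingClassifier
from sklearn.model_selection import cross_val_score

# Synthetic binary classification problem (stand-in for your own data).
X, y = make_classification(n_samples=500, random_state=0)

ada = AdaBoostClassifier(n_estimators=100, random_state=0)       # exponential loss
gbt = GradientBoostingClassifier(n_estimators=100, random_state=0)  # logistic loss

ada_acc = cross_val_score(ada, X, y, cv=5).mean()
gbt_acc = cross_val_score(gbt, X, y, cv=5).mean()
print(f"AdaBoost CV accuracy: {ada_acc:.3f}")
print(f"GradientBoosting CV accuracy: {gbt_acc:.3f}")
```

Whichever scores better under cross-validation on your data is the one to prefer; the answer will vary by dataset.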

answered Jan 24 '12 at 09:00

downer


Those kinds of answers are pretty useless. Anybody could answer that.

(Jan 24 '12 at 17:47) Melipone Moody
