In AdaBoost, the coefficient alpha_t and the weak classifier h_t are updated at each iteration. Are there other strategies for updating alpha_t and the example distribution in each boosting round?
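(For reference, the updates in question are alpha_t = (1/2) ln((1 - err_t)/err_t) for the coefficient and D_{t+1}(i) proportional to D_t(i) exp(-alpha_t y_i h_t(x_i)) for the distribution. A minimal NumPy sketch of one round; the function name adaboost_round is just illustrative:)

    import numpy as np

    def adaboost_round(y, h_pred, w):
        """One round of discrete AdaBoost.
        y, h_pred: arrays in {-1, +1}; w: current example distribution.
        Assumes 0 < weighted error < 1."""
        err = np.sum(w[h_pred != y])             # weighted error of the weak classifier
        alpha = 0.5 * np.log((1.0 - err) / err)  # coefficient derived from the exponential loss
        w = w * np.exp(-alpha * y * h_pred)      # upweight mistakes, downweight correct examples
        return alpha, w / w.sum()                # renormalize to a distribution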

asked Jul 27 '10 at 22:05 by charlie

edited Sep 23 '10 at 23:56 by Joseph Turian ♦♦


One Answer:

There are many boosting variants, far too many to enumerate in an answer here, each with a different update rule. So, no, not for AdaBoost itself, but yes for other boosting variants. See the boosting sections of The Elements of Statistical Learning for some of them, and for how they can be derived from different loss functions.
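To make that concrete, here is a minimal sketch of one such variant, squared-loss gradient boosting, in which there is no alpha_t or example reweighting at all: each round fits the weak learner to the negative gradient of the loss and takes a small fixed step. Scikit-learn stumps stand in for the weak learner, and the function name gradient_boost is illustrative:

    import numpy as np
    from sklearn.tree import DecisionTreeRegressor

    def gradient_boost(X, y, n_rounds=50, lr=0.1):
        """Squared-loss gradient boosting: no alpha_t, no example reweighting.
        Each round fits a stump to the residuals (the negative gradient)."""
        F = np.zeros(len(y))               # current additive model
        stumps = []
        for _ in range(n_rounds):
            residuals = y - F              # -dL/dF for L = 0.5 * (y - F)^2
            h = DecisionTreeRegressor(max_depth=1).fit(X, residuals)
            F += lr * h.predict(X)         # fixed shrinkage step instead of a derived alpha_t
            stumps.append(h)
        return stumps

Exponential loss recovers AdaBoost's alpha_t and reweighting rule as a special case of this loss-driven view, which is exactly how the ESL chapter derives the different variants.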

answered Jul 27 '10 at 22:27 by Alexandre Passos ♦

