|
In AdaBoost, the coefficients alpha_t and the weak classifiers h_t are updated at each iteration. Are there other strategies for updating alpha_t and the weight distribution in each boosting round?
|
There are many boosting variants, far too many to enumerate here, each with its own update rules. So the answer is no for AdaBoost itself, but yes for other boosting variants. See the boosting chapters of The Elements of Statistical Learning for several of these variants and for how their updates can be derived from different loss functions.
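For reference, here is a minimal sketch of the standard AdaBoost updates the question refers to, i.e. the pieces that other boosting variants replace. The function name `adaboost_round` and the toy stump are just illustrative choices; the formulas themselves (alpha_t = ½ ln((1−err)/err), multiplicative reweighting by exp(−alpha_t y h_t(x))) are the classic AdaBoost rules, which correspond to coordinate descent on the exponential loss:

```python
import numpy as np

def adaboost_round(X, y, weights, stump):
    """One AdaBoost round: compute alpha_t and the updated distribution.
    `stump` is any callable returning predictions in {-1, +1};
    labels y are also assumed to be in {-1, +1}."""
    pred = stump(X)
    err = np.sum(weights * (pred != y))          # weighted training error
    alpha = 0.5 * np.log((1 - err) / err)        # classic AdaBoost coefficient
    new_w = weights * np.exp(-alpha * y * pred)  # exponential-loss reweighting
    return alpha, new_w / new_w.sum()            # renormalize to a distribution

# toy 1-D data with label = sign(x)
X = np.array([-2.0, -1.0, 1.0, 2.0, 3.0])
y = np.sign(X)
w = np.full(5, 0.2)                              # uniform initial distribution

# hypothetical weak stump that misclassifies only the last point
stump = lambda X: np.where(X < 2.5, np.sign(X), -np.sign(X))
alpha, w_new = adaboost_round(X, y, w, stump)
# the misclassified point's weight grows, the rest shrink
```

Variants such as LogitBoost or gradient boosting keep this overall loop but swap in a different loss, which changes both how alpha_t is chosen and how the per-example weights (or pseudo-residuals) are computed.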