Now I have three classifiers (_1, _2, _3). Each classifier is a big, complex program. I want to use AdaBoost to combine the results from the three classifiers. But after training classifier_1, I can't pass weights to classifier_2; that is to say, the three classifiers are trained independently, so I can't minimize the weighted error. What should I do in this situation?
You can't combine these classifiers with AdaBoost, then. However, you can use any other ensemble method to combine them. The easiest one is to make your final decision a linear function of each classifier's decision and optimize those weights on a small held-out development set.

As far as I know, there are two ways to take the weights into consideration:
Can I use the second method to combine the results of the three classifiers?
(Oct 21 '11 at 16:53)
yunfeiyu
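The linear-combination approach suggested in the answer can be sketched as follows. This is a minimal, hypothetical example: the score matrix and labels are synthetic stand-ins for the three classifiers' real outputs on a held-out development set, and the grid search is just the simplest way to optimize the weights.

```python
# Hypothetical sketch: combine three fixed classifiers' scores with a
# weighted sum, and pick the weights that minimize error on a small
# held-out development set. dev_scores/dev_labels are synthetic stand-ins.
import itertools
import numpy as np

rng = np.random.default_rng(0)

# dev_scores[i, k] is classifier k's real-valued score for dev example i.
dev_scores = rng.normal(size=(200, 3))
dev_labels = (dev_scores @ np.array([0.5, 0.3, 0.2]) > 0).astype(int)

def dev_error(weights):
    """Error rate of the weighted-sum ensemble on the dev set."""
    combined = dev_scores @ weights            # linear combination of scores
    predictions = (combined > 0).astype(int)   # threshold at zero
    return np.mean(predictions != dev_labels)

# Coarse grid search over weight vectors that sum to 1.
grid = np.linspace(0.0, 1.0, 11)
best_w, best_err = None, 1.0
for w1, w2 in itertools.product(grid, grid):
    if w1 + w2 > 1.0:
        continue
    w = np.array([w1, w2, 1.0 - w1 - w2])
    err = dev_error(w)
    if err < best_err:
        best_w, best_err = w, err

print("best weights:", best_w, "dev error:", best_err)
```

With only three weights, an exhaustive grid is cheap; for more classifiers you would switch to logistic regression or another convex optimizer over the stacked scores.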
To use AdaBoost you need to train many weak learners on differently-weighted training data, not just combine three models you have already trained.
(Oct 21 '11 at 17:12)
Alexandre Passos ♦
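The reweighting loop described in the comment above can be sketched like this. This is an illustrative toy, not anyone's production setup: decision stumps on synthetic 1-D data stand in for the weak learners, and the key point is that each round retrains on the updated example weights.

```python
# Toy AdaBoost sketch: each round fits a decision stump to the CURRENT
# example weights, then upweights the examples that stump got wrong.
# X and y are synthetic stand-ins; labels are in {-1, +1}.
import numpy as np

rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=100)
y = np.sign(X + 0.1 * rng.normal(size=100))
y[y == 0] = 1

w = np.ones(100) / 100          # example weights, updated each round
stumps = []                     # (threshold, polarity, alpha) per round
for _ in range(10):
    # Pick the stump minimizing weighted error under the current weights.
    best = None
    for thr in np.linspace(-1, 1, 41):
        for pol in (+1, -1):
            pred = pol * np.sign(X - thr)
            pred[pred == 0] = pol
            err = np.sum(w[pred != y])
            if best is None or err < best[0]:
                best = (err, thr, pol, pred)
    err, thr, pol, pred = best
    err = np.clip(err, 1e-10, 1 - 1e-10)
    alpha = 0.5 * np.log((1 - err) / err)   # this learner's vote weight
    w *= np.exp(-alpha * y * pred)          # upweight the mistakes
    w /= w.sum()
    stumps.append((thr, pol, alpha))

# Final ensemble: sign of the alpha-weighted stump votes.
def ensemble(x):
    total = 0.0
    for thr, pol, alpha in stumps:
        s = pol * np.sign(x - thr)
        total += alpha * (s if s != 0 else pol)
    return np.sign(total)

acc = np.mean([ensemble(x) == t for x, t in zip(X, y)])
print("training accuracy:", acc)
```

The step that makes this impossible with three pre-trained black boxes is `w *= np.exp(-alpha * y * pred)`: the next learner must be *fitted* to that reweighted distribution, which the original question's fixed classifiers cannot do.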