I have three classifiers (classifier_1, classifier_2, classifier_3). Each classifier is a large, complex program. I want to use AdaBoost to combine the results of the three classifiers. But after finishing classifier_1, I can't pass the example weights on to classifier_2; that is, the three classifiers were trained independently, so I can't minimize the weighted error.

How should I handle this situation?

asked Oct 21 '11 at 14:05


yunfeiyu


2 Answers:

You can't combine these classifiers with AdaBoost, then. However, you can use another ensemble method to combine them. The simplest is to make your final decision a linear function of the decisions of each classifier, and to optimize those weights on a small held-out development set.
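A minimal sketch of this suggestion, assuming each classifier emits a score in [-1, 1] and labels are in {-1, +1}; the scores and labels below are made-up placeholders, and the brute-force grid search is just one simple way to pick the weights on the development set:

```python
# Sketch: combine three pre-trained classifiers with a weighted vote,
# choosing the weights on a held-out development set.
import itertools
import numpy as np

# Hypothetical dev-set data: each row holds the three classifiers'
# scores for one example; y_dev holds the true labels in {-1, +1}.
dev_scores = np.array([
    [ 0.9, -0.2,  0.4],
    [-0.8,  0.3, -0.5],
    [ 0.7,  0.6, -0.1],
    [-0.4, -0.9,  0.2],
])
y_dev = np.array([1, -1, 1, -1])

def dev_error(w):
    """Error rate of the sign of the weighted score sum."""
    pred = np.sign(dev_scores @ w)
    return np.mean(pred != y_dev)

# Brute-force grid search over weight vectors (fine for 3 weights).
grid = np.linspace(0.0, 1.0, 11)
best_w = min(
    (np.array(w) for w in itertools.product(grid, repeat=3)
     if any(w)),  # skip the all-zero weight vector
    key=dev_error,
)

# At test time the combined decision is sign(test_scores @ best_w).
```

With more classifiers (or if you want calibrated probabilities), fitting a logistic regression on the classifier outputs does the same job without the grid search.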

answered Oct 21 '11 at 16:23


Alexandre Passos ♦

As far as I know, there are two ways to take the weights into consideration:

  1. assume the learning algorithm can operate on reweighted training data;

  2. resample the data with replacement according to the weight distribution.

Can I use the second method to combine the results of the three classifiers?

(Oct 21 '11 at 16:53) yunfeiyu

To use AdaBoost you need to train many weak learners on differently-weighted training data, not just combine three models you have already trained.

(Oct 21 '11 at 17:12) Alexandre Passos ♦
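For context, the reweighting the comments refer to is AdaBoost's per-round example-weight update. A minimal sketch of one round for binary labels in {-1, +1} (the labels and predictions below are made-up placeholders), showing why each weak learner must be trained, or the data resampled, against the current weights:

```python
# Sketch of one AdaBoost round: compute the weak learner's weighted
# error, its vote in the ensemble, and the updated example weights.
import numpy as np

y = np.array([1, 1, -1, -1, 1])      # true labels in {-1, +1}
h = np.array([1, -1, -1, 1, 1])      # one weak learner's predictions
w = np.full(len(y), 1.0 / len(y))    # current example weights

# Weighted error of this weak learner under the current weights.
eps = np.sum(w[h != y])
# Its coefficient (vote) in the final ensemble.
alpha = 0.5 * np.log((1.0 - eps) / eps)
# Reweight: misclassified examples gain weight, correct ones lose it,
# so the next weak learner focuses on the hard examples.
w = w * np.exp(-alpha * y * h)
w = w / w.sum()
```

The next weak learner is then trained on data weighted by the updated `w` (or on a bootstrap sample drawn with probabilities `w`), which is exactly the step that is impossible when the three classifiers are already trained.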

May I know what the weights in AdaBoost mean?

answered Nov 01 '11 at 23:48


ann

