Hello All,

I'm trying to learn a binary classifier from a few training examples (~30 positive, ~30 negative). I have 10 predictors, i.e. the dimension of my feature vector is 10 (all are important predictors). How should I approach countering the curse of dimensionality here? I've heard that bootstrapping the data is a good idea in such cases, but can someone please shed some light on the process?
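For context, bootstrapping here usually means repeatedly resampling the training set with replacement and fitting a classifier on each resample. A minimal NumPy sketch, using a made-up dataset shaped like the one described (60 examples, 10 features); the data and the `n_boot` count are illustrative assumptions, not from the thread:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical small dataset matching the question's setup:
# 30 positive and 30 negative examples, 10 features each.
X = rng.normal(size=(60, 10))
y = np.array([1] * 30 + [0] * 30)

n_boot = 100  # number of bootstrap replicates (an arbitrary choice)
for b in range(n_boot):
    # Sample 60 row indices with replacement; on average ~63% of the
    # original rows appear in each replicate, some more than once.
    idx = rng.integers(0, len(X), size=len(X))
    X_b, y_b = X[idx], y[idx]
    # ... fit a classifier on (X_b, y_b) here and aggregate predictions ...
```

Aggregating the per-replicate predictions (e.g. by majority vote) is exactly what bagging, and by extension Random Forests, does internally.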

Also, which classifiers do people generally prefer in such scenarios? I'm thinking of training decision trees (maybe RF too).

Thanks in advance!

asked Jan 09 '14 at 14:17

Ankit Jain


3 Answers:

Maybe K-Means would work nicely for this case.
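One way to read this suggestion: cluster the training data into two groups and map each cluster to the majority label of the training points inside it. Whether this works depends entirely on the classes forming compact, separated clusters, which labeled classification data often does not. A hedged sketch with scikit-learn (assumed available) on a made-up well-separated dataset:

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
# Hypothetical dataset with two well-separated classes, 10 features each.
X = np.vstack([rng.normal(1, 1, (30, 10)), rng.normal(-1, 1, (30, 10))])
y = np.array([1] * 30 + [0] * 30)

km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(X)
# Map each cluster to the majority training label among its members.
cluster_to_label = {c: int(np.bincount(y[km.labels_ == c]).argmax())
                    for c in (0, 1)}
preds = np.array([cluster_to_label[c] for c in km.labels_])
```

On overlapping or non-spherical classes this mapping can do no better than chance, so it is at best a baseline, not a substitute for a supervised classifier.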

answered Jan 13 '14 at 09:57

Taygun

Decision trees & RF are multi-class classifiers by nature, but your problem is a binary classification problem. So you can do it with a simple binary SVM classifier.
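For a dataset this small, a linear SVM evaluated with cross-validation is a reasonable starting point. A minimal sketch with scikit-learn (assumed available); the dataset below is synthetic stand-in data, not the asker's:

```python
import numpy as np
from sklearn.svm import LinearSVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
# Synthetic stand-in for the asker's data: 60 examples, 10 features.
X = np.vstack([rng.normal(0.5, 1, (30, 10)), rng.normal(-0.5, 1, (30, 10))])
y = np.array([1] * 30 + [0] * 30)

clf = LinearSVC(C=1.0, max_iter=10000)
# With only 60 examples, cross-validation gives a less noisy accuracy
# estimate than a single train/test split.
scores = cross_val_score(clf, X, y, cv=5)
print(scores.mean())
```

Tuning `C` (e.g. via `GridSearchCV`) matters more than usual here, since with 60 examples the variance of any single fit is high.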

answered Jan 09 '14 at 19:34

lly

I'm not sure why decision trees can't be used for a binary classification problem. Yes, I could use SVM or LR, but I have one additional constraint if I use weight-based classifiers: I want all of my weights to be positive. The standard implementations of these classifiers don't have that constraint, and I would have to write my own code for that :(
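One way to get positive weights without writing a full solver from scratch is to minimize the logistic loss with box constraints via SciPy's L-BFGS-B (assumed available). A sketch, with a made-up separable dataset for the usage example; `fit_nonneg_logreg` is a hypothetical helper name:

```python
import numpy as np
from scipy.optimize import minimize

def fit_nonneg_logreg(X, y):
    """Logistic regression with feature weights constrained to be >= 0.
    The intercept stays unconstrained. Labels y must be in {-1, +1}.
    A sketch, not a polished implementation."""
    n, d = X.shape

    def nll(params):
        w, b = params[:d], params[d]
        yz = y * (X @ w + b)
        # log(1 + exp(-yz)) written stably via logaddexp
        return np.sum(np.logaddexp(0.0, -yz))

    bounds = [(0.0, None)] * d + [(None, None)]  # weights >= 0, free intercept
    res = minimize(nll, np.zeros(d + 1), method="L-BFGS-B", bounds=bounds)
    return res.x[:d], res.x[d]

# Usage on made-up data whose true weights happen to be positive.
rng = np.random.default_rng(0)
X = rng.normal(size=(60, 10))
true_w = rng.uniform(0.5, 1.5, size=10)
y = np.where(X @ true_w > 0, 1, -1)
w, b = fit_nonneg_logreg(X, y)
```

The bound constraints do the work here; if the underlying relationship genuinely needs a negative weight, this model will simply pin it at zero.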

(Jan 09 '14 at 20:59) Ankit Jain


User submitted content is under Creative Commons: Attribution - Share Alike; Other things copyright (C) 2010, MetaOptimize LLC.