I'm using scikit-learn for a machine learning course. The SVC classifier (with kernel='linear', for example) has an attribute that returns the support vectors, but the LinearSVC class doesn't. How can I calculate the support vectors myself?

Thank you all.

asked Apr 24 '13 at 00:12


byfnk


3 Answers:

I think LinearSVC does not explicitly keep track of the support vectors, for efficiency reasons: in the linear case it's cheaper to store the primal representation (the weight vector) than the dual representation (the support vectors).
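To illustrate the difference, here's a minimal sketch (assuming scikit-learn is installed): `SVC(kernel='linear')` exposes the dual solution via `support_vectors_`, while `LinearSVC` only exposes the primal weight vector and intercept.

```python
from sklearn.datasets import make_blobs
from sklearn.svm import SVC, LinearSVC

# A small synthetic two-class problem for demonstration
X, y = make_blobs(n_samples=100, centers=2, random_state=0)

# SVC stores the dual representation: the support vectors themselves
svc = SVC(kernel='linear').fit(X, y)
print(svc.support_vectors_.shape)

# LinearSVC stores only the primal representation: w and b
lin = LinearSVC(max_iter=10000).fit(X, y)
print(lin.coef_.shape, lin.intercept_.shape)
# Note: lin has no support_vectors_ attribute
```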

answered Apr 24 '13 at 13:21


Alexandre Passos ♦

Thank you for your answer! Yes, it makes sense that they omitted that functionality for that reason. But my question is how I can compute the support vectors manually from what I can extract, such as the coefficients and the minimization objective.

answered Apr 24 '13 at 15:36


byfnk

edited Apr 24 '13 at 20:11

In scikit-learn, what I have done is compute the normal vector w (from the coefficients) and use decision_function to find the points at distance 1/||w|| from the hyperplane. Be careful with numerical accuracy, and remember that this is only an approximation.

The liblinear FAQ describes how to print the support vectors by modifying the library.

answered May 03 '13 at 12:41


byfnk

edited May 03 '13 at 22:59


