I'm using scikit-learn for a machine learning course. The SVC classifier (with kernel='linear', for example) has an attribute that returns the support vectors, but the LinearSVC class does not. How can I compute these support vectors myself? Thank you all.

I think LinearSVC does not explicitly keep track of the support vectors for efficiency reasons: in the linear case it is cheaper to store the primal representation (the weight vector) than the dual representation (the support vectors).

Thank you for your answer! Yes, it makes sense that they left that functionality out for that reason. My question, though, is how I can compute the support vectors manually from the data I can extract, such as the coefficients of the decision function.

In scikit-learn, what I did was compute the normal vector w (from the coefficients) and then use decision_function to find the points at distance 1/||w|| from the hyperplane. Be careful with numerical precision, and remember that this is only an approximation. The liblinear FAQ also explains how to print the support vectors by modifying the library itself.
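A minimal sketch of that idea, assuming a fitted LinearSVC on a toy dataset: decision_function returns w·x + b, so points with |w·x + b| ≤ 1 lie on or inside the margin (at distance ≤ 1/||w|| from the hyperplane) and play the role of support vectors. The tolerance value here is an assumption you may need to tune for your data:

```python
import numpy as np
from sklearn.svm import LinearSVC
from sklearn.datasets import make_blobs

# Toy two-class data (illustrative choice, not from the original post)
X, y = make_blobs(n_samples=100, centers=2, random_state=0)

clf = LinearSVC(C=1.0, max_iter=10000, random_state=0).fit(X, y)

# decision_function gives w.x + b; dividing by ||w|| would give the
# signed geometric distance to the hyperplane.
decision = clf.decision_function(X)

# Points with |w.x + b| <= 1 are on or inside the margin, i.e. the
# approximate support vectors. The tolerance absorbs numerical error.
tol = 1e-3  # assumed tolerance; adjust for your data/solver precision
support_idx = np.where(np.abs(decision) <= 1 + tol)[0]
support_vectors = X[support_idx]

print(support_vectors.shape)
```

Note that because LinearSVC only solves the primal problem approximately, the set recovered this way may differ slightly from what SVC(kernel='linear').support_vectors_ would return on the same data.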