|
I was looking for a proof of convergence for the perceptron algorithm with margin. I was not able to find it in any pattern classification textbook or on the internet. Can anyone here please point me to a text or reference for it?
|
The proof of convergence of the perceptron with margin can be found in the conference paper "Analysis of generic perceptron-like large margin classifiers," Proc. 16th Eur. Conf. Machine Learning (ECML), Porto, 2005, pp. 750-758. More specifically, look in Section 2.1 and set fmin = fmax = 1.
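
Not from that paper specifically, but in case a concrete picture of the algorithm being analyzed helps, here is a minimal sketch in Python (the names `margin_perceptron` and `gamma` are my own, not the paper's notation): an update is triggered whenever a point's functional margin falls below the target margin, rather than only when the point is misclassified.

```python
import numpy as np

def margin_perceptron(X, y, gamma=0.1, lr=1.0, max_passes=1000):
    """Minimal sketch of a perceptron with margin (illustrative, not the paper's code).

    X : (n_samples, n_features) array, y : labels in {-1, +1}.
    An update is made whenever the functional margin y * <w, x> falls
    below the target margin `gamma`; gamma = 0 recovers the ordinary perceptron.
    """
    w = np.zeros(X.shape[1])
    for _ in range(max_passes):
        mistakes = 0
        for xi, yi in zip(X, y):
            if yi * np.dot(w, xi) < gamma:   # margin violation, not just a sign error
                w += lr * yi * xi            # standard perceptron update
                mistakes += 1
        if mistakes == 0:                    # every point now has margin >= gamma
            break
    return w

# Tiny linearly separable example
X = np.array([[2.0, 1.0], [1.0, 3.0], [-1.0, -2.0], [-2.0, -1.0]])
y = np.array([1, 1, -1, -1])
w = margin_perceptron(X, y, gamma=0.5)
print("weights:", w, "functional margins:", y * X.dot(w))
```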
|
One such proof was pointed out in the question How does one prove that a separating hyper-plane exists for a linearly separable pattern? Repeating it here: it is Theorem 1 of Freund and Schapire's averaged-perceptron paper, Large margin classification using the perceptron algorithm. Strictly speaking, the theorem bounds the number of mistakes the perceptron algorithm makes as long as a margin exists. Hence, if you make enough passes over the data, the algorithm reaches a point where it makes no more mistakes, and therefore has converged.
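
For reference, the separable case of that theorem gives the familiar Block–Novikoff-style bound: if every example satisfies $\|x_i\| \le R$ and some unit vector $u$ achieves margin $\gamma$, i.e. $y_i \langle u, x_i \rangle \ge \gamma$ for all $i$, then the perceptron makes at most

$$\left(\frac{R}{\gamma}\right)^{2}$$

mistakes over the whole (possibly repeated) sequence of examples; the full theorem also covers the non-separable case with an extra deviation term.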