I was looking for a proof of convergence for the perceptron algorithm with margin, but I was not able to find it in any pattern classification textbook or on the internet.

Can anyone here please point me to a text or reference for this?

asked Aug 23 '11 at 17:09

aseembehl

edited Aug 23 '11 at 17:26


2 Answers:

The proof of convergence of the perceptron with margin can be found in the conference paper "Analysis of generic perceptron-like large margin classifiers", Proc. 16th European Conference on Machine Learning (ECML), Porto, 2005, pp. 750-758. More specifically, look in Section 2.1 and set f_min = f_max = 1.
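For concreteness, here is a minimal sketch of the kind of perceptron-with-margin update that analysis covers: the weights are updated whenever an example's functional margin falls at or below a fixed threshold, not only when it is misclassified. The function name and the parameters gamma and eta below are illustrative choices, not notation taken from the paper.

    import numpy as np

    def margin_perceptron(X, y, gamma=1.0, eta=1.0, max_epochs=100):
        """Perceptron with margin: update on any example whose
        functional margin y_i * <w, x_i> is <= gamma (labels in {-1, +1})."""
        w = np.zeros(X.shape[1])
        for _ in range(max_epochs):
            updates = 0
            for x_i, y_i in zip(X, y):
                if y_i * np.dot(w, x_i) <= gamma:   # margin violation
                    w += eta * y_i * x_i            # standard perceptron step
                    updates += 1
            if updates == 0:                        # all margins exceed gamma
                return w
        return w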

answered Sep 04 '11 at 05:27

Petroula Tsampouka

One such proof was pointed out in the question "How does one prove that a separating hyper-plane exists for a linearly separable pattern?". To repeat it: it is Theorem 1 of Freund and Schapire's averaged-perceptron paper, "Large margin classification using the perceptron algorithm". The theorem shows that the number of mistakes made by the perceptron algorithm is bounded as long as a margin exists. Hence, if you make enough passes over the data, the algorithm eventually reaches a point where it makes no more mistakes, and has therefore converged.
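For reference, the separable-case form of that mistake bound (essentially Novikoff's theorem; the notation below is assumed rather than copied from the paper) reads:

    \|x_i\| \le R \ \text{and}\ y_i \langle u, x_i \rangle \ge \gamma \ \text{for all } i,\ \|u\| = 1
    \quad\Longrightarrow\quad
    \#\text{mistakes} \le \left(\frac{R}{\gamma}\right)^{2}

Because the bound does not depend on the number of examples or passes, the perceptron can only make finitely many updates on linearly separable data, which is exactly the convergence statement asked about.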

answered Aug 23 '11 at 17:27

Alexandre Passos ♦

edited Aug 23 '11 at 17:27
