Authors
Petroula Tsampouka
John Shawe-Taylor
Publication date
2006
Publisher
Springer Berlin Heidelberg
Description
We present a new class of Perceptron-like algorithms with margin in which the "effective" learning rate η_eff, defined as the ratio of the learning rate to the length of the weight vector, remains constant. We prove that for η_eff sufficiently small the new algorithms converge in a finite number of steps and show that there exists a limit of the parameters involved in which convergence leads to classification with maximum margin. A soft margin extension for Perceptron-like large margin classifiers is also discussed.
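To make the idea concrete, here is a minimal sketch (not the paper's exact algorithm) of a margin Perceptron in which the learning rate is rescaled at every update so that the effective rate η_eff = η/‖w‖ stays constant. The function name, the margin condition, and the toy data are illustrative assumptions.

```python
import numpy as np

def margin_perceptron_constant_eta_eff(X, y, eta_eff=0.05, margin=0.1,
                                       max_epochs=100):
    """Illustrative margin Perceptron with constant effective learning
    rate: at each update, eta is set to eta_eff * ||w|| so that
    eta / ||w|| remains fixed. A sketch, not the authors' algorithm."""
    n, d = X.shape
    # Seed w with the first example so that ||w|| > 0 from the start
    # (with w = 0 the effective-rate scaling would produce eta = 0).
    w = y[0] * X[0].astype(float)
    for _ in range(max_epochs):
        mistakes = 0
        for i in range(n):
            # Update whenever the (unnormalized) margin of example i
            # falls below the target margin scaled by ||w||.
            if y[i] * np.dot(w, X[i]) <= margin * np.linalg.norm(w):
                eta = eta_eff * np.linalg.norm(w)  # keep eta/||w|| constant
                w = w + eta * y[i] * X[i]
                mistakes += 1
        if mistakes == 0:  # no margin violations left: converged
            break
    return w

# Tiny linearly separable demo (hypothetical data).
X_demo = np.array([[2., 1.], [1., 2.], [-2., -1.], [-1., -2.]])
y_demo = np.array([1, 1, -1, -1])
w_demo = margin_peceptron = margin_perceptron_constant_eta_eff(X_demo, y_demo)
```

Because each update step is proportional to ‖w‖, the direction of w changes at a rate governed by η_eff alone, which is the quantity the paper's convergence analysis keeps small.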