Authors
Huma Lodhi,
Grigoris Karakoulas,
John Shawe-Taylor
Publication date
2000
Publisher
Springer Berlin Heidelberg
Description
The paper considers applying a boosting strategy to optimise the generalisation bound recently obtained by Shawe-Taylor and Cristianini [7] in terms of the two-norm of the slack variables. The formulation performs gradient descent over a quadratic loss function that is insensitive to points with a large margin. A novel feature of the algorithm is a principled adaptation of the size of the target margin. Experiments with text and UCI data show that the new algorithm improves the accuracy of boosting; DMarginBoost generally achieves significant improvements over AdaBoost.
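The loss described above can be sketched in a few lines. This is a minimal illustration, not the paper's implementation: it assumes slack variables of the form ξ_i = max(0, ρ − y_i f(x_i)) for a target margin ρ, so that the squared two-norm of the slacks penalises only points whose margin falls short of ρ (making the loss insensitive to large-margin points). The function name and arguments are hypothetical.

```python
import numpy as np

def slack_quadratic_loss(margins, rho):
    """Squared two-norm of the slack variables.

    margins : array of y_i * f(x_i) for each training point
    rho     : target margin (assumed adapted across boosting rounds)

    Points with margin >= rho have zero slack and contribute no loss,
    so the loss is insensitive to points with a large margin.
    """
    xi = np.maximum(0.0, rho - margins)  # slack variables xi_i
    return float(np.sum(xi ** 2))        # ||xi||_2^2

# Example: margins (0.9, 0.2, -0.1) with target margin rho = 0.5
# give slacks (0, 0.3, 0.6), so the loss is 0.09 + 0.36 = 0.45.
loss = slack_quadratic_loss(np.array([0.9, 0.2, -0.1]), 0.5)
```

In a gradient-descent boosting scheme of this kind, each round would fit a weak learner to the negative gradient of this loss with respect to the current margins, with ρ adapted between rounds.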