Authors
Zakria Hussain, John Shawe-Taylor
Publication date
2011
Publisher
JMLR Workshop and Conference Proceedings
Description
We propose two new generalization error bounds for multiple kernel learning (MKL). First, using the bound of Srebro and Ben-David (2006) as a starting point, we derive a new version that uses a simple counting argument over the choice of kernels to obtain a tighter bound when 1-norm regularization (sparsity) is imposed in the kernel learning problem. The second bound is a Rademacher complexity bound which is additive in the (logarithmic) kernel complexity and the margin term. This additive dependency is superior to all previously published Rademacher bounds for learning a convex combination of kernels, including the recent bound of Cortes et al. (2010), which exhibits a multiplicative interaction between the two terms. We illustrate the tightness of our bounds with simulations.
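To make the additive-versus-multiplicative distinction concrete, the following is a schematic LaTeX sketch of the two bound shapes, with M the number of base kernels, n the sample size, and gamma the margin. The shapes and constants are assumptions for illustration, not the exact statements proved in either paper.

% Schematic bound shapes (assumed, for illustration only): a
% multiplicative bound couples the log-kernel-complexity term and the
% margin term, while an additive bound keeps them separate, so the
% kernel-complexity penalty does not grow as the margin shrinks.
\documentclass{article}
\usepackage{amsmath}
\begin{document}
\begin{align*}
  \text{multiplicative (Cortes et al., 2010, schematically):}\quad
    & O\!\Biggl(\frac{1}{\gamma}\sqrt{\frac{\log M}{n}}\Biggr), \\
  \text{additive (the form claimed here, schematically):}\quad
    & O\!\Biggl(\sqrt{\frac{\log M}{n}} + \frac{1}{\gamma}\sqrt{\frac{1}{n}}\Biggr).
\end{align*}
\end{document}

In the additive shape, the kernel-selection cost sqrt(log M / n) and the margin cost (1/gamma)sqrt(1/n) enter as separate summands, which is what makes the bound tighter than the multiplicative form whenever both terms are non-negligible.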