Authors
Ying Guo,
Peter L Bartlett,
John Shawe-Taylor
Publication date
1999
Description
Support vector machines are a type of learning machine related to the maximum margin hyperplane. Until recently, the only bounds on the generalization performance of SV machines (within the PAC framework) were obtained via bounds on the fat-shattering dimension of maximum margin hyperplanes; these results took no account of the kernel used. More recently, it has been shown [8] that one can bound the relevant covering numbers using tools from functional analysis. The resulting bound is quite complex and seemingly difficult to compute. In this paper we show that the bound can be greatly simplified, and as a consequence we are able to determine some interesting quantities, such as the effective number of dimensions used. The new bound is a simple formula involving the eigenvalues of the integral operator induced by the kernel. We present an explicit calculation of covering numbers for an SV machine …
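The eigenvalue-based bound rests on the spectrum of the integral operator induced by the kernel, which can be approximated empirically by the eigenvalues of the scaled Gram matrix on a sample. The sketch below is not from the paper: it assumes a Gaussian (RBF) kernel and illustrates how a fast-decaying spectrum yields a small "effective number of dimensions"; the threshold used to count effective dimensions is an arbitrary choice for illustration.

```python
import numpy as np

def rbf_kernel(X, gamma=1.0):
    """Gram matrix K[i, j] = exp(-gamma * ||x_i - x_j||^2)."""
    sq = np.sum(X ** 2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-gamma * np.maximum(d2, 0.0))

def empirical_operator_eigenvalues(X, gamma=1.0):
    """Eigenvalues of (1/m) K, sorted in decreasing order.

    For a kernel k and data distribution P, these approximate the
    eigenvalues of the integral operator (T f)(x) = E_P[k(x, X) f(X)].
    """
    m = X.shape[0]
    K = rbf_kernel(X, gamma)
    eigs = np.linalg.eigvalsh(K / m)  # ascending for symmetric matrices
    return np.sort(eigs)[::-1]

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))
lams = empirical_operator_eigenvalues(X)
# Rapid eigenvalue decay: only a few directions carry most of the
# operator's mass, i.e. few dimensions are effectively used.
effective_dim = int(np.sum(lams > 1e-3 * lams[0]))
```

Because the Gram matrix of a positive-definite kernel is positive semidefinite, the empirical eigenvalues are (up to numerical error) nonnegative, matching the spectrum of the induced integral operator.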