Authors
Robert C Williamson,
John Shawe-Taylor,
Bernhard Schölkopf
Publication date
1999
Description
It is known that the covering numbers of a function class on a double sample (length 2m, where m is the number of points in the sample) can be used to bound the generalization performance of a classifier by using a margin-based analysis. Traditionally this has been done using a "Sauer-like" relationship involving a combinatorial dimension such as the fat-shattering dimension. In this paper we show that one can utilize an analogous argument in terms of the observed covering numbers on a single m-sample (being the actual observed data points). The significance of this is that for certain interesting classes of functions, such as support vector machines, one can readily estimate the empirical covering numbers quite well. We show how to do so in terms of the eigenvalues of the Gram matrix created from the data. These covering numbers can be much less than a priori bounds indicate in situations where the particular data received is "easy". The work can be considered an extension of previous results which provided generalization performance bounds in terms of the VC-dimension of the class of hypotheses restricted to the sample, with the considerable advantage that the covering numbers can be readily computed, and they often are small.
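The central computable quantity the description refers to is the spectrum of the Gram matrix built from the observed sample. The sketch below is only an illustration of that step, assuming an RBF kernel; the function names (rbf_gram_matrix, gram_eigenvalues) and the parameter gamma are illustrative choices, not from the paper, and the code computes the eigenvalues that would enter a data-dependent covering-number estimate rather than the paper's actual bound.

```python
import numpy as np

def rbf_gram_matrix(X, gamma=1.0):
    """Gram matrix K[i, j] = exp(-gamma * ||x_i - x_j||^2) for an RBF kernel (an assumed choice)."""
    sq_norms = np.sum(X**2, axis=1)
    sq_dists = sq_norms[:, None] + sq_norms[None, :] - 2.0 * X @ X.T
    return np.exp(-gamma * np.maximum(sq_dists, 0.0))

def gram_eigenvalues(X, gamma=1.0):
    """Eigenvalues of the symmetric PSD Gram matrix, sorted in decreasing order."""
    K = rbf_gram_matrix(X, gamma)
    eigvals = np.linalg.eigvalsh(K)           # ascending order for symmetric matrices
    return np.clip(eigvals[::-1], 0.0, None)  # decreasing; clip tiny negatives from round-off

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(50, 3))              # m = 50 observed data points in R^3
    lam = gram_eigenvalues(X, gamma=0.5)
    # Fast eigenvalue decay corresponds to an "easy" sample: the empirical
    # covering numbers, and hence the margin-based generalization bound,
    # can then be much smaller than a priori, distribution-free estimates.
    print("largest eigenvalues:", np.round(lam[:5], 3))
    print("fraction of trace in top 5:", lam[:5].sum() / lam.sum())
```

Because the eigenvalues are computed from the actual m-sample, this quantity is directly observable, which is the practical advantage the abstract highlights over bounds stated purely in terms of combinatorial dimensions.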