On the Influence of the Kernel on the Consistency of Support Vector Machines
In this article we study the generalization abilities of several support vector
machine (SVM) classifiers using a class of kernels that we call universal.
We show that the soft margin algorithms with universal kernels are consistent for
a large class of classification problems, including certain noisy tasks, provided
that the regularization parameter is chosen suitably. In particular, we derive a
simple sufficient condition on this parameter in the case of Gaussian RBF kernels.
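As a purely illustrative sketch of the setting (our own toy construction, not the paper's algorithm or its exact condition on the parameter), the following trains a soft margin SVM with a Gaussian RBF kernel by subgradient descent on the regularized hinge loss, using a hypothetical regularization schedule with lam_n -> 0 and n * lam_n -> infinity; all parameter values are assumptions chosen for the demo:

```python
import numpy as np

rng = np.random.default_rng(0)

def rbf(A, B, gamma=2.0):
    # Gaussian RBF kernel k(a, b) = exp(-gamma * (a - b)^2), pairwise
    return np.exp(-gamma * (A[:, None] - B[None, :]) ** 2)

def soft_margin_svm(X, y, lam, steps=4000, lr=0.05):
    # Subgradient descent on lam * ||f||_H^2 + mean hinge loss, with the
    # decision function represented as f(x) = sum_i alpha_i k(X_i, x).
    n = len(X)
    K = rbf(X, X)
    alpha = np.zeros(n)
    for _ in range(steps):
        f = K @ alpha
        active = y * f < 1                     # points violating the margin
        grad = 2 * lam * f - (K[:, active] @ y[active]) / n
        alpha -= lr * grad
    return alpha

n = 200
X = rng.uniform(-2, 2, n)
y = np.sign(np.sin(2 * X))                     # ground-truth labels
y[rng.random(n) < 0.1] *= -1                   # a noisy task: 10% label flips
lam = 1.0 / np.sqrt(n)    # hypothetical schedule: lam_n -> 0, n * lam_n -> inf
alpha = soft_margin_svm(X, y, lam)

X_test = rng.uniform(-2, 2, 500)
test_acc = np.mean(np.sign(rbf(X_test, X) @ alpha) == np.sign(np.sin(2 * X_test)))
```

Despite the training noise, the learned decision function tracks the noise-free boundary, which is the kind of behavior consistency guarantees in the limit.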
On the one hand, our considerations are based on an investigation of an
approximation property---the so-called universality---of the kernels used,
which ensures that every continuous function can be approximated by certain
kernel expressions. This approximation property
also gives new insight into the role of kernels in these and other
algorithms. On the other hand, the results are obtained by a precise study of the
optimization problems underlying the classifiers.
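To make the universality property concrete, the sketch below (our own illustration, with hypothetical parameter choices) approximates a continuous but non-smooth target by a finite Gaussian RBF expansion sum_i alpha_i k(x_i, .) and measures the uniform error on a grid:

```python
import numpy as np

def rbf(A, B, gamma=20.0):
    # Gaussian RBF kernel k(a, b) = exp(-gamma * (a - b)^2), pairwise
    return np.exp(-gamma * (A[:, None] - B[None, :]) ** 2)

f = np.abs                          # a continuous (non-smooth) target on [-1, 1]
centers = np.linspace(-1, 1, 40)    # expansion points x_i
grid = np.linspace(-1, 1, 400)      # where the uniform error is measured

# Least-squares fit of the coefficients alpha in sum_i alpha_i k(x_i, .)
K = rbf(centers, centers)
alpha, *_ = np.linalg.lstsq(K, f(centers), rcond=None)
uniform_err = np.max(np.abs(rbf(grid, centers) @ alpha - f(grid)))
```

A denser set of expansion points drives the uniform error further down, mirroring the statement that all continuous functions lie in the uniform closure of such kernel expressions.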
Furthermore, we show consistency for the maximal margin classifier as well as for the soft margin SVMs
in the presence of large margins. In this case it turns out that even constant regularization parameters
ensure consistency for the soft margin SVMs. Finally, we prove that even for simple, noise-free
classification problems, SVMs with polynomial kernels can behave arbitrarily badly.
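The intuition can be illustrated with a toy example of our own (not the paper's construction): in one dimension a decision function built from a degree-d polynomial kernel is itself a polynomial of degree d, so it has at most d sign changes and cannot fit a noise-free labeling that alternates more often, no matter how much data is available.

```python
import numpy as np

x = np.linspace(-2, 2, 400)
y = np.sign(np.sin(5 * x))        # noise-free labels with 7 sign changes

# Least-squares fit with degree-3 polynomial features: a simple stand-in
# for any classifier whose decision function comes from a degree-3
# polynomial kernel, since that function is itself a cubic in x.
V = np.vander(x, 4)               # columns: x^3, x^2, x, 1
coef, *_ = np.linalg.lstsq(V, y, rcond=None)
train_acc = np.mean(np.sign(V @ coef) == y)
```

No cubic decision function can label all eight alternating regions correctly, so the training accuracy stays bounded away from 1; a universal kernel has no such degree cap.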