Classification with Gaussians and Convex Loss

Dao-Hong Xiang, Ding-Xuan Zhou; 10(49):1447−1468, 2009.


This paper considers binary classification algorithms generated from Tikhonov regularization schemes associated with general convex loss functions and varying Gaussian kernels. Our main goal is to provide fast convergence rates for the excess misclassification error. Allowing varying Gaussian kernels in the algorithms improves the learning rates measured by regularization error and sample error. The special structure of Gaussian kernels enables us to construct, via an approximation scheme based on a Fourier analysis technique, uniformly bounded regularizing functions achieving polynomial decay of the regularization error under a Sobolev smoothness condition. The sample error is estimated by using a projection operator and a tight bound for the covering numbers of reproducing kernel Hilbert spaces generated by Gaussian kernels. The convexity of the general loss function plays a central role in our analysis.
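To make the algorithm class concrete, the following is a minimal sketch of a Tikhonov regularization scheme in the RKHS of a Gaussian kernel, here with the least-squares loss (one of the convex losses covered by the analysis), for which the regularized minimizer has a closed form. The function names, the toy data, and the choice of bandwidth and regularization parameter are all illustrative assumptions, not the paper's method.

```python
import numpy as np

def gaussian_kernel(X, Z, sigma):
    # Gaussian kernel matrix: K[i, j] = exp(-||x_i - z_j||^2 / sigma^2)
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-d2 / sigma**2)

def fit_tikhonov(X, y, sigma, lam):
    # Minimize (1/n) sum_i (f(x_i) - y_i)^2 + lam * ||f||_K^2 over the RKHS.
    # By the representer theorem, f = sum_i alpha_i K(x_i, .) with
    # alpha = (K + lam * n * I)^{-1} y for the least-squares loss.
    n = X.shape[0]
    K = gaussian_kernel(X, X, sigma)
    return np.linalg.solve(K + lam * n * np.eye(n), y)

def classify(alpha, X_train, X_new, sigma):
    # Binary classifier induced by the sign of the regularized function.
    return np.sign(gaussian_kernel(X_new, X_train, sigma) @ alpha)

# Toy data: two well-separated clusters on the line with labels +/- 1.
X = np.array([[-2.0], [-1.0], [1.0], [2.0]])
y = np.array([-1.0, -1.0, 1.0, 1.0])
alpha = fit_tikhonov(X, y, sigma=1.0, lam=0.01)
preds = classify(alpha, X, X, sigma=1.0)
```

The paper's point about *varying* Gaussian kernels corresponds to treating `sigma` as a parameter tuned jointly with `lam` as the sample size grows, rather than fixing it in advance.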

© JMLR 2009.