
Sparse Kernel Regression with Coefficient-based $\ell_q$-regularization

Lei Shi, Xiaolin Huang, Yunlong Feng, Johan A.K. Suykens; 20(161):1−44, 2019.

Abstract

In this paper, we consider $\ell_q$-regularized kernel regression with $0 < q \leq 1$. The algorithm minimizes a least-squares loss functional plus a coefficient-based $\ell_q$-penalty term over the linear span of features generated by a kernel function. We study the asymptotic behavior of the algorithm within the framework of learning theory. The contribution of this paper is two-fold. First, we derive a tight bound on the $\ell_2$-empirical covering numbers of the function space involved in the error analysis. Based on this result, we obtain convergence rates for $\ell_1$-regularized kernel regression that are the best established so far. Second, for the case $0 < q < 1$, we show that the regularization parameter acts as a trade-off between sparsity and convergence rate. Under some mild conditions, the fraction of non-zero coefficients in a local minimizer of the algorithm tends to $0$ at a polynomial rate as the sample size $m$ grows. Since the algorithm is non-convex, we also discuss how to iteratively generate a minimizing sequence, which allows us to search for a local minimizer near any initial point.
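
For readers who want a concrete picture of the objective described in the abstract, the following Python snippet is a purely illustrative sketch (not the authors' implementation): it forms the coefficient-based $\ell_q$-penalized least-squares objective over the span of Gaussian-kernel features and produces a minimizing sequence with a generic iteratively reweighted least-squares surrogate. The kernel choice, the reweighting formula, and all parameter values are assumptions made for illustration only.

import numpy as np

def gaussian_kernel(X1, X2, sigma=1.0):
    # Gaussian kernel matrix between rows of X1 and X2; any kernel could be used.
    d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def lq_kernel_regression(X, y, q=0.5, lam=1e-2, sigma=1.0, n_iter=200, eps=1e-8):
    # Sketch of minimizing
    #   (1/m) * ||K c - y||^2 + lam * sum_j |c_j|^q,   0 < q <= 1,
    # where f(x) = sum_j c_j K(x, x_j) lies in the linear span of kernel features.
    # Each |c_j|^q is replaced by a quadratic surrogate w_j * c_j^2 with a weight
    # proportional to |c_j|^(q-2) at the current iterate (an IRLS-style step).
    # This is an assumed generic scheme, not the procedure analyzed in the paper.
    m = X.shape[0]
    K = gaussian_kernel(X, X, sigma)
    c = np.linalg.solve(K + lam * np.eye(m), y)   # ridge-like initial point
    for _ in range(n_iter):
        w = q / (np.abs(c) ** (2.0 - q) + eps)    # reweighting of the penalty
        A = K.T @ K / m + lam * np.diag(w)
        c = np.linalg.solve(A, K.T @ y / m)
    return c

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.uniform(-1.0, 1.0, size=(100, 1))
    y = np.sin(np.pi * X[:, 0]) + 0.1 * rng.standard_normal(100)
    c = lq_kernel_regression(X, y, q=0.5, lam=1e-3)
    print("fraction of non-zero coefficients:", np.mean(np.abs(c) > 1e-6))

In this sketch, increasing lam (or decreasing q) drives more coefficients toward zero, loosely illustrating the sparsity versus convergence-rate trade-off the abstract attributes to the regularization parameter.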

© JMLR 2019.