Wu, Qiang; Ying, Yiming; Zhou, Ding-Xuan
Learning rates of least-square regularized regression. (English) Zbl 1100.68100
Found. Comput. Math. 6, No. 2, 171-192 (2006).

Summary: This paper considers the regularized learning algorithm associated with the least-square loss and reproducing kernel Hilbert spaces. The target is the error analysis for the regression problem in learning theory. A novel regularization approach is presented, which yields satisfactory learning rates. The rates depend on the approximation property and on the capacity of the reproducing kernel Hilbert space, measured by covering numbers. When the kernel is \(C^{\infty}\) and the regression function lies in the corresponding reproducing kernel Hilbert space, the rate is \(m^{-\xi}\) with \(\xi\) arbitrarily close to 1, regardless of the variance of the bounded probability distribution.

Cited in 1 Review; Cited in 122 Documents

MSC:
68T05 Learning and adaptive systems in artificial intelligence
62J02 General nonlinear regression

Keywords: regularized learning algorithm; reproducing kernel Hilbert spaces

Cite: \textit{Q. Wu} et al., Found. Comput. Math. 6, No. 2, 171--192 (2006; Zbl 1100.68100)
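The algorithm class analyzed in the paper, least-square regularized regression in a reproducing kernel Hilbert space (kernel ridge regression), can be sketched as follows. This is an illustrative implementation only, not code from the paper: the Gaussian kernel (a \(C^{\infty}\) kernel, matching the smooth case of the stated rate), the bandwidth `sigma`, the regularization parameter `lam`, and the synthetic regression target are all assumed for the example.

```python
import numpy as np

def gaussian_kernel(X, Y, sigma=0.5):
    # Gaussian kernel: C-infinity, so it falls in the smoothness regime
    # where the paper's rate m^{-xi}, xi close to 1, applies.
    d2 = (X[:, None] - Y[None, :]) ** 2
    return np.exp(-d2 / (2 * sigma**2))

def regularized_least_squares(x, y, lam, sigma=0.5):
    # Minimize (1/m) * sum_i (f(x_i) - y_i)^2 + lam * ||f||_K^2 over the RKHS.
    # By the representer theorem, f(t) = sum_i c_i K(t, x_i), where the
    # coefficients solve the linear system (K + lam * m * I) c = y.
    m = len(x)
    K = gaussian_kernel(x, x, sigma)
    c = np.linalg.solve(K + lam * m * np.eye(m), y)
    return lambda t: gaussian_kernel(np.atleast_1d(t), x, sigma) @ c

# Synthetic sample of size m from a bounded distribution (assumed for the demo).
rng = np.random.default_rng(0)
m = 200
x = rng.uniform(-1, 1, m)
y = np.sin(np.pi * x) + 0.1 * rng.uniform(-1, 1, m)  # bounded noise
f_hat = regularized_least_squares(x, y, lam=1e-3)

grid = np.linspace(-1, 1, 101)
err = np.max(np.abs(f_hat(grid) - np.sin(np.pi * grid)))
print(f"sup-norm error of the estimator on a grid: {err:.3f}")
```

The learning-rate statement concerns how fast the error of `f_hat` decays as the sample size `m` grows (with `lam` chosen appropriately in terms of `m`); rerunning the sketch with increasing `m` and decreasing `lam` illustrates that decay empirically.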