
Learning theory estimates via integral operators and their approximations. (English) Zbl 1127.68088

Summary: The regression problem in learning theory is investigated with least squares Tikhonov regularization schemes in reproducing kernel Hilbert spaces (RKHS). We follow our previous work and apply the sampling operator to the error analysis in both the RKHS norm and the \(L^{2}\) norm. The tool for estimating the sample error is a Bennett inequality for random variables with values in Hilbert spaces. By taking the Hilbert space to be the one consisting of Hilbert-Schmidt operators on the RKHS, we improve the error bounds in the \(L^{2}\) metric, motivated by an idea of Caponnetto and De Vito. The error bounds we derive in the RKHS norm, together with a Tsybakov function we discuss here, yield interesting applications to the error analysis of the (binary) classification problem, since the RKHS metric controls the metric of uniform convergence.
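For orientation, the least squares Tikhonov regularization scheme referred to above takes the standard form sketched below; the notation is illustrative and may differ from the paper's: \(\mathbf{z}=\{(x_i,y_i)\}_{i=1}^{m}\) denotes the sample, \((\mathcal{H}_K,\|\cdot\|_K)\) the RKHS of the kernel \(K\), and \(\lambda>0\) the regularization parameter.
\[
f_{\mathbf{z},\lambda} \;=\; \arg\min_{f\in\mathcal{H}_K}\ \frac{1}{m}\sum_{i=1}^{m}\bigl(f(x_i)-y_i\bigr)^{2} \;+\; \lambda\,\|f\|_K^{2}.
\]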

MSC:

68T05 Learning and adaptive systems in artificial intelligence
94A20 Sampling theory in information and communication theory
42B10 Fourier and Fourier-Stieltjes transforms and other transforms of Fourier type
46E22 Hilbert spaces with reproducing kernels (= (proper) functional Hilbert spaces, including de Branges-Rovnyak and other structured spaces)
62H30 Classification and discrimination; cluster analysis (statistical aspects)