Shannon sampling. II: Connections to learning theory. (English) Zbl 1107.94008

This paper continues the authors’ earlier study [Bull. Am. Math. Soc., New Ser. 41, No. 3, 279–305 (2004; Zbl 1107.94004)]. The authors propose a reproducing kernel Hilbert space (RKHS) framework — the classical space of band-limited functions is itself an RKHS — for understanding function reconstruction from point evaluations and beyond. In this way the paper establishes, for the first time, a unified framework for sampling theory and learning theory.
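To illustrate the kind of reconstruction the RKHS framework covers, the following is a minimal sketch of regularized least-squares reconstruction from noisy point values (kernel ridge regression). The Gaussian kernel, the regularization parameter, and the target function are illustrative assumptions, not choices taken from the paper.

```python
import numpy as np

def gaussian_kernel(x, y, sigma=0.3):
    """Reproducing kernel K(x, y) = exp(-|x - y|^2 / (2 sigma^2))."""
    return np.exp(-(x[:, None] - y[None, :]) ** 2 / (2 * sigma ** 2))

def rkhs_reconstruct(x_samples, y_samples, x_eval, lam=1e-4, sigma=0.3):
    """Minimize (1/m) sum_i (f(x_i) - y_i)^2 + lam ||f||_K^2 over the RKHS.

    By the representer theorem the minimizer is f(x) = sum_i c_i K(x, x_i),
    where c solves the linear system (K + lam * m * I) c = y.
    """
    m = len(x_samples)
    K = gaussian_kernel(x_samples, x_samples, sigma)
    c = np.linalg.solve(K + lam * m * np.eye(m), y_samples)
    return gaussian_kernel(x_eval, x_samples, sigma) @ c

# Noisy point evaluations of a smooth target function (illustrative data).
rng = np.random.default_rng(0)
x = np.sort(rng.uniform(-1, 1, 40))
y = np.sin(np.pi * x) + 0.05 * rng.normal(size=40)

x_grid = np.linspace(-1, 1, 200)
f_hat = rkhs_reconstruct(x, y, x_grid)
```

Classical Shannon sampling corresponds to the special case where the kernel is the sinc kernel of the band-limited RKHS and the samples are exact; here the regularization term handles noisy, non-uniform samples.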


94A20 Sampling theory in information and communication theory
42B10 Fourier and Fourier-Stieltjes transforms and other transforms of Fourier type
46E22 Hilbert spaces with reproducing kernels (= (proper) functional Hilbert spaces, including de Branges-Rovnyak and other structured spaces)
68Q32 Computational learning theory
68T05 Learning and adaptive systems in artificial intelligence
68U10 Computing methodologies for image processing
41A05 Interpolation in approximation theory
62J05 Linear regression; mixed models




[1] Aldroubi, A.; Gröchenig, K., Non-uniform sampling and reconstruction in shift-invariant spaces, SIAM Rev., 43, 585-620 (2001) · Zbl 0995.42022
[2] Aronszajn, N., Theory of reproducing kernels, Trans. Amer. Math. Soc., 68, 337-404 (1950) · Zbl 0037.20701
[3] Cucker, F.; Smale, S., On the mathematical foundations of learning, Bull. Amer. Math. Soc., 39, 1-49 (2001) · Zbl 0983.68162
[4] Cucker, F.; Smale, S., Best choices for regularization parameters in learning theory, Found. Comput. Math., 2, 413-428 (2002) · Zbl 1057.68085
[5] De Vito, E.; Caponnetto, A.; Rosasco, L., Model selection for regularized least-squares algorithm in learning theory, Found. Comput. Math., 5, 59-85 (2005) · Zbl 1083.68106
[6] Engl, H. W.; Hanke, M.; Neubauer, A., Regularization of Inverse Problems, Mathematics and Its Applications, vol. 375 (1996), Kluwer Academic: Kluwer Academic Dordrecht · Zbl 0859.65054
[7] Evgeniou, T.; Pontil, M.; Poggio, T., Regularization networks and support vector machines, Adv. Comput. Math., 13, 1-50 (2000) · Zbl 0939.68098
[8] McDiarmid, C., Concentration, (Probabilistic Methods for Algorithmic Discrete Mathematics (1998), Springer-Verlag: Springer-Verlag Berlin), 195-248 · Zbl 0927.60027
[9] Niyogi, P., The Informational Complexity of Learning (1998), Kluwer Academic: Kluwer Academic Boston · Zbl 0976.68125
[10] Poggio, T.; Smale, S., The mathematics of learning: Dealing with data, Notices Amer. Math. Soc., 50, 537-544 (2003) · Zbl 1083.68100
[11] Smale, S.; Zhou, D. X., Estimating the approximation error in learning theory, Anal. Appl., 1, 17-41 (2003) · Zbl 1079.68089
[12] Smale, S.; Zhou, D. X., Shannon sampling and function reconstruction from point values, Bull. Amer. Math. Soc., 41, 279-305 (2004) · Zbl 1107.94007
[13] Unser, M., Sampling—50 years after Shannon, Proc. IEEE, 88, 569-587 (2000) · Zbl 1404.94028
[14] Vapnik, V., Statistical Learning Theory (1998), Wiley: Wiley New York · Zbl 0935.62007
[15] Wahba, G., Spline Models for Observational Data (1990), SIAM: SIAM Philadelphia · Zbl 0813.62001
[16] Ying, Y., McDiarmid inequalities of Bernstein and Bennett forms, Technical Report, City University of Hong Kong (2004)
[17] Young, R. M., An Introduction to Non-Harmonic Fourier Series (1980), Academic Press: Academic Press New York · Zbl 0493.42001
[18] Zhang, T., Leave-one-out bounds for kernel methods, Neural Comput., 15, 1397-1437 (2003) · Zbl 1085.68144