Convergence analysis of online algorithms. (English) Zbl 1129.68070

Summary: This paper analyzes regularized online learning algorithms associated with reproducing kernel Hilbert spaces. General conditions on the loss function and the step sizes are given that ensure convergence, and explicit learning rates are derived for particular choices of step size.
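To make the setting concrete, here is a minimal sketch (not taken from the paper) of one algorithm of this type: regularized online gradient descent for least-squares regression in an RKHS, using a Gaussian kernel and a polynomially decaying step size eta_t = eta1 * t^(-theta). All function names, the kernel choice, and the parameter values are illustrative assumptions, not the paper's specification.

```python
import numpy as np

def gaussian_kernel(x, z, sigma=1.0):
    # K(x, z) = exp(-|x - z|^2 / (2 sigma^2))
    return np.exp(-np.sum((x - z) ** 2) / (2 * sigma ** 2))

def online_rkhs_regression(samples, lam=0.1, eta1=0.5, theta=0.5, sigma=1.0):
    """Regularized online gradient descent for least squares in an RKHS.

    Update rule (illustrative):
        f_{t+1} = (1 - eta_t * lam) * f_t - eta_t * (f_t(x_t) - y_t) * K_{x_t},
    with step size eta_t = eta1 * t^(-theta).
    """
    centers, coefs = [], []
    for t, (x, y) in enumerate(samples, start=1):
        eta = eta1 * t ** (-theta)
        # Evaluate the current hypothesis f_t(x) = sum_i c_i K(x_i, x).
        fx = sum(c * gaussian_kernel(xi, x, sigma) for xi, c in zip(centers, coefs))
        # Shrink existing coefficients (effect of the regularization term) ...
        coefs = [(1.0 - eta * lam) * c for c in coefs]
        # ... and append a new kernel section scaled by the loss derivative.
        centers.append(x)
        coefs.append(-eta * (fx - y))
    def f(x):
        return sum(c * gaussian_kernel(xi, x, sigma) for xi, c in zip(centers, coefs))
    return f

# Toy usage: learn f(x) = x from a stream of noiseless samples.
rng = np.random.default_rng(0)
xs = rng.uniform(-1, 1, size=(200, 1))
stream = [(x, float(x[0])) for x in xs]
f_hat = online_rkhs_regression(stream)
```

The coefficient representation exploits that each gradient step adds one kernel section at the current sample, so the hypothesis always lies in the span of the kernel functions at the observed points; the multiplicative shrinkage implements the regularization term of the update.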

MSC:

68T05 Learning and adaptive systems in artificial intelligence
References:

[1] N. Aronszajn, Theory of reproducing kernels, Trans. Amer. Math. Soc. 68 (1950) 337–404. · Zbl 0037.20701
[2] P.L. Bartlett, M.I. Jordan and J.D. McAuliffe, Convexity, classification, and risk bounds, Preprint, Department of Statistics, University of California Berkeley, 2003. · Zbl 1118.62330
[3] S. Boyd and L. Vandenberghe, Convex Optimization (Cambridge: Cambridge University Press, 2004). · Zbl 1058.90049
[4] D.R. Chen, Q. Wu, Y.M. Ying and D.X. Zhou, Support vector machine soft margin classifiers: error analysis, J. Mach. Learn. Res. 5 (2004) 1143–1175. · Zbl 1222.68167
[5] N. Cesa-Bianchi, P. Long and M. Warmuth, Worst-case quadratic loss bounds for prediction using linear functions and gradient descent, IEEE Trans. Neural Netw. 7 (1996) 604–619.
[6] N. Cristianini and J. Shawe-Taylor, An Introduction to Support Vector Machines and Other Kernel-based Learning Methods (Cambridge: Cambridge University Press, 2000). · Zbl 0994.68074
[7] F. Cucker and S. Smale, On the mathematical foundations of learning, Bull. Amer. Math. Soc. 39 (2002) 1–49. · Zbl 0983.68162
[8] L. Devroye, L. Györfi and G. Lugosi, A Probabilistic Theory of Pattern Recognition (Berlin Heidelberg New York: Springer, 1997).
[9] T. Evgeniou, M. Pontil and T. Poggio, Regularization networks and support vector machines, Adv. Comput. Math. 13 (2000) 1–50. · Zbl 0939.68098
[10] J. Kivinen, A.J. Smola and R.C. Williamson, Online learning with kernels, IEEE Trans. Signal Process. 52 (2004) 2165–2176. · Zbl 1369.68281
[11] G. Blanchard, G. Lugosi and N. Vayatis, On the rate of convergence of regularized boosting classifiers, J. Mach. Learn. Res. 4 (2003) 861–894. · Zbl 1083.68109
[12] P. Niyogi and F. Girosi, On the relationships between generalization error, hypothesis complexity and sample complexity for radial basis functions, Neural Comput. 8 (1996) 819–842. · Zbl 05477968
[13] C. Scovel and I. Steinwart, Fast rates for support vector machines, Los Alamos National Laboratory Technical Report, 2005. · Zbl 1137.68564
[14] S. Smale and Y. Yao, Online learning algorithms, Preprint, Department of Mathematics, University of California Berkeley, 2004. · Zbl 1119.68098
[15] S. Smale and D.X. Zhou, Estimating the approximation error in learning theory, Anal. Appl. 1 (2003) 17–41. · Zbl 1079.68089
[16] S. Smale and D.X. Zhou, Shannon sampling and function reconstruction from point values, Bull. Amer. Math. Soc. 41 (2004) 279–305. · Zbl 1107.94007
[17] S. Smale and D.X. Zhou, Shannon sampling II: Connection to learning theory, Preprint, 2004. · Zbl 1107.94008
[18] V. Vapnik, Statistical Learning Theory (New York: John Wiley & Sons, 1998). · Zbl 0935.62007
[19] Q. Wu, Y. Ying and D.X. Zhou, Multi-kernel Regularized Classifiers, Submitted to J. Complexity, Department of Mathematics, City University of Hong Kong, 2004. · Zbl 1171.65043
[20] Y. Ying and D.X. Zhou, Learnability of Gaussians with flexible variances, Preprint, Department of Mathematics, City University of Hong Kong, 2004. · Zbl 1222.68339
[21] Y. Ying and D.X. Zhou, Online regularized classification algorithms, Preprint, 2005. · Zbl 1323.68450
[22] T. Zhang, Statistical behavior and consistency of classification methods based on convex risk minimization, Ann. Statist. 32 (2004) 56–85. · Zbl 1105.62323
[23] D.X. Zhou, The covering number in learning theory, J. Complexity 18 (2002) 739–767. · Zbl 1016.68044
[24] D.X. Zhou, Capacity of reproducing kernel spaces in learning theory, IEEE Trans. Inf. Theory 49 (2003) 1743–1752. · Zbl 1290.62033