The optimal solution of multi-kernel regularization learning. (English) Zbl 1278.68111

Multi-kernel regularization schemes provide flexibility and better learning ability in some applications, and they are also crucial for the problem of learning the kernel. It was shown by Y. Ying and D.-X. Zhou [J. Mach. Learn. Res. 8, 249–276 (2007; Zbl 1222.68339)] that the union of the unit balls of reproducing kernel Hilbert spaces generated by Gaussian kernels with flexible variances is learnable; that paper also discussed the associated optimization problem when the variance runs over a compact set. In the paper under review, the authors consider the optimization problem when the variance runs over the whole set of positive numbers, which is not compact. They present sufficient conditions for the existence of an optimal solution to the least squares regularized regression scheme associated with Gaussian kernels with flexible variances.
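For orientation, the scheme in question can be sketched as follows (the notation is illustrative and the paper's exact normalization may differ): given a sample $\mathbf z=\{(x_i,y_i)\}_{i=1}^m$ and a regularization parameter $\lambda>0$, one minimizes jointly over the variance and the function,
\[
  \min_{\sigma>0}\ \min_{f\in\mathcal H_{K_\sigma}}
  \Big\{ \frac1m \sum_{i=1}^m \big(f(x_i)-y_i\big)^2 \;+\; \lambda\,\|f\|_{K_\sigma}^2 \Big\},
  \qquad
  K_\sigma(x,t)=\exp\!\Big(-\frac{|x-t|^2}{\sigma^2}\Big),
\]
where $\mathcal H_{K_\sigma}$ is the reproducing kernel Hilbert space of the Gaussian kernel $K_\sigma$. Since the parameter set $(0,\infty)$ for $\sigma$ is not compact, the outer minimum need not be attained, which is why sufficient conditions for the existence of an optimal solution are of interest.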


68Q32 Computational learning theory
68T05 Learning and adaptive systems in artificial intelligence
62J02 General nonlinear regression


Zbl 1222.68339


[1] Aronszajn, N.: Theory of reproducing kernels. Trans. Amer. Math. Soc., 68, 337–404 (1950) · Zbl 0037.20701 · doi:10.1090/S0002-9947-1950-0051437-7
[2] Bousquet, O., Elisseeff, A.: Stability and generalization. J. Mach. Learn. Res., 2, 499–526 (2002) · Zbl 1007.68083
[3] Carmeli, C., De Vito, E., Toigo, A., et al.: Vector valued reproducing kernel Hilbert spaces and universality. Anal. Appl., 8, 19–61 (2010) · Zbl 1195.46025 · doi:10.1142/S0219530510001503
[4] Chapelle, O., Vapnik, V., Bousquet, O., et al.: Choosing multiple parameters for support vector machines. Machine Learning, 46, 131–159 (2002) · Zbl 0998.68101 · doi:10.1023/A:1012450327387
[5] Cucker, F., Smale, S.: On the mathematical foundations of learning. Bull. Amer. Math. Soc., 39, 1–49 (2001) · Zbl 0983.68162 · doi:10.1090/S0273-0979-01-00923-5
[6] Douglas, R. G.: Banach Algebra Techniques in Operator Theory, Springer, New York, 1998 · Zbl 0920.47001
[7] Evgeniou, T., Pontil, M.: Regularized multi-task learning. Proceedings of the tenth ACM SIGKDD international conference on Knowledge discovery and data mining, Seattle, USA, 2004, 109–117
[8] Evgeniou, T., Pontil, M., Poggio, T.: Regularization networks and support vector machines. Adv. Comput. Math., 13, 1–50 (2000) · Zbl 0939.68098 · doi:10.1023/A:1018946025316
[9] Hu, T.: Online regression with varying Gaussians and non-identical distributions. Anal. Appl., 9, 395–408 (2011) · Zbl 1253.68189 · doi:10.1142/S0219530511001923
[10] Lanckriet, G. R. G., Cristianini, N., Bartlett, P., et al.: Learning the kernel matrix with semidefinite programming. J. Mach. Learn. Res., 5, 27–72 (2004) · Zbl 1222.68241
[11] Li, J., Barron, A.: Mixture density estimation. Advances in Neural Information Processing Systems (S. A. Solla, T. K. Leen, K. R. Muller eds.), 12, The MIT Press, Cambridge, 2000
[12] Micchelli, C. A., Pontil, M.: Learning the kernel function via regularization. J. Mach. Learn. Res., 6, 1099–1125 (2005) · Zbl 1222.68265
[13] Rakhlin, A., Panchenko, D., Mukherjee, S.: Risk bounds for mixture density estimation. ESAIM: Probability and Statistics, 9, 222–229 (2005) · Zbl 1141.62024 · doi:10.1051/ps:2005011
[14] Smale, S., Zhou, D. X.: Learning theory estimates via integral operators and their approximations. Constr. Approx., 26, 153–172 (2007) · Zbl 1127.68088 · doi:10.1007/s00365-006-0659-y
[15] Smale, S., Zhou, D. X.: Online learning with Markov sampling. Anal. Appl., 7, 87–113 (2009) · Zbl 1170.68022 · doi:10.1142/S0219530509001293
[16] Wu, Q., Ying, Y. M., Zhou, D. X.: Learning rates of least-square regularized regression. Found. Comput. Math., 6, 171–192 (2006) · Zbl 1100.68100 · doi:10.1007/s10208-004-0155-9
[17] Ying, Y. M., Zhou, D. X.: Learnability of Gaussians with flexible variances. J. Mach. Learn. Res., 8, 249–276 (2007) · Zbl 1222.68339