zbMATH — the first resource for mathematics

Multiple kernel learning by empirical target kernel. (English) Zbl 1435.68285
Summary: Multiple kernel learning (MKL) aims to learn an optimal combination of base kernels with which an appropriate hypothesis is determined on the training data. MKL owes its flexibility to automated kernel learning, and it also reflects the fact that typical learning problems often involve multiple, heterogeneous data sources. The target kernel is a central component of many MKL methods, which determine the kernel weights by maximizing the similarity, or alignment, between the weighted kernel combination and the target kernel. Existing target kernels are defined globally, which (1) assigns the same target value to closer and farther sample pairs, inappropriately neglecting the variation among samples; and (2) makes the target kernel independent of the training data and therefore hard to approximate with the base kernels. As a result, maximizing similarity to a global target kernel can leave the pre-specified base kernels underutilized, reducing classification performance. In this paper, instead of defining a global target kernel, a localized target value is computed for each sample pair from the training data, which is flexible and handles sample variation well. A new target kernel, named the empirical target kernel, is proposed to implement this idea, and three corresponding algorithms are designed to exploit it efficiently. Experiments on four challenging MKL problems show that the proposed algorithms outperform competing methods, verifying their effectiveness and superiority.
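The alignment-maximization idea the summary describes can be illustrated with a minimal sketch. The snippet below is not the paper's method (the proposed empirical target kernel is localized per sample pair); it shows only the classical global setup the paper argues against, assuming the ideal target kernel \(yy^T\) built from labels and kernel-target alignment as the similarity measure. All function names and parameters here are illustrative choices, not taken from the paper.

```python
import numpy as np

def alignment(K1, K2):
    """Kernel-target alignment: Frobenius inner product of K1 and K2,
    normalized by their Frobenius norms."""
    return np.sum(K1 * K2) / (np.linalg.norm(K1) * np.linalg.norm(K2))

def rbf(X, gamma):
    """Gaussian (RBF) base kernel with width parameter gamma."""
    sq = np.sum(X ** 2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-gamma * d2)

# Toy binary-classification data with labels in {-1, +1}.
rng = np.random.default_rng(0)
X = rng.normal(size=(20, 2))
y = np.sign(X[:, 0])

# Several base kernels of different widths, and the global ideal
# target kernel yy^T that assigns +1 to same-class pairs and -1 to
# different-class pairs regardless of how near or far they are.
base_kernels = [rbf(X, g) for g in (0.1, 1.0, 10.0)]
target = np.outer(y, y)

# A simple two-stage heuristic: weight each base kernel by its own
# (nonnegative) alignment to the target, then combine.
w = np.array([max(alignment(K, target), 0.0) for K in base_kernels])
w /= w.sum()
K_comb = sum(wi * Ki for wi, Ki in zip(w, base_kernels))

print("weights:", w)
print("alignment of combination to target:", alignment(K_comb, target))
```

Note how `target` takes the same value for every same-class pair; this is exactly the sample-variation-blind behavior the paper's localized empirical target kernel is designed to avoid.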
Reviewer: Reviewer (Berlin)
MSC:
68T05 Learning and adaptive systems in artificial intelligence
46E22 Hilbert spaces with reproducing kernels (= (proper) functional Hilbert spaces, including de Branges-Rovnyak and other structured spaces)
47B32 Linear operators in reproducing-kernel Hilbert spaces (including de Branges, de Branges-Rovnyak, and other structured spaces)
62H30 Classification and discrimination; cluster analysis (statistical aspects)
62J02 General nonlinear regression