
TSVR: an efficient twin support vector machine for regression. (English) Zbl 1396.68102

Summary: The learning speed of classical Support Vector Regression (SVR) is low, since it requires minimizing a convex quadratic function subject to two groups of linear inequality constraints covering all training samples. In this paper we propose Twin Support Vector Regression (TSVR), a novel regressor that determines a pair of \(\epsilon\)-insensitive up- and down-bound functions by solving two related SVM-type problems, each of which is smaller than the single large problem in classical SVR. The TSVR formulation follows the spirit of the Twin Support Vector Machine (TSVM), which works with two nonparallel planes. Experimental results on several artificial and benchmark datasets indicate that the proposed TSVR is not only fast, but also shows good generalization performance.
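The twin idea of fitting separate up- and down-bound functions and combining them into the final regressor can be illustrated with a minimal sketch. This is a simplified least-squares stand-in, not the paper's actual formulation: TSVR solves two SVM-type quadratic programs with \(\epsilon\)-insensitive inequality constraints, whereas here the two bound functions are obtained by ridge regression on responses shifted by \(\pm\epsilon\); the function names and parameters are illustrative.

```python
import numpy as np

def twin_bounds_fit(X, y, eps=0.1, reg=1e-3):
    """Fit linear up- and down-bound functions to (X, y).

    Simplified stand-in for TSVR: each bound is a ridge-regression fit
    to the response shifted up (y + eps) or down (y - eps), instead of
    the paper's two epsilon-insensitive SVM-type dual problems.
    """
    A = np.hstack([X, np.ones((X.shape[0], 1))])   # augment with bias column
    G = A.T @ A + reg * np.eye(A.shape[1])         # regularized normal matrix
    w_up = np.linalg.solve(G, A.T @ (y + eps))     # up-bound function weights
    w_down = np.linalg.solve(G, A.T @ (y - eps))   # down-bound function weights
    return w_up, w_down

def twin_predict(X, w_up, w_down):
    """Final regressor: the mean of the two bound functions."""
    A = np.hstack([X, np.ones((X.shape[0], 1))])
    return 0.5 * (A @ w_up + A @ w_down)

# Toy 1-D example: noisy line y = 2x + 0.5
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(200, 1))
y = 2.0 * X[:, 0] + 0.5 + 0.05 * rng.normal(size=200)
w_up, w_down = twin_bounds_fit(X, y)
pred = twin_predict(X, w_up, w_down)
print(float(np.mean((pred - y) ** 2)))  # small residual MSE on the toy data
```

Because the two shifted fits are symmetric about the data, their average recovers the central regression function; in the actual TSVR, each bound instead comes from its own smaller constrained problem, which is what yields the speed advantage over a single large SVR problem.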

MSC:

68T05 Learning and adaptive systems in artificial intelligence
