
Multi-variable regression methods using modified Chebyshev polynomials of class 2. (English) Zbl 1452.62487

Summary: To date, much regression work has been carried out with linear regression methods. Although polynomial regression can yield more accurate predictions, it is used far less often than linear regression in real applications because of coefficient explosion. To overcome this problem, two regression algorithms using Chebyshev polynomials of class 2, based on cascade regression and feature selection, are proposed in this paper. In the experimental part, three experiments, covering function interpolation and real-case regression, were conducted on three datasets to test the proposed algorithms. The experimental results show that the proposed algorithms outperform other regression methods in terms of both accuracy and processing time.
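
A minimal sketch of the core building block, least-squares regression on a basis of Chebyshev polynomials of class 2 (second kind), assuming a single input variable already scaled into [-1, 1]. This is not the paper's cascade or feature-selection algorithm; the helper names chebyshev_u_features and fit_chebyshev_regression are chosen here purely for illustration.

```python
import numpy as np

def chebyshev_u_features(x, degree):
    """Design matrix of Chebyshev polynomials of the second kind U_0..U_degree.

    Assumes x is scaled into [-1, 1]; uses the recurrence
    U_0(x) = 1, U_1(x) = 2x, U_k(x) = 2x*U_{k-1}(x) - U_{k-2}(x).
    """
    x = np.asarray(x, dtype=float)
    cols = [np.ones_like(x)]
    if degree >= 1:
        cols.append(2.0 * x)
    for _ in range(2, degree + 1):
        cols.append(2.0 * x * cols[-1] - cols[-2])
    return np.column_stack(cols)

def fit_chebyshev_regression(x, y, degree=5):
    """Ordinary least-squares fit of y on the Chebyshev-U design matrix."""
    A = chebyshev_u_features(x, degree)
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    return coef

def predict(coef, x):
    return chebyshev_u_features(x, len(coef) - 1) @ coef

if __name__ == "__main__":
    # Toy function-interpolation example in the spirit of the paper's first experiment.
    rng = np.random.default_rng(0)
    x = np.linspace(-1.0, 1.0, 200)
    y = np.sin(3.0 * x) + 0.05 * rng.standard_normal(x.shape)
    coef = fit_chebyshev_regression(x, y, degree=7)
    print("max abs fit error:", np.abs(predict(coef, x) - y).max())
```

The paper's algorithms extend this basic fit to several input variables, keeping the number of coefficients manageable through cascading and feature selection as described in the summary above.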

MSC:

62J02 General nonlinear regression
62-08 Computational methods for problems pertaining to statistics

Software:

UCI-ml

References:

[1] Hoel, P. G., A simple solution for optimal Chebyshev regression extrapolation, Ann. Math. Statist., 37, 720-725, (1966) · Zbl 0151.23703
[2] Celant, G.; Broniatowski, M., Interpolation and extrapolation optimal designs V1: polynomial regression and approximation theory, (2016), John Wiley & Sons · Zbl 1343.65008
[3] Broniatowski, M.; Celant, G., Optimal design for linear forms of the parameters in a Chebyshev regression, (2014) · Zbl 1343.65008
[4] Broniatowski, M.; Celant, G., Optimal extrapolation design for the Chebyshev regression, (2014) · Zbl 1343.65008
[5] Caporale, G. M.; Cerrato, M., Using Chebyshev polynomials to approximate partial differential equations, Comput. Econ., 35, 235-244, (2010) · Zbl 1182.93119
[6] Shuman, D. I.; Vandergheynst, P.; Frossard, P., Chebyshev polynomial approximation for distributed signal processing, (2011 Int. Conf. on Distrib. Comput. in Sens. Syst. and Workshops, (DCOSS), (2011), IEEE), 1-8
[7] Wu, J.; Luo, Z.; Zheng, J.; Jiang, C., Incremental modeling of a new high-order polynomial surrogate model, Appl. Math. Model., 40, 4681-4699, (2016)
[8] Zhang, Y.; Yu, X.; Guo, D.; Yin, Y.; Zhang, Z., Weights and structure determination of multiple-input feed-forward neural network activated by Chebyshev polynomials of class 2 via cross-validation, Neural Comput. Appl., 25, 1761-1770, (2014)
[9] Zhang, Y.; Yin, Y.; Guo, D.; Yu, X.; Xiao, L., Cross-validation based weights and structure determination of Chebyshev-polynomial neural networks for pattern classification, Pattern Recognit., 47, 3414-3428, (2014) · Zbl 1373.68362
[10] Zhao, J.; Yan, G.; Feng, B.; Mao, W.; Bai, J., An adaptive support vector regression based on a new sequence of unified orthogonal polynomials, Pattern Recognit., 46, 899-913, (2013) · Zbl 1254.68235
[11] Li, Y.; Luo, M.-L.; Li, K., A multiwavelet-based time-varying model identification approach for time-frequency analysis of EEG signals, Neurocomputing, 193, 106-114, (2016)
[12] Mitra, R.; Bhatia, V., Chebyshev polynomial-based adaptive predistorter for nonlinear LED compensation in VLC, IEEE Photonics Technol. Lett., 28, 1053-1056, (2016)
[13] Sorsa, A.; Leiviskä, K.; Santa-aho, S.; Lepistö, T., Quantitative prediction of residual stress and hardness in case-hardened steel based on the Barkhausen noise measurement, NDT & E Int., 46, 100-106, (2012)
[14] Altpeter, I.; Becker, R.; Dobmann, G.; Kern, R.; Theiner, W.; Yashan, A., Robust solutions of inverse problems in electromagnetic non-destructive evaluation, Inverse Problems, 18, 1907, (2002) · Zbl 1092.78509
[15] Centner, V.; Massart, D.-L.; de Noord, O. E.; de Jong, S.; Vandeginste, B. M.; Sterna, C., Elimination of uninformative variables for multivariate calibration, Anal. Chem., 68, 3851-3858, (1996)
[16] Araújo, M. C.U.; Saldanha, T. C.B.; Galvao, R. K.H.; Yoneyama, T.; Chame, H. C.; Visani, V., The successive projections algorithm for variable selection in spectroscopic multicomponent analysis, Chemom. Intell. Lab. Syst., 57, 65-73, (2001)
[17] Battiti, R., Using mutual information for selecting features in supervised neural net learning, IEEE Trans. Neural Netw., 5, 537-550, (1994)
[18] Peng, H.; Long, F.; Ding, C., Feature selection based on mutual information criteria of max-dependency, max-relevance, and min-redundancy, IEEE Trans. Pattern Anal. Mach. Intell., 27, 1226-1238, (2005)
[19] Zugasti, E.; Mujica, L. E.; Anduaga, J.; Martinez, F., Feature selection-extraction methods based on PCA and mutual information to improve damage detection problem in offshore wind turbines, (Key Eng. Mater., Vol. 569, (2013), Trans Tech Publ), 620-627
[20] Wang, Z.; Li, M.; Li, J., A multi-objective evolutionary algorithm for feature selection based on mutual information with a new redundancy measure, Inform. Sci., 307, 73-88, (2015) · Zbl 1387.68200
[21] Cao, X.; Wei, Y.; Wen, F.; Sun, J., Face alignment by explicit shape regression, Int. J. Comput. Vis., 107, 177-190, (2014)
[22] Dollár, P.; Welinder, P.; Perona, P., Cascaded pose regression, (2010 IEEE Conf. on Comput. Vis. and Pattern Recognit., (CVPR), (2010), IEEE), 1078-1085
[23] Xiong, X.; De la Torre, F., Supervised descent method for solving nonlinear least squares problems in computer vision, (2014), arXiv preprint arXiv:1405.0601
[24] Xiong, X.; De la Torre, F., Supervised descent method and its applications to face alignment, (Proc. of the IEEE Conf. on Comput. Vis. and Pattern Recognit., (2013)), 532-539
[25] Tzimiropoulos, G., Project-out cascaded regression with an application to face alignment, (2015 IEEE Conf. on Comput. Vis. and Pattern Recognit., (CVPR), (2015), IEEE), 3659-3667
[26] Ren, S.; Cao, X.; Wei, Y.; Sun, J., Face alignment at 3000 fps via regressing local binary features, (Proc. of the IEEE Conf. on Comput. Vis. and Pattern Recognit., (2014)), 1685-1692
[27] Zhou, M.; Wang, X.; Wang, H.; Heo, J.; Nam, D., Precise eye localization with improved SDM, (2015 IEEE Int. Conf. on Image Process., (ICIP), (2015), IEEE), 4466-4470
[28] Sorsa, A.; Isokangas, A.; Santa-aho, S.; Vippola, M.; Lepistö, T.; Leiviskä, K., Prediction of residual stresses using partial least squares regression on Barkhausen noise signals, J. Nondestructive Eval., 33, 43-50, (2014)
[29] Bastien, P.; Vinzi, V. E.; Tenenhaus, M., PLS generalised linear regression, Comput. Statist. Data Anal., 48, 17-46, (2005) · Zbl 1429.62316
[30] Liang, X.-Z.; Li, Q., Multivariate approximation, 21-45, (2005), National Defense Industry Press Beijing
[31] Frank, A.; Asuncion, A., UCI machine learning repository, http://archive.ics.uci.edu/ml