
Improving weighted information criterion by using optimization. (English) Zbl 1181.62155

Summary: Although artificial neural networks (ANNs) have been widely used for forecasting time series, selecting the best model remains a widely studied problem. In recent years, various approaches have been proposed for choosing the best forecasting model in ANNs. One of these is a model selection strategy based on the weighted information criterion (WIC), which is computed as a weighted sum of different selection criteria, each measuring the forecasting accuracy of an ANN model in a different way. In the original formulation, the weights of the individual criteria are determined heuristically.
In this study, the weights are instead obtained by optimization in order to produce a more consistent criterion. Four real time series are analyzed to demonstrate the efficiency of the improved WIC. When the weights are determined by optimization, the improved WIC clearly produces better results.
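The WIC idea described above can be sketched in a few lines: each candidate model gets a weighted sum of several selection criteria, and the weights are chosen by optimization rather than fixed heuristically. The toy criterion values, model names, and the "maximize the gap between the best and second-best model" objective below are illustrative assumptions, not the paper's exact formulation.

```python
# Sketch of a weighted information criterion (WIC):
#   WIC(model) = sum_i w_i * c_i(model)
# with the weights w_i found by optimization over the weight simplex.
# All data and the objective here are hypothetical stand-ins.
from itertools import product

# Normalized selection-criterion values (e.g. RMSE, MAE, AIC) for
# three hypothetical ANN candidates; lower is better.
criteria = {
    "model_A": (0.20, 0.30, 0.25),
    "model_B": (0.35, 0.10, 0.40),
    "model_C": (0.50, 0.55, 0.45),
}

def wic(values, weights):
    """WIC as a weighted sum of the individual criteria."""
    return sum(w * v for w, v in zip(weights, values))

def optimize_weights(criteria, step=0.1):
    """Grid search over the weight simplex (w_i >= 0, sum w_i = 1),
    maximizing how clearly WIC separates the best model from the
    runner-up -- one possible stand-in objective for the paper's
    optimization step."""
    grid = [i * step for i in range(int(round(1 / step)) + 1)]
    best_gap, best_w = -1.0, None
    for w1, w2 in product(grid, repeat=2):
        w3 = 1.0 - w1 - w2
        if w3 < -1e-9:          # outside the simplex
            continue
        w = (w1, w2, max(w3, 0.0))
        scores = sorted(wic(v, w) for v in criteria.values())
        gap = scores[1] - scores[0]
        if gap > best_gap:
            best_gap, best_w = gap, w
    return best_w

weights = optimize_weights(criteria)
winner = min(criteria, key=lambda m: wic(criteria[m], weights))
```

With equal (heuristic) weights every criterion counts the same; letting an optimizer pick the weights, as in the study, can change which model the combined criterion selects.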

MSC:

62M45 Neural nets and related approaches to inference from stochastic processes
62M20 Inference from stochastic processes and prediction
90C90 Applications of mathematical programming
62M10 Time series, auto-correlation, regression, etc. in statistics (GARCH)
