

A note on the validity of cross-validation for evaluating autoregressive time series prediction. (English) Zbl 06920205
Summary: One of the most widely used standard procedures for model evaluation in classification and regression is \(K\)-fold cross-validation (CV). However, when it comes to time series forecasting, because of the inherent serial correlation and potential non-stationarity of the data, its application is not straightforward, and practitioners often replace it with an out-of-sample (OOS) evaluation. It is shown that for purely autoregressive models, the use of standard \(K\)-fold CV is possible provided the models considered have uncorrelated errors. Such a setup occurs, for example, when the models nest a more appropriate model, i.e., they contain the true data-generating model as a special case. This is very common when machine learning methods are used for prediction, where CV can also control for overfitting the data. Theoretical insights supporting these arguments are presented, along with a simulation study and a real-world example. It is shown empirically that \(K\)-fold CV performs favourably compared to both OOS evaluation and other time-series-specific techniques such as non-dependent cross-validation.
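
To make the argument concrete, the following is a minimal sketch (not the authors' code) under illustrative assumptions: an AR(2) series is simulated, a nesting AR(4) model is fitted by least squares on the lagged design matrix, and the error estimate from standard shuffled \(K\)-fold CV is compared with a single out-of-sample split. The coefficients, lag order, fold count, and series length are arbitrary choices for illustration and do not reproduce the paper's simulation design.

import numpy as np

rng = np.random.default_rng(0)

def simulate_ar(phi, n, burn=200):
    # Simulate a stationary AR(p) series with unit-variance Gaussian innovations.
    p = len(phi)
    x = np.zeros(n + burn)
    for t in range(p, n + burn):
        x[t] = np.dot(phi, x[t - p:t][::-1]) + rng.standard_normal()
    return x[burn:]

def embed(x, p):
    # Lagged design matrix: the row for time t holds (x_{t-1}, ..., x_{t-p}); the target is x_t.
    X = np.column_stack([x[p - j: len(x) - j] for j in range(1, p + 1)])
    y = x[p:]
    return X, y

def cv_mse(X, y, k=5):
    # Standard K-fold CV with randomly shuffled folds; valid for a purely
    # autoregressive model whose errors are uncorrelated (e.g. a nesting model).
    idx = rng.permutation(len(y))
    folds = np.array_split(idx, k)
    errors = []
    for fold in folds:
        train = np.setdiff1d(idx, fold)
        beta, *_ = np.linalg.lstsq(X[train], y[train], rcond=None)
        errors.append(np.mean((y[fold] - X[fold] @ beta) ** 2))
    return np.mean(errors)

def oos_mse(X, y, frac=0.2):
    # Out-of-sample evaluation: fit on the first 80% of the series, test on the last 20%.
    cut = int(len(y) * (1 - frac))
    beta, *_ = np.linalg.lstsq(X[:cut], y[:cut], rcond=None)
    return np.mean((y[cut:] - X[cut:] @ beta) ** 2)

x = simulate_ar([0.5, -0.3], n=500)   # AR(2) data
X, y = embed(x, p=4)                  # evaluated with a nesting AR(4) model
print("K-fold CV MSE:", cv_mse(X, y))
print("OOS MSE      :", oos_mse(X, y))
# Both estimates target the innovation variance (1.0 here) on average.

When the fitted model is too small to nest the data-generating process, its errors remain serially correlated, and that is precisely the situation in which standard \(K\)-fold CV can give misleading estimates.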

MSC:
62 Statistics
Software:
expsmooth; forecastML; R