Hurvich, Clifford M.; Tsai, Chih-Ling
Regression and time series model selection in small samples. (English) Zbl 0669.62085
Biometrika 76, No. 2, 297-307 (1989).

The problems of regression and autoregressive model selection are closely related. One of the leading model selection methods is the Akaike information criterion (AIC), but it tends to overfit the model, which becomes evident when one examines plots of AIC and of the actual Kullback-Leibler information for the various candidate models. In this paper a bias-corrected version of AIC for nonlinear regression and autoregressive time series models is obtained. Monte Carlo results for linear regression model selection and for autoregressive model selection are presented. Among eight criteria (including Mallows' \(C_p\), the AIC criterion, the corrected AIC criterion \(AIC_C\), and a criterion of Schwarz), \(AIC_C\) had the largest percentage of correct selections (96% and 88% for samples of size 10 and 20, respectively, in the linear regression case). \(AIC_C\) thus seems preferable, especially for small samples.

Reviewer: D. Rasch

Cited in 6 Reviews; cited in 365 Documents.

MSC:
62M10 Time series, auto-correlation, regression, etc. in statistics (GARCH)
62J05 Linear regression; mixed models

Keywords: asymptotic efficiency; Akaike information criterion; Kullback-Leibler information; bias-corrected version of AIC; autoregressive time series models; Monte Carlo results; linear regression model selection; autoregressive model selection

Cite: \textit{C. M. Hurvich} and \textit{C.-L. Tsai}, Biometrika 76, No. 2, 297--307 (1989; Zbl 0669.62085)