zbMATH — the first resource for mathematics

On predictive least squares principles. (English) Zbl 0801.62083
The paper considers various principles of regression model selection. The recently proposed PLS (predictive least squares) criterion is shown to decompose into the residual sum of squares and a penalty term (accounting for the complexity of the model), and an asymptotic statistical interpretation of the penalty term is offered. Conditions for the strong consistency of PLS for stochastic regression models are presented and illustrated by the examples of a fixed design model and of an unstable autoregressive process. Strong consistency is also established for BIC (an information-based criterion which, like the Akaike criterion, minimizes a loss function given by the sum of the residual sum of squares and a penalty term), and the equivalence (up to \(o(1)\) a.s.) between PLS and BIC is derived. Further, the numerical performance of PLS is studied, and its advantages and disadvantages are indicated. This leads to the proposal of a new criterion, FIC (based on Fisher information), whose features and relationship with PLS are explained. The paper is accompanied by a simulation study showing the advantages of FIC.
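The penalized least-squares structure common to these criteria can be illustrated by a minimal sketch in which BIC, written as \(n\log(\mathrm{RSS}_k/n) + k\log n\), selects the order of an autoregressive model fitted by least squares. The AR(2) data-generating process, sample size, and candidate orders below are illustrative assumptions, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative assumption: simulate an AR(2) process
# x_t = 0.6 x_{t-1} - 0.3 x_{t-2} + e_t  with standard normal noise.
n = 500
x = np.zeros(n)
for t in range(2, n):
    x[t] = 0.6 * x[t - 1] - 0.3 * x[t - 2] + rng.standard_normal()

def bic_ar(x, k):
    """Fit AR(k) by least squares; return BIC = m*log(RSS/m) + k*log(m)."""
    m = len(x) - k                      # number of usable observations
    # Column j holds the lag-(j+1) values x_{t-j-1} for t = k, ..., n-1.
    X = np.column_stack([x[k - j - 1 : len(x) - j - 1] for j in range(k)])
    y = x[k:]
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    rss = np.sum((y - X @ beta) ** 2)   # residual sum of squares
    return m * np.log(rss / m) + k * np.log(m)

# Score candidate orders and pick the minimizer.
orders = range(1, 6)
scores = {k: bic_ar(x, k) for k in orders}
best = min(scores, key=scores.get)
```

The first term rewards goodness of fit through the residual sum of squares, while the \(k\log n\) penalty grows with model complexity; this trade-off is the mechanism behind the strong-consistency results discussed in the review.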

62M10 Time series, auto-correlation, regression, etc. in statistics (GARCH)
62M20 Inference from stochastic processes and prediction
62J05 Linear regression; mixed models
Full Text: DOI