*(English)* Zbl 1037.62001

Summary: Our purpose in this paper is to provide a general approach to model selection via penalization for Gaussian regression and to develop our point of view on this subject. The advantage and importance of model selection come from the fact that it offers a suitable approach to many different types of problems, ranging from model selection per se (deciding which of a family of parametric models best fits the data at hand), which includes for instance variable selection in regression models, to nonparametric estimation, for which it provides a very powerful tool allowing adaptation under quite general circumstances.

Our approach to model selection also provides a natural connection between the parametric and nonparametric points of view and copes naturally with the fact that a model is not necessarily true. The method is based on penalizing a least squares criterion and can be viewed as a generalization of Mallows' $C_p$. A large part of our effort is devoted to choosing the list of models and the penalty function appropriately for various estimation problems, such as classical variable selection or adaptive estimation over various types of $\ell_p$-bodies.
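To make the penalized least-squares idea concrete, here is a minimal NumPy sketch of classical variable selection via Mallows' $C_p$, the criterion the paper's penalties generalize. The toy data, variable names, and the exhaustive subset search are illustrative assumptions, not the authors' procedure: a subset $S$ is scored by $C_p = \mathrm{RSS}_S/\hat\sigma^2 - n + 2|S|$, and the subset minimizing this penalized criterion is selected.

```python
import itertools
import numpy as np

rng = np.random.default_rng(0)

# Toy regression data (illustrative): only the first two of five
# candidate predictors actually enter the true mean function.
n, p_full = 50, 5
X = rng.normal(size=(n, p_full))
y = 2.0 * X[:, 0] - 1.0 * X[:, 1] + rng.normal(size=n)

# Estimate the noise variance from the full model's residuals.
beta_full, *_ = np.linalg.lstsq(X, y, rcond=None)
rss_full = np.sum((y - X @ beta_full) ** 2)
sigma2_hat = rss_full / (n - p_full)

def mallows_cp(subset):
    """Penalized least-squares score: Cp = RSS/sigma^2 - n + 2p."""
    Xs = X[:, subset]
    beta, *_ = np.linalg.lstsq(Xs, y, rcond=None)
    rss = np.sum((y - Xs @ beta) ** 2)
    return rss / sigma2_hat - n + 2 * len(subset)

# Exhaustive search over all non-empty subsets of candidate variables.
subsets = [s for r in range(1, p_full + 1)
           for s in itertools.combinations(range(p_full), r)]
best = min(subsets, key=mallows_cp)
print("selected variables:", best)
```

The term $2|S|$ is the penalty that trades goodness of fit against model dimension; the paper's contribution is, roughly, to replace this fixed penalty with ones adapted to the size and structure of the model list.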

##### MSC:

| MSC | Classification |
| --- | --- |
| 62A01 | Foundations and philosophical topics in statistics |
| 62M10 | Time series, auto-correlation, regression, etc. (statistics) |
| 62G07 | Density estimation |
| 62C20 | Statistical minimax procedures |
| 41A46 | Approximation by arbitrary nonlinear expressions; widths and entropy |
| 62J05 | Linear regression |