
Variance estimation in high-dimensional linear models. (English) Zbl 1452.62495
Summary: The residual variance and the proportion of explained variation are important quantities in many statistical models and model fitting procedures. They play an important role in regression diagnostics and model selection procedures, as well as in determining the performance limits in many problems. In this paper we propose new method-of-moments-based estimators for the residual variance, the proportion of explained variation and other related quantities, such as the \(\ell^2\) signal strength. The proposed estimators are consistent and asymptotically normal in high-dimensional linear models with Gaussian predictors and errors, where the number of predictors \(d\) is proportional to the number of observations \(n\); in fact, consistency holds even in settings where \(d/n \to \infty\). Existing results on residual variance estimation in high-dimensional linear models depend on sparsity in the underlying signal. Our results require no sparsity assumptions and imply that the residual variance and the proportion of explained variation can be consistently estimated even when \(d>n\) and the underlying signal itself is nonestimable. Numerical work suggests that some of our distributional assumptions may be relaxed. A real-data analysis involving gene expression data and single nucleotide polymorphism data illustrates the performance of the proposed methods.
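As a minimal sketch of how a method-of-moments estimator of this kind can be constructed, the simulation below assumes standard Gaussian predictors with identity covariance and Gaussian errors (assumptions stated in the summary only in part); the moment identities and the resulting estimators shown here are one standard construction under those assumptions and may differ in detail from the estimators proposed in the paper.

```python
# Illustrative method-of-moments estimation of the residual variance and
# l2 signal strength in a high-dimensional linear model y = X beta + eps,
# with d > n. Assumes x_i ~ N(0, I_d) and eps_i ~ N(0, sigma^2); this is a
# sketch of the general approach, not necessarily the paper's exact estimator.
import numpy as np

rng = np.random.default_rng(0)

n, d = 500, 1000           # more predictors than observations (d > n)
sigma2_true = 1.0          # residual variance
tau2_true = 2.0            # l2 signal strength ||beta||^2

beta = rng.normal(size=d)
beta *= np.sqrt(tau2_true) / np.linalg.norm(beta)
X = rng.normal(size=(n, d))
y = X @ beta + rng.normal(scale=np.sqrt(sigma2_true), size=n)

# Moment identities under the stated assumptions:
#   E||y||^2    = n (sigma^2 + tau^2)
#   E||X'y||^2  = n (n + d + 1) tau^2 + n d sigma^2
m1 = np.sum(y ** 2)
m2 = np.sum((X.T @ y) ** 2)

# Solving the two moment equations gives unbiased estimators:
sigma2_hat = (d + n + 1) / (n * (n + 1)) * m1 - m2 / (n * (n + 1))
tau2_hat = -d / (n * (n + 1)) * m1 + m2 / (n * (n + 1))
r2_hat = tau2_hat / (tau2_hat + sigma2_hat)   # proportion of explained variation

print(f"sigma^2: true {sigma2_true:.2f}, estimate {sigma2_hat:.2f}")
print(f"tau^2:   true {tau2_true:.2f}, estimate {tau2_hat:.2f}")
print(f"r^2:     true {tau2_true / (tau2_true + sigma2_true):.2f}, estimate {r2_hat:.2f}")
```

Note that no estimate of beta itself enters these formulas, which is consistent with the summary's point that the residual variance and the proportion of explained variation can be estimated even when the underlying signal is nonestimable.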

MSC:
62J05 Linear regression; mixed models
62F10 Point estimation
62F12 Asymptotic properties of parametric estimators