Brown, Lawrence D.; Hwang, Jiunn T.
Universal domination and stochastic domination: U-admissibility and U-inadmissibility of the least squares estimator. (English) Zbl 0674.62007
Ann. Stat. 17, No. 1, 252-267 (1989).

Summary: Assume the standard linear model \[ X_{n\times 1}=A_{n\times p}\theta_{p\times 1}+\epsilon_{n\times 1}, \] where \(\epsilon\) has an \(n\)-variate normal distribution with zero mean vector and identity covariance matrix. The least squares estimator for the coefficient \(\theta\) is \({\hat \theta}\equiv (A'A)^{-1}A'X\). It is well known that \({\hat \theta}\) is dominated by James-Stein type estimators under the sum of squared error loss \(| \theta -{\hat \theta}|^2\) when \(p\geq 3\). We discuss the possibility of improving upon \({\hat \theta}\) simultaneously under the "universal" class of losses \[ \{L(| \theta -{\hat \theta}|):\ L(\cdot)\ \text{any nondecreasing function}\}. \] An estimator that can be so improved is called universally inadmissible (U-inadmissible); otherwise it is called U-admissible. We prove that \({\hat \theta}\) is U-admissible for any \(p\) when \(A'A=I\). Furthermore, if \(A'A\neq I\), then \({\hat \theta}\) is U-inadmissible if \(p\) is "large enough"; in a special case, \(p\geq 4\) is large enough. The results are surprising. Implications are discussed.

Cited in 2 Reviews; cited in 9 Documents.

MSC:
62C15 Admissibility in statistical decision theory
62J07 Ridge regression; shrinkage estimators (Lasso)
62C05 General considerations in statistical decision theory
62F10 Point estimation

Keywords: universal domination; stochastic domination; James-Stein positive part estimator; universal class of losses; standard linear model; normal distribution; zero mean vector; identity covariance matrix; least squares estimator; James-Stein type estimators; squared error loss; universally inadmissible; U-inadmissible; U-admissible
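To make the setting concrete, here is a minimal numerical sketch (not from the paper) of the least squares estimator \((A'A)^{-1}A'X\) and the positive-part James-Stein estimator it mentions, compared under squared error loss. The design \(A\), the dimension \(p=10\), the choice \(\theta = 0\), and the Monte Carlo comparison are all illustrative assumptions; the paper's U-admissibility results concern the full class of nondecreasing losses, not only the squared error risk estimated below.

```python
import numpy as np

rng = np.random.default_rng(0)

n, p = 10, 10
A = np.eye(n, p)      # orthonormal design, so A'A = I (the paper's U-admissible case)
theta = np.zeros(p)   # illustrative true coefficient; theta = 0 favors shrinkage most

def ols(A, X):
    """Least squares estimator (A'A)^{-1} A'X."""
    return np.linalg.solve(A.T @ A, A.T @ X)

def james_stein_positive_part(theta_hat):
    """Positive-part James-Stein shrinkage of the least squares estimate."""
    p = theta_hat.size
    shrink = max(0.0, 1.0 - (p - 2) / np.sum(theta_hat ** 2))
    return shrink * theta_hat

# Monte Carlo estimate of the squared-error risk of both estimators.
reps = 2000
se_ols = se_js = 0.0
for _ in range(reps):
    X = A @ theta + rng.standard_normal(n)   # epsilon ~ N(0, I)
    th = ols(A, X)
    se_ols += np.sum((th - theta) ** 2)
    se_js += np.sum((james_stein_positive_part(th) - theta) ** 2)

print(se_ols / reps, se_js / reps)  # James-Stein risk is markedly smaller at theta = 0
```

This illustrates the classical \(p\geq 3\) domination under squared error loss that the summary takes as its starting point; the paper's point is that no such improvement over \({\hat \theta}\) holds simultaneously for every nondecreasing loss when \(A'A=I\).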