
Composite quantile regression and the oracle model selection theory. (English) Zbl 1360.62394

Summary: Coefficient estimation and variable selection in multiple linear regression are routinely carried out in the (penalized) least squares (LS) framework. The concept of the model selection oracle introduced by J. Fan and R. Li [J. Am. Stat. Assoc. 96, No. 456, 1348–1360 (2001; Zbl 1073.62547)] characterizes the optimal behavior of a model selection procedure. However, the least-squares oracle theory breaks down if the error variance is infinite. In the current paper we propose a new regression method called composite quantile regression (CQR). We show that the oracle model selection theory using the CQR oracle works beautifully even when the error variance is infinite. We develop a new oracular procedure to achieve the optimal properties of the CQR oracle. When the error variance is finite, CQR still enjoys great advantages in terms of estimation efficiency. We show that the relative efficiency of CQR compared with least squares is greater than 70% regardless of the error distribution. Moreover, CQR can be much more efficient, and sometimes arbitrarily more efficient, than least squares. The same conclusions hold when comparing a CQR-oracular estimator with an LS-oracular estimator.
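A minimal sketch of the composite objective may help fix ideas; the equally spaced quantile levels \(\tau_k = k/(K+1)\) and the check-loss notation below are assumptions of this sketch rather than details stated in the summary:

\[
(\hat b_1,\dots,\hat b_K,\hat\beta) \;=\; \arg\min_{b_1,\dots,b_K,\,\beta} \;\sum_{k=1}^{K}\sum_{i=1}^{n} \rho_{\tau_k}\bigl(y_i - b_k - x_i^{\top}\beta\bigr), \qquad \rho_\tau(r) = r\bigl(\tau - \mathbf{1}\{r<0\}\bigr).
\]

In words, the K quantile regressions share a single slope vector \(\beta\) but keep their own intercepts \(b_1,\dots,b_K\); it is this pooling of information across quantile levels that underlies the robustness and efficiency properties described above.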

MSC:

62J05 Linear regression; mixed models
62J07 Ridge regression; shrinkage estimators (Lasso)

Citations:

Zbl 1073.62547

References:

[1] Breiman, L. (1995). Better subset regression using the nonnegative garrote. Technometrics 37 373-384. · Zbl 0862.62059 · doi:10.2307/1269730
[2] Fan, J. and Li, R. (2001). Variable selection via nonconcave penalized likelihood and its oracle properties. J. Amer. Statist. Assoc. 96 1348-1360. · Zbl 1073.62547 · doi:10.1198/016214501753382273
[3] Fan, J. and Li, R. (2006). Statistical challenges with high dimensionality: Feature selection in knowledge discovery. Proceedings of the Madrid International Congress of Mathematicians 2006 III 595-622. EMS, Zurich. · Zbl 1117.62137
[4] Feller, W. (1968). An Introduction to Probability Theory and Its Applications . 1 , 3rd ed. Wiley, New York. · Zbl 0155.23101
[5] Knight, K. (1998). Limiting distributions for L_1 regression estimators under general conditions. Ann. Statist. 26 755-770. · Zbl 0929.62021 · doi:10.1214/aos/1028144858
[6] Koenker, R. (2005). Quantile Regression . Cambridge Univ. Press. · Zbl 1111.62037
[7] Koenker, R. and Geling, O. (2001). Reappraising medfly longevity: A quantile regression survival analysis. J. Amer. Statist. Assoc. 96 458-468. · Zbl 1019.62100 · doi:10.1198/016214501753168172
[8] Koenker, R. and Hallock, K. (2001). Quantile regression. J. Economic Perspectives 15 143-156.
[9] Tibshirani, R. (1996). Regression shrinkage and selection via the lasso. J. Roy. Statist. Soc. Ser. B 58 267-288. · Zbl 0850.62538
[10] Zou, H. (2006). The adaptive lasso and its oracle properties. J. Amer. Statist. Assoc. 101 1418-1429. · Zbl 1171.62326 · doi:10.1198/016214506000000735
[11] Zou, H. and Yuan, M. (2007). Composite quantile regression and the oracle model selection theory. Technical report, Univ. Minnesota. · Zbl 1360.62394 · doi:10.1214/07-AOS507