On the adaptive elastic net with a diverging number of parameters. (English) Zbl 1168.62064

Summary: We consider the problem of model selection and estimation in situations where the number of parameters diverges with the sample size. When the dimension is high, an ideal method should have the oracle property [J. Fan and R. Li, J. Am. Stat. Assoc. 96, No. 456, 1348–1360 (2001; Zbl 1073.62547); J. Fan and H. Peng, Ann. Stat. 32, No. 3, 928–961 (2004; Zbl 1092.62031)], which ensures optimal large-sample performance. Furthermore, high dimensionality often induces collinearity among the predictors, which an ideal method should also handle properly. Many existing variable selection methods fail to achieve both goals simultaneously.
We propose the adaptive elastic-net, which combines the strengths of quadratic regularization and adaptively weighted lasso shrinkage. Under weak regularity conditions, we establish the oracle property of the adaptive elastic-net. Simulations show that the adaptive elastic-net handles collinearity better than other oracle-like methods and thus enjoys much improved finite-sample performance.
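For orientation, the combination described in the summary can be sketched as a two-stage penalized least-squares problem. The display below is a sketch in standard notation; the tuning parameters \(\lambda_1, \lambda_2, \lambda_1^{*}, \gamma\) and the use of the elastic-net fit as the pilot estimator are stated here for illustration rather than quoted from the paper. First an elastic-net fit supplies data-driven weights, and then the weighted \(\ell_1\) penalty is combined with the same quadratic penalty:
\[
\hat{\beta}(\text{enet}) = \Big(1+\tfrac{\lambda_2}{n}\Big)\,\underset{\beta}{\arg\min}\Big\{\|y-X\beta\|_2^2+\lambda_2\|\beta\|_2^2+\lambda_1\|\beta\|_1\Big\},
\qquad
\hat{w}_j=\big(|\hat{\beta}_j(\text{enet})|\big)^{-\gamma},
\]
\[
\hat{\beta}(\text{adaptive enet}) = \Big(1+\tfrac{\lambda_2}{n}\Big)\,\underset{\beta}{\arg\min}\Big\{\|y-X\beta\|_2^2+\lambda_2\|\beta\|_2^2+\lambda_1^{*}\sum_{j=1}^{p}\hat{w}_j|\beta_j|\Big\}.
\]
The quadratic term \(\lambda_2\|\beta\|_2^2\) stabilizes the estimate under collinearity, while the adaptive weights \(\hat{w}_j\) penalize coefficients with small pilot estimates more heavily, which is what drives the selection part of the oracle property.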

MSC:

62J05 Linear regression; mixed models
62J07 Ridge regression; shrinkage estimators (Lasso)
65C60 Computational problems in statistics (MSC2010)

References:

[1] Breiman, L. (1996). Heuristics of instability and stabilization in model selection. Ann. Statist. 24 2350-2383. · Zbl 0867.62055
[2] Candes, E. and Tao, T. (2007). The Dantzig selector: Statistical estimation when p is much larger than n. Ann. Statist. 35 2313-2351. · Zbl 1139.62019
[3] Candes, E., Wakin, M. and Boyd, S. (2008). Enhancing sparsity by reweighted l1 minimization. J. Fourier Anal. Appl. · Zbl 1176.94014
[4] Donoho, D. and Johnstone, I. (1994). Ideal spatial adaptation by wavelet shrinkage. Biometrika 81 425-455. · Zbl 0815.62019
[5] Donoho, D., Johnstone, I., Kerkyacharian, G. and Picard, D. (1995). Wavelet shrinkage: Asymptopia? (with discussion). J. Roy. Statist. Soc. Ser. B 57 301-337. · Zbl 0827.62035
[6] Efron, B., Hastie, T., Johnstone, I. and Tibshirani, R. (2004). Least angle regression. Ann. Statist. 32 407-499. · Zbl 1091.62054
[7] Fan, J. and Li, R. (2001). Variable selection via nonconcave penalized likelihood and its oracle properties. J. Amer. Statist. Assoc. 96 1348-1360. · Zbl 1073.62547
[8] Fan, J. and Li, R. (2006). Statistical challenges with high dimensionality: Feature selection in knowledge discovery. In International Congress of Mathematicians 3 595-622. · Zbl 1117.62137
[9] Fan, J. and Lv, J. (2008). Sure independence screening for ultra-high-dimensional feature space. J. Roy. Statist. Soc. Ser. B 70 849-911.
[10] Fan, J. and Peng, H. (2004). Nonconcave penalized likelihood with a diverging number of parameters. Ann. Statist. 32 928-961. · Zbl 1092.62031
[11] Fan, J., Peng, H. and Huang, T. (2005). Semilinear high-dimensional model for normalization of microarray data: A theoretical analysis and partial consistency (with discussion). J. Amer. Statist. Assoc. 100 781-813. · Zbl 1117.62330
[12] Huber, P. (1973). Robust regression: Asymptotics, conjectures and Monte Carlo. Ann. Statist. 1 799-821. · Zbl 0289.62033
[13] Knight, K. and Fu, W. (2000). Asymptotics for lasso-type estimators. Ann. Statist. 28 1356-1378. · Zbl 1105.62357
[14] Lam, C. and Fan, J. (2008). Profile-kernel likelihood inference with diverging number of parameters. Ann. Statist. 36 2232-2260. · Zbl 1274.62289
[15] Portnoy, S. (1984). Asymptotic behavior of M-estimators of p regression parameters when p^2/n is large. I. Consistency. Ann. Statist. 12 1298-1309. · Zbl 0584.62050
[16] Tibshirani, R. (1996). Regression shrinkage and selection via the lasso. J. Roy. Statist. Soc. Ser. B 58 267-288. · Zbl 0850.62538
[17] Wang, H., Li, R. and Tsai, C. (2007). Tuning parameter selectors for the smoothly clipped absolute deviation method. Biometrika 94 553-568. · Zbl 1135.62058
[18] Zou, H. (2006). The adaptive lasso and its oracle properties. J. Amer. Statist. Assoc. 101 1418-1429. · Zbl 1171.62326
[19] Zou, H. and Hastie, T. (2005). Regularization and variable selection via the elastic net. J. Roy. Statist. Soc. Ser. B 67 301-320. · Zbl 1069.62054
[20] Zou, H., Hastie, T. and Tibshirani, R. (2007). On the degrees of freedom of the lasso. Ann. Statist. 35 2173-2192. · Zbl 1126.62061