
Finite mixture regression: a sparse variable selection by model selection for clustering. (English) Zbl 1329.62279

Summary: We consider a finite mixture of Gaussian regression models for high-dimensional data, where the number of covariates may be much larger than the sample size. We propose to estimate the unknown conditional mixture density by a maximum likelihood estimator restricted to the relevant variables selected by an \(\ell_{1}\)-penalized maximum likelihood estimator. We establish an oracle inequality for this estimator with respect to a Jensen-Kullback-Leibler type loss. The oracle inequality is deduced from a general model selection theorem for maximum likelihood estimators over a random subcollection of models. This theorem also yields the shape of the penalty in the selection criterion, which depends on the complexity of the random model collection.
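
To fix ideas, the following display is a schematic rendering of the model and of the two-step estimator described above. The notation (number of components \(K\), proportions \(\pi_{k}\), regression coefficients \(\beta_{k}\), variances \(\sigma_{k}^{2}\), tuning parameter \(\lambda\), selected support \(J(\lambda)\), constant \(\rho\in(0,1)\)) is generic rather than taken from the paper, the response is written as univariate for simplicity, and the \(\ell_{1}\) penalty shown is one common formulation, not necessarily the paper's exact one. Here \(\varphi(\cdot;m,\sigma^{2})\) denotes the Gaussian density with mean \(m\) and variance \(\sigma^{2}\):
\[
s_{\theta}(y\mid x)\;=\;\sum_{k=1}^{K}\pi_{k}\,\varphi\bigl(y;\,\beta_{k}^{\top}x,\,\sigma_{k}^{2}\bigr),\qquad \sum_{k=1}^{K}\pi_{k}=1,\quad \pi_{k}>0 .
\]
The variable selection step computes, for each \(\lambda\) on a grid,
\[
\hat{\theta}^{\mathrm{Lasso}}_{\lambda}\;\in\;\operatorname*{arg\,min}_{\theta}\Bigl\{-\frac{1}{n}\sum_{i=1}^{n}\log s_{\theta}(y_{i}\mid x_{i})\;+\;\lambda\sum_{k=1}^{K}\lVert\beta_{k}\rVert_{1}\Bigr\},
\]
and the final estimator is the unpenalized maximum likelihood estimator restricted to the support \(J(\lambda)\) of \(\hat{\theta}^{\mathrm{Lasso}}_{\lambda}\); the resulting model is then chosen by a penalized criterion whose penalty shape accounts for the complexity of the random collection of models. The loss in the oracle inequality is the Jensen-Kullback-Leibler type divergence
\[
\mathrm{JKL}_{\rho}(s,t)\;=\;\frac{1}{\rho}\,\mathrm{KL}\bigl(s,\,(1-\rho)\,s+\rho\,t\bigr).
\]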

MSC:

62H30 Classification and discrimination; cluster analysis (statistical aspects)
