
zbMATH — the first resource for mathematics

Prediction with a flexible finite mixture-of-regressions. (English) Zbl 07027222
Summary: Finite mixture regression (FMR) is widely used for modeling data that originate from heterogeneous populations. In such settings, FMR can offer greater predictive power than traditional one-class models. However, existing FMR methods rely heavily on mixtures of linear models, in which the linear predictor must be specified in advance. A flexible FMR model is presented that combines the random forest learner with a penalized linear FMR. The performance of the new method is assessed by predictive log-likelihood in extensive simulation studies. The method is shown to match existing FMR methods when the true regression functions are in fact linear, and to outperform them when at least one of the regression functions is nonlinear. The method can handle a large number of covariates, and its predictive ability is not greatly affected by surplus variables.
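For orientation, a generic Gaussian FMR can be written as follows; the notation (number of components \(K\), mixing weights \(\pi_k\), component mean functions \(m_k\), variances \(\sigma_k^2\)) is generic, and the paper's exact parameterization, in particular how the random forest enters the component means, may differ:
$$ p(y \mid x) \;=\; \sum_{k=1}^{K} \pi_k\, \varphi\!\bigl(y;\, m_k(x),\, \sigma_k^2\bigr), \qquad \pi_k \ge 0, \quad \sum_{k=1}^{K} \pi_k = 1, $$
where \(\varphi(\cdot;\mu,\sigma^2)\) denotes the normal density and \(m_k(x) = x^\top \beta_k\) in the linear case. A standard definition of the predictive log-likelihood on held-out data \(\{(x_i, y_i)\}_{i=1}^{n_{\mathrm{test}}}\), evaluated at the fitted parameters, is
$$ \ell_{\mathrm{pred}} \;=\; \sum_{i=1}^{n_{\mathrm{test}}} \log \sum_{k=1}^{K} \hat\pi_k\, \varphi\!\bigl(y_i;\, \hat m_k(x_i),\, \hat\sigma_k^2\bigr). $$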
MSC:
62 Statistics