## Estimation of an oblique structure via penalized likelihood factor analysis. (English) Zbl 06984059

Summary: The problem of sparse estimation via a lasso-type penalized likelihood procedure in a factor analysis model is considered. Model estimation typically assumes that the common factors are orthogonal (i.e., uncorrelated). When the common factors are in fact correlated, however, lasso-type penalization based on the orthogonal model frequently yields an erroneous estimate. To overcome this problem, the factor correlations are incorporated into the model and estimated, together with the parameters of the orthogonal model, by a maximum penalized likelihood procedure. The entire solution path is computed by an EM algorithm combined with coordinate descent, which accommodates a wide variety of convex and nonconvex penalties. The proposed method is applicable even when the number of variables exceeds the number of observations. Its effectiveness is evaluated by Monte Carlo simulations, and its utility is demonstrated through real data analysis.
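To illustrate the mechanics of an EM algorithm with a coordinate-descent M-step for penalized factor analysis, here is a minimal sketch in Python/NumPy. It is an assumption-laden simplification, not the authors' exact procedure: it assumes orthogonal factors (the paper's oblique model additionally estimates the factor correlation matrix), uses a plain lasso penalty with an ad hoc `rho * psi[j]` threshold scaling, and the helper names `soft_threshold` and `penalized_fa_em` are illustrative.

```python
import numpy as np

def soft_threshold(z, gamma):
    # Lasso soft-thresholding operator used inside coordinate descent.
    return np.sign(z) * np.maximum(np.abs(z) - gamma, 0.0)

def penalized_fa_em(X, k, rho=0.1, n_iter=100, seed=0):
    """Illustrative EM for a sparse factor model X ~ N(0, L L' + diag(psi)),
    with a lasso penalty on the loadings L. Orthogonal factors are assumed;
    the paper's oblique version also estimates factor correlations."""
    n, p = X.shape
    rng = np.random.default_rng(seed)
    S = X.T @ X / n                      # sample covariance
    L = rng.normal(scale=0.1, size=(p, k))
    psi = np.diag(S).copy()              # unique variances
    for _ in range(n_iter):
        # E-step: posterior moments of the factors given current parameters.
        Sigma = L @ L.T + np.diag(psi)
        B = np.linalg.solve(Sigma, L)            # Sigma^{-1} L, p x k
        Ezz = np.eye(k) - L.T @ B + B.T @ S @ B  # averaged E[f f' | x]
        Cxz = S @ B                              # averaged E[x f' | x]
        # M-step: cycle through loadings, soft-thresholding each coordinate.
        for j in range(p):
            for r in range(k):
                resid = Cxz[j, r] - L[j] @ Ezz[:, r] + L[j, r] * Ezz[r, r]
                # Threshold scaled by psi[j]; the paper's exact scaling differs.
                L[j, r] = soft_threshold(resid, rho * psi[j]) / Ezz[r, r]
        # Update unique variances from the expected squared residuals.
        psi = np.array([S[j, j] - 2 * L[j] @ Cxz[j] + L[j] @ Ezz @ L[j]
                        for j in range(p)])
        psi = np.maximum(psi, 1e-6)      # guard against improper solutions
    return L, psi
```

Sweeping `rho` from large to small values traces out the solution path; a large `rho` zeroes out all loadings, while smaller values let nonzero loadings enter, mirroring how lasso-type penalization produces sparse loading matrices.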

### MSC:

 62-XX Statistics

### Software:

glmnet; grpreg; sparsenet; LISREL

### References:

 [1] Adachi, K., Factor analysis with EM algorithm never gives improper solutions when sample covariance and initial parameter matrices are proper, Psychometrika, 78, 2, 380-394, (2013) · Zbl 1284.62679
 [2] Akaike, H., Factor analysis and AIC, Psychometrika, 52, 3, 317-332, (1987) · Zbl 0627.62067
 [3] Anderson, J.; Gerbing, D., The effect of sampling error on convergence, improper solutions, and goodness-of-fit indices for maximum likelihood confirmatory factor analysis, Psychometrika, 49, 2, 155-173, (1984)
 [4] Breheny, P., 2013. grpreg: Regularization paths for regression models with grouped covariates. R package version 2.5. URL http://cran.r-project.org/web/packages/grpreg/
 [5] Breheny, P.; Huang, J., Coordinate descent algorithms for nonconvex penalized regression, with applications to biological feature selection, Ann. Appl. Stat., 5, 1, 232, (2011) · Zbl 1220.62095
 [6] Choi, J.; Zou, H.; Oehlert, G., A penalized maximum likelihood approach to sparse factor analysis, Stat. Interface, 3, 4, 429-436, (2011) · Zbl 1245.62074
 [7] Clarke, M., A rapidly convergent method for maximum-likelihood factor analysis, British J. Math. Statist. Psych., 23, 1, 43-52, (1970) · Zbl 0205.23802
 [8] Das, U.; Gupta, S.; Gupta, S., Screening active factors in supersaturated designs, Comput. Statist. Data Anal., 77, 223-232, (2014)
 [9] Efron, B.; Hastie, T.; Johnstone, I.; Tibshirani, R., Least angle regression (with discussion), Ann. Statist., 32, 407-499, (2004) · Zbl 1091.62054
 [10] Fan, J.; Li, R., Variable selection via nonconcave penalized likelihood and its oracle properties, J. Amer. Statist. Assoc., 96, 1348-1360, (2001) · Zbl 1073.62547
 [11] Friedman, J. H., Fast sparse regression and classification, Int. J. Forecast., 28, 3, 722-738, (2012)
 [12] Friedman, J.; Hastie, T.; Höfling, H.; Tibshirani, R., Pathwise coordinate optimization, Ann. Appl. Stat., 1, 302-332, (2007) · Zbl 1378.90064
 [13] Friedman, J.; Hastie, T.; Tibshirani, R., Regularization paths for generalized linear models via coordinate descent, J. Statist. Software, 33, (2010)
 [14] Harman, H., Modern factor analysis, (1976), University of Chicago Press · Zbl 0161.39805
 [15] Hendrickson, A.; White, P., Promax: a quick method for rotation to oblique simple structure, Br. J. Stat. Psychol., 17, 1, 65-70, (1964)
 [16] Hirose, K.; Kawano, S.; Konishi, S.; Ichikawa, M., Bayesian information criterion and selection of the number of factors in factor analysis models, J. Data Sci., 9, 1, 243-259, (2011)
 [17] Hirose, K.; Yamamoto, M., Sparse estimation via nonconcave penalized likelihood in a factor analysis model, Stat. Comput., (2014), in press
 [18] Hu, L.-T.; Bentler, P. M., Cutoff criteria for fit indexes in covariance structure analysis: conventional criteria versus new alternatives, Struct. Equ. Model., 6, 1, 1-55, (1999)
 [19] Jennrich, R.; Robinson, S., A Newton-Raphson algorithm for maximum likelihood factor analysis, Psychometrika, 34, 1, 111-123, (1969)
 [20] Jöreskog, K., Some contributions to maximum likelihood factor analysis, Psychometrika, 32, 4, 443-482, (1967) · Zbl 0183.24603
 [21] Jöreskog, K. G.; Sörbom, D., 1996. LISREL 8 user's reference guide. Scientific Software.
 [22] Kaiser, H., The varimax criterion for analytic rotation in factor analysis, Psychometrika, 23, 3, 187-200, (1958) · Zbl 0095.33603
 [23] Kano, Y., Improper solutions in exploratory factor analysis: causes and treatments, (Advances in Data Science and Classification: Proceedings of the 6th Conference of the International Federation of Classification Societies (IFCS-98), Università “La Sapienza”, Rome, 21-24 July, (1998), Springer Verlag), 375
 [24] Konishi, S.; Ando, T.; Imoto, S., Bayesian information criteria and smoothing parameter selection in radial basis function networks, Biometrika, 91, 1, 27-43, (2004) · Zbl 1132.62313
 [25] Lawley, D.; Maxwell, A., Factor Analysis as a Statistical Method, Vol. 18, (1971), Butterworths London · Zbl 0251.62042
 [26] Martin, J.; McDonald, R., Bayesian estimation in unrestricted factor analysis: a treatment for Heywood cases, Psychometrika, 40, 4, 505-517, (1975) · Zbl 0318.62040
 [27] Mazumder, R.; Friedman, J.; Hastie, T., SparseNet: coordinate descent with nonconvex penalties, J. Amer. Statist. Assoc., 106, 1125-1138, (2011) · Zbl 1229.62091
 [28] Mulaik, S., The foundations of factor analysis, (2010), Chapman and Hall/CRC Boca Raton · Zbl 1188.62185
 [29] Ning, L.; Georgiou, T. T., 2011. Sparse factor analysis via likelihood and $$\ell_1$$ regularization. In: 50th IEEE Conference on Decision and Control and European Control Conference. pp. 5188-5192.
 [30] Rubin, D.; Thayer, D., EM algorithms for ML factor analysis, Psychometrika, 47, 1, 69-76, (1982) · Zbl 0483.62046
 [31] Tibshirani, R., Regression shrinkage and selection via the lasso, J. Roy. Statist. Soc. Ser. B, 58, 267-288, (1996) · Zbl 0850.62538
 [32] Van Driel, O., On various causes of improper solutions in maximum likelihood factor analysis, Psychometrika, 43, 2, 225-243, (1978) · Zbl 0384.62043
 [33] Ye, J., On measuring and correcting the effects of data mining and model selection, J. Amer. Statist. Assoc., 93, 120-131, (1998) · Zbl 0920.62056
 [34] Zhang, C., Nearly unbiased variable selection under minimax concave penalty, Ann. Statist., 38, 894-942, (2010) · Zbl 1183.62120
 [35] Zhao, P.; Yu, B., On model selection consistency of lasso, J. Mach. Learn. Res., 7, 2, 2541, (2007) · Zbl 1222.62008
 [36] Zou, H., The adaptive lasso and its oracle properties, J. Amer. Statist. Assoc., 101, 1418-1429, (2006) · Zbl 1171.62326
 [37] Zou, H.; Hastie, T.; Tibshirani, R., On the degrees of freedom of the lasso, Ann. Statist., 35, 2173-2192, (2007) · Zbl 1126.62061